Alex's Elasticsearch Adventure
Revision as of 16:34, 29 September 2014
I have been working on getting a working Elasticsearch database populated with test data, in order to see what the system is capable of.
First, I went through all the steps at https://genomevolution.org/wiki/index.php/Install_Elasticsearch.
Next, I began looking into loading multiple JSON objects into Elasticsearch's system at once. Found useful information at http://httpkit.com/resources/HTTP-from-the-Command-Line/ under the heading "Use a File as a Request Body".
I created a JSON file (I called it sample1.json) that looked like this:
{ 1: {id: 1, type_name: "gene", start: 0, stop: 1, strand: "+", chromosome: 1 feature_name: { name1: "blah1", name2: "name", name3: "George", name4: "obligatory" } }, 2: { id: 2, type_name: "exon", start: 1776, stop: 2014, strand: "und", chromosome: 3 feature_name: { name1: "stuff", name2: "at4g37764", name3: "578926", name4: "name_of_feature" } }, 3: { id: 3, type_name: "cds", start: 1, stop: 4, strand: "-", chromosome: 2 feature_name: { name1: "stuff", name2: "at4g37764", name3: "578926", } } }
I then tested the command
curl -X PUT \
  -H 'Content-Type: application/json' \
  -d @sample1.json \
  localhost:9200/testIndex/feature
and got a "No handler Found" error.
So, I tried reorganizing the command:
curl -XPUT localhost:9200/testIndex/feature -H 'Content-Type: application/json' -d @sample1.json
Same error:
No handler found for uri [/testIndex/feature] and method [PUT]
Tried again, this time actually specifying an _id of "test1" (the 1, 2, and 3 in the JSON file were supposed to be the _id fields):
curl -XPUT localhost:9200/testIndex/feature/test1 -H 'Content-Type: application/json' -d @sample1.json
Got yet another error:
{"error":"InvalidIndexNameException[[testIndex] Invalid index name [testIndex], must be lowercase]","status":400}
Alright, apparently it doesn't like the capital letter in "testIndex". In that case:
curl -XPUT localhost:9200/test_index/feature/test1 -H 'Content-Type: application/json' -d @sample1.json
Woo more errors!
{"error":"MapperParsingException[failed to parse]; nested: JsonParseException[Unexpected character ('}' (code 125)): was expecting either valid name character (for unquoted name) or double-quote (for quoted) to start field name\n at [Source: [B@23276b35; line: 1, column: 838]]; ","status":400}
Okay, it looks like I forgot to put quotes around my object labels in the JSON file. That's easy enough to fix. New sample1.json:
Run the command, get the same error. Looking at it more closely, the quotes may not have been the issue (though it probably didn't hurt to add them). It appears to be having issues with one of my closing brackets ( "}" ).
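Looking back, the quickest way to find the offending character is to run the file through a JSON parser before ever sending it to Elasticsearch. A minimal Python sketch — the broken string below reproduces one of sample1.json's actual problems (a missing comma before feature_name; the other issues were the unquoted keys and a trailing comma in the last object):

```python
import json

# Reproduces a real problem from sample1.json: no comma between
# "chromosome": 1 and "feature_name".
broken = '{ "1": { "id": 1, "chromosome": 1 "feature_name": { "name1": "blah1" } } }'

try:
    json.loads(broken)
except json.JSONDecodeError as e:
    # The parser reports the exact position, much like the
    # JsonParseException that Elasticsearch returned.
    print(f"Invalid JSON at line {e.lineno}, column {e.colno}: {e.msg}")
```

Running the whole sample file through a check like this would have pointed straight at the missing commas instead of leaving me squinting at closing brackets.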
.....................
After talking to Matt, we figured out how the Bulk API is supposed to work (found at http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/docs-bulk.html).
So, new JSON file (sample2.json):
{ "create" : { "_id" : "one" } }\n
{ "feature1" : "This is the first feature" }\n
{ "create" : { "_id" : "two" } }\n
{ "feature2" : "This is the second feature" }\n
And new curl command (found the syntax at http://elasticsearch-users.115913.n3.nabble.com/How-to-index-a-JSON-file-td4033230.html):
curl -s -XPOST 'localhost:9200/testindex/feature/_bulk' --data-binary @sample2.json
Response from console:
{"took":302,"errors":false,"items":[{"create":{"_index":"testindex","_type":"feature","_id":"one","_version":1,"status":201}},{"create":{"_index":"testindex","_type":"feature","_id":"two","_version":1,"status":201}}]}
I'm assuming here that "errors":false means it loaded without errors, but just to be sure let's run:
curl localhost:9200/testindex/feature/one
And we get:
{"_index":"testindex","_type":"feature","_id":"one","_version":1,"found":true,"_source":{ "feature1" : "This is the first feature" }\n}
Yay! Although it looks like we didn't need the newlines on the actual entries, just the "create" commands. Not to worry though, the Bulk API page says "create will fail if a document with the same index and type exists already, whereas index will add or replace a document as necessary."
With that knowledge, let's edit sample2.json:
{ "index" : { "_id" : "one" } }\n
{ "feature1" : "This is the first feature" }
{ "index" : { "_id" : "two" } }\n
{ "feature2" : "This is the second feature" }
and run these one more time:
curl -s -XPOST 'localhost:9200/testindex/feature/_bulk' --data-binary @sample2.json
curl localhost:9200/testindex/feature/one
curl localhost:9200/testindex/feature/two
We get:
{"_index":"testindex","_type":"feature","_id":"one","_version":2,"found":true,"_source":{ "feature1" : "This is the first feature" }}
{"_index":"testindex","_type":"feature","_id":"two","_version":2,"found":true,"_source":{ "feature2" : "This is the second feature" }}
No more extraneous newlines, and the results look good!
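In hindsight, the bulk body is easy to build programmatically rather than by hand, which sidesteps the literal \n confusion entirely. A minimal Python sketch of the format the Bulk API expects — an action line, then a source line, one JSON object per line, with a trailing newline at the end; the documents are the two toy features from above:

```python
import json

# The two toy documents from the experiment above.
docs = {
    "one": {"feature1": "This is the first feature"},
    "two": {"feature2": "This is the second feature"},
}

# Bulk format: action line, then source line, each on its own line,
# with a trailing newline terminating the body.
lines = []
for _id, source in docs.items():
    lines.append(json.dumps({"index": {"_id": _id}}))
    lines.append(json.dumps(source))
bulk_body = "\n".join(lines) + "\n"

print(bulk_body)
```

The resulting string can then be POSTed to /testindex/feature/_bulk with curl --data-binary, exactly as above.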
..................................................................................................................................................................................................................................................................
I wrote a Java program called JSONGenerator.java to randomly generate 25,000 feature objects, which I then batch-loaded into the index.
javac JSONGenerator.java && java JSONGenerator > generatorTest.json
curl -s -XPOST 'localhost:9200/testindex/feature/_bulk' --data-binary @generatorTest.json
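The Java source isn't pasted here, but the idea is straightforward: emit random feature objects interleaved with bulk action lines. A rough Python sketch of such a generator — hypothetical, with field names mirroring the hand-written samples above; the actual fields and ranges in JSONGenerator.java may differ:

```python
import json
import random

TYPES = ["gene", "exon", "cds"]
STRANDS = ["+", "-", "und"]

def random_feature(i):
    # Field names follow the hand-written sample documents;
    # start is always <= stop.
    start = random.randrange(0, 1_000_000)
    return {
        "id": i,
        "type_name": random.choice(TYPES),
        "start": start,
        "stop": start + random.randrange(1, 10_000),
        "strand": random.choice(STRANDS),
        "chromosome": random.randrange(1, 6),
    }

def bulk_lines(n):
    # Yield alternating action and source lines in bulk format.
    for i in range(1, n + 1):
        yield json.dumps({"index": {"_id": str(i)}})
        yield json.dumps(random_feature(i))

# Write 25,000 documents, one JSON object per line.
with open("generatorTest.json", "w") as f:
    for line in bulk_lines(25_000):
        f.write(line + "\n")
```

The output file can be loaded with the same curl --data-binary command as above.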
All elasticsearch data is located here:
/var/lib/elasticsearch/elasticsearch/nodes
Current size (du -hs): 7.2M
Performance tests:
Test used for range query:
curl localhost:9200/testindex/feature/_search -d'
{
  "query": {
    "and": [
      { "range": { "start": { "gte": 400000 } } },
      { "range": { "stop":  { "lte": 500000 } } }
    ]
  }
}'
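For varying the bounds across test runs, the same query body can be assembled programmatically. A small sketch — the range_query helper is made up for illustration and just rebuilds the JSON sent by the curl command above (note the and construct is from the 2014-era Elasticsearch API; later versions replaced it with bool queries):

```python
import json

def range_query(field_ranges):
    # Build an "and" of range clauses, matching the curl test above.
    # field_ranges example: {"start": {"gte": 400000}, "stop": {"lte": 500000}}
    return {
        "query": {
            "and": [{"range": {field: bounds}}
                    for field, bounds in field_ranges.items()]
        }
    }

body = range_query({"start": {"gte": 400000}, "stop": {"lte": 500000}})
print(json.dumps(body, indent=2))
```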
Size of /var/lib/elasticsearch/elasticsearch/nodes before any indexing: 2.3M
Size of /var/lib/elasticsearch/elasticsearch/nodes after indexing 25,000 documents: 8.2M
Time to index: 0m4.618s
250,000 2,500,000 (Time and Memory)