Steps
1. Created an external table that points to the gzip'd log files.
2. A SELECT query with LIMIT 10 or 100 returned results.
3. Created a secondary external table that points to a different location.
4. Used an INSERT OVERWRITE clause to pull records for a certain day/month into a partition.
5. The SELECT part of the statement succeeds, but file creation fails.
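The steps above can be sketched roughly as follows in HiveQL. All table names, column lists, and locations here are hypothetical, and the SerDe class name is my assumption for hive-json-serde-0.2.jar:

```sql
-- Hypothetical sketch of the repro steps; names, paths, and the SerDe
-- class are assumptions, not taken from the original report.
ADD JAR hive-json-serde-0.2.jar;

-- Step 1: external table over the gzip'd JSON log files
CREATE EXTERNAL TABLE raw_logs (
  atype STRING, operation STRING, status STRING, tme DOUBLE,
  starttime STRING, remoteip STRING, requesturi STRING,
  userid STRING, dt STRING
)
ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.JsonSerde'
LOCATION 's3://my-bucket/logs/';

-- Step 3: secondary external table at a different location,
-- partitioned by day and stored as plain text
CREATE EXTERNAL TABLE daily_logs (
  atype STRING, operation STRING, status STRING, tme DOUBLE,
  starttime STRING, remoteip STRING, requesturi STRING, userid STRING
)
PARTITIONED BY (dt STRING)
STORED AS TEXTFILE
LOCATION 's3://my-bucket/daily/';

-- Step 4: INSERT OVERWRITE one day's records into a partition
INSERT OVERWRITE TABLE daily_logs PARTITION (dt = '2011.09.02')
SELECT atype, operation, status, tme, starttime, remoteip, requesturi, userid
FROM raw_logs
WHERE dt = '2011.09.02';
```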
What is the expected output? What do you see instead?
Expected: a flat table in text file format. Instead, the job fails with the following error.
=====================================
ERROR="java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row
{"atype":"type1","operation":"orize","status":"Allow","tme":156.25900000000001,"starttime":"/Date(1314981024895)/","remoteip":"x.y.z.t, x1.y1.z1.t1","requesturi":"uri","userid":"x","eidmid":"y","userlanguage":"en","usercountry":"US","mode":"normal_mode","servicekey":"key1","consumerkey":"ke2","line":null,"number":null,"dt":"2011.09.02"}
    at org.apache.hadoop.hive.ql.exec.ExecMapper.map(ExecMapper.java:161)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:363)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:312)
    at org.apache.hadoop.mapred.Child.main(Child.java:170)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row
{"atype":"type1","operation":"orize","status":"Allow","tme":156.25900000000001,"starttime":"/Date(1314981024895)/","remoteip":"x.y.z.t, x1.y1.z1.t1","requesturi":"uri","userid":"x","eidmid":"y","userlanguage":"en","usercountry":"US","mode":"normal_mode","servicekey":"key1","consumerkey":"ke2","line":null,"number":null,"dt":"2011.09.02"}
    at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:483)
    at org.apache.hadoop.hive.ql.exec.ExecMapper.map(ExecMapper.java:143)
    ... 4 more
Caused by: java.lang.NullPointerException
    at org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat$1.write(HiveIgnoreKeyTextOutputFormat.java:97)
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:606)
    at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:470)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:743)
    at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
    at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:470)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:743)
    at org.apache.hadoop.hive.ql.exec.FilterOperator.processOp(FilterOperator.java:87)
    at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:470)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:743)
    at org.apache.hadoop.hive.ql.exec.FilterOperator.processOp(FilterOperator.java:87)
    at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:470)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:743)
    at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:77)
    at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:470)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:743)
    at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:466)
    ... 5 more"
=====================================
What version of the product are you using? On what operating system?
Mac OSX, Amazon EMR/Hive, --hadoop-version 0.20 --hive-interactive
--hive-versions 0.7, hive-json-serde-0.2.jar
Please provide any additional information below.
I am also having additional issues with null values in columns, but I will probably open a new issue for that.
Original issue reported on code.google.com by [email protected] on 4 Oct 2011 at 1:23