
Problem working with Kafka messages with Record keys in Zerocode version 1.3.28 and later #557

Closed
Tracked by #681
bfarrell-ibm opened this issue Jan 16, 2023 · 19 comments


@bfarrell-ibm

bfarrell-ibm commented Jan 16, 2023

Hi.

I have been executing Kafka tests on older versions of Zerocode without any issues, but I recently tried upgrading to a more recent version and have hit a problem when a Kafka message with a key (containing dashes) is present.

java.lang.RuntimeException: ZeroCode Step execution failed. Details:java.lang.RuntimeException: com.fasterxml.jackson.core.JsonParseException: Unexpected character ('a' (code 97)): Expected space separating root-level values at [Source: (String)"326e9aff-9767-46a9-a5e9-22c339389da7"; line: 1, column: 7]

The record uses a request id as the key:

Record Key - 326e9aff-9767-46a9-a5e9-22c339389da7 , Record value - {"request_id": "326e9aff-9767-46a9-a5e9-22c339389da7" .... }
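
For reference, a minimal standalone sketch that reproduces the same parse failure with plain Jackson, assuming the raw key string ends up being handed to Jackson as if it were JSON (the class name below is just illustrative):

```java
import com.fasterxml.jackson.databind.ObjectMapper;

public class KeyParseFailureDemo {
    public static void main(String[] args) throws Exception {
        String recordKey = "326e9aff-9767-46a9-a5e9-22c339389da7";
        // Throws JsonParseException: "326e9" is read as a number in scientific
        // notation, and the following 'a' is then rejected with
        // "Expected space separating root-level values".
        new ObjectMapper().readTree(recordKey);
    }
}
```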

The messages are processed fine with 1.3.27, so something must have changed in version 1.3.28 that is leading to this exception.

Could anyone advise what might be the problem?

I found something online that might be related (a similar error is returned):
https://discuss.aerospike.com/t/root-level-value-error-when-using-aerospike-inbound-kafka-connector-with-librdkafka/6903

@bfarrell-ibm changed the title Problem working with Kafka messages with kafka keys in Zerocode version 1.3.28 and later → Problem working with Kafka messages with Record keys in Zerocode version 1.3.28 and later on Jan 16, 2023
@BharatBaweja

Hi @bfarrell-ibm
Is this issue still active?
I am interested in working on this issue.

@bfarrell-ibm
Author

@BharatBaweja

Yes - the problem is still there as far as I am aware. I have not retested in recent days, but I assume nothing has changed since I reported the issue.

@JPLahoda

JPLahoda commented Oct 4, 2023

Hi @bfarrell-ibm

I'm a third-year undergraduate Computer Science major and I'd like to help out with this issue or a similar one. Do you think this is doable for me? I haven't looked at the code yet but I will soon.

@authorjapps
Owner

> Hi @bfarrell-ibm Is this issue still active? I am interested in working on this issue.

Hello, if you have identified the problem and the fix, can you raise a PR? Or, if you are busy, can you advise @JPLahoda about the problem and the fix (if already found)?

@bfarrell-ibm, can you attach the request/response JSON scenario and the log output to the ticket description, please?

@ENate

ENate commented Nov 17, 2023

Hi @bfarrell-ibm, is this issue still open?

@bfarrell-ibm
Author

Hi - sorry for the delayed response - I'm not working on this issue at present and needed to dig up some info.

It appears to still be an issue.
My test runs fine on Zerocode versions up to 1.3.27, but after that the Kafka records are not processed.

2023-11-17 12:13:38,667 [main] ERROR org.jsmart.zerocode.core.kafka.client.BasicKafkaClient - Exception during operation:CONSUME, topicName:AnsibleLifecycleDriver_lifecycle_request_queue, error:Unrecognized token 'f36eeb7f': was expecting (JSON String, Number, Array, Object or token 'null', 'true' or 'false')

Where the record is as follows

Record Key - f36eeb7f-2ace-43b4-9467-fd3bbde08ea3 , Record value - {"request_id": "f36eeb7f-2ace-43b4-9467-fd3bbde08ea3", "lifecycle_name": "Stop", "driver_files": "UEsDBBQACAgIAAAAIQAAAAAAAAAAAAAAAAAgABEAY29uZmlnL2hvc3RfdmFycy9yZW1vdGUtaG9zdC55bWxVVA0ABwAAAAAAAAAAAAAAANPV1eVKzCvOTMpJjc/ILy6xUlCqrlYoKMovSC0qyUwt1itKzc0vgcgp1NYqwRWXFqcW4VIMkkNRXFycEV+QWFyMSwNIDqyBCwBQSwcI34+WwUkAAACQAAAAUEsDBBQACAgIAAAAIQAAAAAAAAAAAAAAAAAQABEAY29uZmlnL2ludmVudG9yeVVUDQAHAAAAAAAAAAAAAAAAiy5Kzc0vSY3lgtC6GfnFJVwAUEsHCB/HAbISAAAAFQAAAFBLAwQUAAgICAAAACEAAAAAAAAAAAAAAAAAEgARAHNjcmlwdHMvQWRvcHQueWFtbFVUDQAHAAAAAAAAAAAAAAAAXYwxDoAwDAN3XmGx9wPZWPgGCiW0Ei1BpPyflJHtbJ8ccHIVwrTp1QYgqzUjFI1cOnu1StSunOohcctyLzvHrs1cTLxtbIeRQ/jdAZusT6IPgWqJMH7j+AJQSwcIKwqrZFkAAAB6AAAAUEsDBBQACAgIAAAAIQAAAAAAAAAAAAAAAAAWABEAc2NyaXB0cy9Db25maWd1cmUueWFtbFVUDQAHAAAAAAAAAAAAAAAAbYwxDoAwDAN3XmGx9wNZkfgGCiUURGmkpv0/LQMT29k62yHxLYRJ036GmmUADrVihKieY+dWreK1a0lbCFwOycvOvmszR+urwnYZNXA/l8Amaw30InBbIIyfMD5QSwcIYtQ3ZFwAAACGAAAAUEsDBBQACAgIAAAAIQAAAAAAAAAAAAAAAAATABEAc2NyaXB0cy9DcmVhdGUueWFtbFVUDQAHAAAAAAAAAAAAAAAAXYw5DoAgEEV7T/FDzwWmNfEaZsARElkSBu8vaGf3/mpROAthbcJdFiBW7UpI1XOaPCwnvs5OqUME7lHafrKftY2TzlVnvZQG2P8fcIi7A70IZA0E86XmAVBLBwjknRJOWAAAAH0AAABQSwMEFAAICAgAAAAhAAAAAAAAAAAAAAAAABMAEQBzY3JpcHRzL0RlbGV0ZS55YW1sVVQNAAcAAAAAAAAAAAAAAABdjLkNgDAMRXum+KLPAq4RayAnmEQih4TN/iTQ0b1/OlQuQlgki8kEpKamhNwC58Hd8hLa6NTWRWRLcm0Hh1FbOetYGeup1MH9/4Bd/B3pRaBoJMxfOj9QSwcIpQ/M/1cAAAB9AAAAUEsDBBQACAgIAAAAIQAAAAAAAAAAAAAAAAAUABEAc2NyaXB0cy9JbnN0YWxsLnlhbWxVVA0ABwAAAAAAAAAAAAAAAGWMwQ2AIBAE/1ax8U8DV4CJVZgDT0g8IPGwf8H48ze7O1mHwlkIa7HGqhOQqjUjaA2sg3vlJdQhldpD5Jbk2g4OQ1tYTXrb2E6jDu53COzi70gvAtkiYf7m+QFQSwcID3II1VsAAACAAAAAUEsDBBQACAgIAAAAIQAAAAAAAAAAAAAAAAASABEAc2NyaXB0cy9TdGFydC55YW1sVVQNAAcAAAAAAAAAAAAAAABdjMENgDAMA/9MYfHvAhmABRgApSW0EoVITdiflie/s31ywM2XEFbn5hNQ1NwIVRPXwb2KknQot/aQ2Yu07eA0tIWrSW+d7TTqEH53wC7xyfQhcFkmzN84v1BLBwjWD7asWQAAAHoAAABQSwMEFAAICAgAAAAhAAAAAAAAAAAAAAAAABEAEQBzY3JpcHRzL1N0b3AueWFtbFVUDQAHAAAAAAAAAAAAAAAAVYzBDYAgEAT/VrHxTwNXgA1YgDnwhETgjIf9C/78ze5O1qFyEcLa9JqApNaMkDVwHtwrL0GHUbWHyC3JvR0chrZwNultYzuNOrj/G7CLfyJ9CBSLhHls8wtQSwcI9F9rrFkAAAB3AAAAUEsBAhQAFAAICAgAAAAhAN+PlsFJAAAAkAAAACAACQAAAAAAAAAAAAAAAAAAAGNvbmZpZy9ob3N0X3ZhcnMvcmVtb3RlLWhvc3QueW1sVVQFAAcAAAAAUEsBAhQAFAAICAgAAAAhAB/HAbISAAAAFQAAABAACQAAAAAAAAAAAAAAqAAAAGNvbmZpZy9pbnZlbnRvcnlVVAUABwAAAABQSwECFAAUAAgICAAAACEAKwqrZFkAAAB6AAAAEgAJAAAAAAAAAAAAAAAJAQAAc2NyaXB0cy9BZG9wdC55YW1sVVQFAAcAAAAAUEsBAhQAFAAICAgAAAAhAGLUN2RcAAAAhgAAABYACQAAAAAAAAAAAAAAswEAAHNjcmlwdHMvQ29uZmlndXJlLnlhbWxVVAUABwAAAABQSwECFAAUAAgICAAAACEA5J0STlgAAAB9AAAAEwAJAAAAAAAAAAAAAABkAgAAc2NyaXB0cy9DcmVhdGUueWFtbFVUBQAHAAAAAFBLAQIUABQACAgIAAAAIQClD8z/VwAAAH0AAAATAAkAAAAAAAAAAAAAAA4DAABzY3JpcHRzL0RlbGV0ZS55YW1sVVQFAAcAAAAAUEsBAhQAFAAICAgAAAAhAA9yCNVbAAAAgAAAABQACQAAAAAAAAAAAAAAtwMAAHNjcmlwdHMvSW5zdGFsbC55YW1sVVQFAAcAAAAAUEsBAhQAFAAICAgAAAAhANYPtqxZAAAAegAAABIACQAAAAAAAAAAAAAAZQQAAHNjcmlwdHMvU3RhcnQueWFtbFVUBQAHAAAAAFBLAQIUABQACAgIAAAAIQD0X2usWQAAAHcAAAARAAkAAAAAAAAAAAAAAA8FAABzY3JpcHRzL1N0b3AueWFtbFVUBQAHAAAAAFBLBQYAAAAACQAJAKQCAAC4BQAAAAA=", "system_properties": {"resourceManagerId": {"type": "string", "value": "brent"}, "resourceId": {"type": "string", "value": "406ad03b-400a-42e1-9bb2-ea26fa6ab5bc"}, "metricKey": {"type": "string", "value": "9e4ae98a-a45f-44dc-ad77-003ea90f3792"}, "requestId": {"type": "string", "value": "fcf42564-3894-4d13-a18e-8e2e19fc24f5"}, "resourceName": {"type": "string", "value": "CreateSimpleDummy1__dummy__1"}, "deploymentLocation": {"type": "string", "value": "RHOSP"}, "resourceType": {"type": "string", 
"value": "resource::dummy-vnfc-dummy::1.0"}}, "resource_properties": {}, "request_properties": {}, "associated_topology": {}, "deployment_location": {"objectGroupId": "bb111383-be83-486d-870f-fccde62ffbe5", "resourceManager": "brent", "name": "RHOSP", "type": "Openstack", "properties": {}}, "tenant_id": "ac0b2851-22e0-45d5-ba2a-4f4b8765f910", "logging_context": {"tracectx.tenantid": "ac0b2851-22e0-45d5-ba2a-4f4b8765f910", "tracectx.processid": "a1cb8b8b-0e58-40a1-aa40-3e66f3a0ab85", "tracectx.transactionid": "6db673e7-e6e6-40db-b026-6a186e979186", "tracectx.taskid": "6"}}, Record partition - 0, Record offset - 10, Headers - RecordHeaders(headers = [], isReadOnly = false)

I tried on the latest ZC version - 1.3.35 - and it is still the same
Zerocodeversion1.3.23.log
Zerocodeversion1.3.28.log
Zerocodeversion1.3.35.log

@ENate

ENate commented Nov 17, 2023

Okay no worries. I am checking the issues list to see which of those I can work on :)

@omkar-shitole
Collaborator

omkar-shitole commented Sep 16, 2024

Hi,

Is this issue still open? I would like to look into it.

I’ve been reviewing the repo over the past few days and found some relevant details about this issue. To support reading AVRO messages via the JSON consume functionality, some changes were made to the ConsumerJsonRecord class. Previously, before version 1.3.28, this class had two key variables: key (of a generic type) and jsonKey (of type JsonNode). Starting from version 1.3.28, only key (of type JsonNode) remains in this class.

In the toString() method of ConsumerJsonRecord, the jsonKey of type JsonNode is returned wrapped with the '\'' character.
IMHO, the record key here is nothing but a UUID, so we need to check how this condition can be handled.
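
To make the shape change concrete, here is an illustrative sketch of the two field layouts described above; the class names are hypothetical stand-ins, not the actual Zerocode source:

```java
import com.fasterxml.jackson.databind.JsonNode;

// Pre-1.3.28 shape: two key fields, one raw (generic) and one parsed as JSON.
class ConsumerJsonRecordPre1328<K> {
    K key;             // raw record key of any type (e.g. a plain UUID string)
    JsonNode jsonKey;  // key parsed as JSON, when applicable
    JsonNode value;
}

// 1.3.28+ shape: a single key field that is expected to be JSON-parseable.
class ConsumerJsonRecordFrom1328 {
    JsonNode key;
    JsonNode value;
}
```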

I have some local changes that address this parsing error, and I can raise a pull request for them.

Additionally, I have a fix for a test case related to KafkaFilter, specifically testConsumerFilterByJsonPath(), which was failing with a JsonMappingException. The error is:

Unexpected character java.lang.RuntimeException: com.fasterxml.jackson.databind.JsonMappingException: Unexpected character ('t' (code 116)): was expecting double-quote to start field name
 at [Source: (String)"[{topic=demo-p6, partition=0, offset=2, tim.........
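
The shape of that string suggests a record's plain toString() output (unquoted field names like topic=demo-p6) being fed to a JSON parser. A minimal hedged sketch of that kind of failure (the exact exception type in the real test may differ):

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.LinkedHashMap;
import java.util.Map;

public class FilterParseFailureDemo {
    public static void main(String[] args) throws Exception {
        Map<String, Object> record = new LinkedHashMap<>();
        record.put("topic", "demo-p6");
        record.put("partition", 0);
        record.put("offset", 2);
        // record.toString() is "{topic=demo-p6, partition=0, offset=2}" - not JSON.
        // Parsing it fails with "Unexpected character ('t' ...): was expecting
        // double-quote to start field name".
        new ObjectMapper().readTree(record.toString());
    }
}
```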

Let me know if you would like me to proceed with these changes.

@bfarrell-ibm
Author

Hi - I believe the issue is still present - I am not aware of any other changes that have gone in which may have resolved it. If you have a fix to propose, I'd suggest implementing it.
rgds,
Brendan

@omkar-shitole
Collaborator

Yes sure,
I have raised a PR related to this issue.
You may find it below:
#681

@nirmalchandra
Collaborator

nirmalchandra commented Sep 17, 2024

Thanks for the PR @omkar-shitole .
Here the CI build is in progress: https://github.com/authorjapps/zerocode/pull/681/checks

@nirmalchandra
Collaborator

Hello @omkar-shitole ,
Once CI turns green, can you explain the problem in 1 or 2 lines, and then the fix in 1 or 2 lines? That's it.
This is for all collaborators' awareness and to help fix any future problems that arise with the Kafka consumer.

Thanks for understanding 👋

@omkar-shitole
Collaborator

Hello,
I've investigated the JsonParseException issue ("Unrecognized token 'd1e778d8'...") by analyzing version changes in ConsumerJsonRecord:

Version comparison:
v1.3.27: ConsumerJsonRecord handled two key types: 'key' (Object) and 'jsonKey' (JsonNode).
v1.3.28 and later: Refactored to use a single universal 'key' of type JsonNode.

Root cause: The error occurs when an Object-type key is passed to ConsumerJsonRecord in v1.3.28, which expects a JsonNode and raises an exception.
This PR modifies the ConsumerJsonRecord constructor to accept a key of type Object. It then converts the key to JsonNode type, ensuring consistent behavior across different key types. This approach addresses the backward compatibility issue while maintaining the desired JsonNode representation internally.
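
A hedged sketch of that constructor approach (illustrative only; names and structure are not taken from the actual PR):

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

class ConsumerJsonRecordSketch {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    private final JsonNode key;
    private final JsonNode value;

    // Accept any key type and normalize it to a JsonNode internally, so a
    // plain-string UUID key no longer needs to be valid JSON on its own.
    ConsumerJsonRecordSketch(Object key, JsonNode value) {
        this.key = MAPPER.valueToTree(key);
        this.value = value;
    }

    JsonNode getKey() {
        return key;
    }

    JsonNode getValue() {
        return value;
    }
}
```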

@omkar-shitole
Collaborator

omkar-shitole commented Sep 17, 2024

Correction to the above: I previously missed the fact that there is a newly added method isKeyParseableAsJson() in KafkaConsumerHelper, which already addresses the root cause and the exception observed with version 1.3.35.
So, IMHO, the changes proposed in the ConsumerJsonRecord PR may no longer be necessary. However, the unit tests included in the PR could still be beneficial, with some corrections.

I would kindly suggest that @bfarrell-ibm ensure the record keys are functioning as expected. Once this is confirmed, it seems the ticket could be considered resolved.

@nirmalchandra
Collaborator

OK, understood @omkar-shitole. Thanks for explaining this. I think it's version 1.3.43 (not 1.3.35) onwards that has this "isKeyParseableAsJson()" method. But you're right, let's wait for @bfarrell-ibm to confirm whether this resolves the issue.

We can do a couple of things now:

  1. Keep this PR parked (don't delete it for now; let it stay open for a few more weeks until we are fully sure about the fix)
  2. Raise a new PR with the unit tests (if you would like to cover a few more tests, feel free to do that)
  3. @bfarrell-ibm: Can you please use version 1.3.44 (latest as of 17/Sep/24) and confirm that your issue has been resolved?
    -- Release (1.3.43) details here: https://github.com/authorjapps/zerocode/releases/tag/1.3.43
    -- Once you confirm, we will act accordingly

@omkar-shitole
Collaborator

omkar-shitole commented Sep 18, 2024

> OK, understood @omkar-shitole. Thanks for explaining this. I think it's version 1.3.43 (not 1.3.35) onwards that has this "isKeyParseableAsJson()" method. But you're right, let's wait for @bfarrell-ibm to confirm whether this resolves the issue.
>
> We can do a couple of things now:
>
>   1. Keep this PR parked (don't delete it for now; let it stay open for a few more weeks until we are fully sure about the fix)
>   2. Raise a new PR with the unit tests (if you would like to cover a few more tests, feel free to do that)
>   3. @bfarrell-ibm: Can you please use version 1.3.44 (latest as of 17/Sep/24) and confirm that your issue has been resolved?
>     -- Release (1.3.43) details here: https://github.com/authorjapps/zerocode/releases/tag/1.3.43
>     -- Once you confirm, we will act accordingly

Yes. By referring to version 1.3.35, I meant to highlight that the particular exception observed in the attached 1.3.35 log file in #557 (comment) is handled by the inclusion of this isKeyParseableAsJson() method in 1.3.43.
This exception was thrown by the readTree() method when a string that is not in an appropriate JSON format is passed as the argument. For such cases, writeValueAsString() is a more adequate way to handle it.
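
A hedged sketch of that readTree()/writeValueAsString() fallback, assuming the helper first checks whether the raw key parses as JSON and otherwise serializes it into a JSON string (this is not the actual KafkaConsumerHelper code):

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

class KeyParsingFallbackSketch {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Parse the key if it is already valid JSON; otherwise wrap it as a JSON string.
    static JsonNode toJsonKey(String rawKey) throws Exception {
        if (isKeyParseableAsJson(rawKey)) {
            return MAPPER.readTree(rawKey);
        }
        // e.g. a bare UUID like "b98bd601-...": serialize to a quoted JSON string first.
        return MAPPER.readTree(MAPPER.writeValueAsString(rawKey));
    }

    static boolean isKeyParseableAsJson(String rawKey) {
        try {
            MAPPER.readTree(rawKey);
            return true;
        } catch (Exception e) {
            return false;
        }
    }
}
```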
I'll keep the PR pending and will raise a new one to include test cases covering such scenarios.

@bfarrell-ibm
Author

bfarrell-ibm commented Sep 18, 2024

Hi.

Running on 1.3.43 appears to include a fix for the issue. Running on 1.3.44 is also fine.

Looking at the closed issues, I think the following issue was similar to mine:

#657

Some entries now appear in the logs when running the test:

2024-09-18 11:32:12,265 [main] INFO  org.jsmart.zerocode.core.kafka.helper.KafkaConsumerHelper - >>>The key was not in a parsable JSON format:b98bd601-f442-482c-8197-bb1caa93318d
2024-09-18 11:32:12,266 [main] INFO  org.jsmart.zerocode.core.kafka.helper.KafkaConsumerHelper - >>>Converting the key to JSON format for to able to read it

Thanks to everyone for your assistance. I think this issue can be closed.

@authorjapps
Owner

authorjapps commented Sep 18, 2024

Thanks @bfarrell-ibm for the confirmation. I appreciate that you've captured the log entries too as evidence 👍.
Thanks everyone for your analysis and for taking the time to look into it.

Closing this issue now. 🔐

Improvement PRs related to unit test coverage are welcome 👋. When ready, please watch the CI status and tag the collaborators.


@omkar-shitole, I've added you as a collaborator now so you can review other PRs and enable/approve CI builds in this repo.
Please accept the invitation when you get a chance.

@omkar-shitole
Collaborator

> Thanks @bfarrell-ibm for the confirmation. I appreciate that you've captured the log entries too as evidence 👍. Thanks everyone for your analysis and for taking the time to look into it.
>
> Closing this issue now. 🔐
>
> Improvement PRs related to unit test coverage are welcome 👋. When ready, please watch the CI status and tag the collaborators.
>
> @omkar-shitole, I've added you as a collaborator now so you can review other PRs and enable/approve CI builds in this repo. Please accept the invitation when you get a chance.

Thank you so much for adding me as a collaborator! I’ve accepted the invitation and will start reviewing the PRs and enabling/approving CI builds as needed.
Work on unit test coverage is in progress; I'll raise a PR once it is finished.
