Replies: 12 comments 1 reply
-
Now I can see CloudWatch logs on an hourly interval, but I got the error below:
-
@tiru1930 Could you provide the code you used to produce this error here?
-
@icywang86rui Here is the code:
-
I have noticed that when the endpoint was deployed, csv_content_types is not specified in your DataCaptureConfig. Have you tried adding it?
```python
xgb_predictor = framework_xgb.deploy(
    initial_instance_count=1,
    instance_type='ml.m4.xlarge',
    endpoint_name=endpoint_name,
    data_capture_config=DataCaptureConfig(
        enable_capture=True,
        sampling_percentage=100,
        destination_s3_uri='s3://{}/{}'.format(bucket, data_capture_prefix)
    )
)
```
-
Yes, I did.
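For reference, a minimal sketch of the same deploy call with the capture content types declared explicitly. This is only an illustration: the parameter names come from the SageMaker Python SDK's DataCaptureConfig, and the bucket, prefix, and endpoint variables are assumed to be defined as in the snippet above.
```python
from sagemaker.model_monitor import DataCaptureConfig

# Sketch: declare the content types the endpoint exchanges so the capture
# records are tagged as CSV (and/or JSON) instead of being guessed.
data_capture_config = DataCaptureConfig(
    enable_capture=True,
    sampling_percentage=100,
    destination_s3_uri='s3://{}/{}'.format(bucket, data_capture_prefix),
    csv_content_types=['text/csv'],
    json_content_types=['application/json'],
)

xgb_predictor = framework_xgb.deploy(
    initial_instance_count=1,
    instance_type='ml.m4.xlarge',
    endpoint_name=endpoint_name,
    data_capture_config=data_capture_config,
)
```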
-
I see that you have configured DataCaptureConfig, but csv_content_types is missing here.
-
@tiru1930, did you have any progress here? I'm having the same issue, but my error says that the input encoding is JSON and the output encoding is CSV. I've tried setting csv_content_types to both text/csv and application/json, but neither worked.
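As a debugging aid (not from this thread), a minimal sketch that reads the captured records from S3 and prints the observed content type and encoding of each request and response, which shows what the endpoint is actually returning. The bucket and prefix are placeholders for the destination_s3_uri used in the DataCaptureConfig.
```python
import json
import boto3

s3 = boto3.client("s3")
bucket = "my-bucket"                 # placeholder
prefix = "datacapture/my-endpoint/"  # placeholder

# Data capture writes JSON Lines files; each line holds one request/response pair.
resp = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
for obj in resp.get("Contents", []):
    if not obj["Key"].endswith(".jsonl"):
        continue
    body = s3.get_object(Bucket=bucket, Key=obj["Key"])["Body"].read().decode("utf-8")
    for line in body.splitlines():
        capture = json.loads(line)["captureData"]
        print(
            obj["Key"],
            capture["endpointInput"].get("observedContentType"),
            capture["endpointInput"].get("encoding"),
            capture["endpointOutput"].get("observedContentType"),
            capture["endpointOutput"].get("encoding"),
        )
```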
-
No luck, @lsabreu96.
-
@tiru1930, sorry for bothering you again, but I couldn't find anything in our code; did you change how XGBoost returns your data in some way?
-
@icywang86rui, could you help us out here, please?
-
I'm having the same issue with model monitor executions. When listing the latest model monitor executions, I'm finding an ExitMessage. I've tried setting the content type, and the baseline job is CSV as well, but the endpoint output must still be giving back JSON to the model monitor processing job, given the ExitMessage and the data capture found on S3. This is essentially the same issue I'm having: https://stackoverflow.com/questions/65836468/aws-sagemaker-customererror-encoding-mismatch-when-monitoring-input
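For anyone checking the same thing, a short sketch of pulling the latest monitoring executions and their ExitMessage with boto3; the schedule name is a placeholder, not a value from this thread.
```python
import boto3

sm = boto3.client("sagemaker")
schedule_name = "my-monitoring-schedule"  # placeholder

summaries = sm.list_monitoring_executions(
    MonitoringScheduleName=schedule_name,
    SortBy="ScheduledTime",
    SortOrder="Descending",
    MaxResults=5,
)["MonitoringExecutionSummaries"]

for summary in summaries:
    print(summary["ScheduledTime"], summary["MonitoringExecutionStatus"])
    job_arn = summary.get("ProcessingJobArn")
    if job_arn:
        # The underlying processing job carries failure details,
        # e.g. an encoding-mismatch CustomerError, in ExitMessage.
        job = sm.describe_processing_job(ProcessingJobName=job_arn.split("/")[-1])
        print(job.get("ExitMessage"))
```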
-
Has anyone solved this issue?
-
I deployed an XGBoost framework service and was able to hit the endpoint with a SageMaker runtime session. I then updated the XGBoost endpoint to capture the I/O data, created a baseline, and scheduled a monitoring job. After that I invoked the service, but the monitoring execution summaries are empty and I have not seen anything in S3 either.
To reproduce
1. Create an XGBoost framework deployment.
2. Update the endpoint with data capture enabled.
3. Create a baseline.
4. Schedule a monitoring job.
5. Invoke the endpoint.
Expected behavior
I should get monitoring results.
Screenshots or logs
[screenshot: results]
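For context, a hedged sketch of steps 3 and 4 (creating a baseline and scheduling the monitor) with the SageMaker Python SDK's DefaultModelMonitor. The role, bucket, prefixes, schedule name, and the assumption that the baseline dataset is a headered CSV are all placeholders, not details from this thread.
```python
from sagemaker.model_monitor import CronExpressionGenerator, DefaultModelMonitor
from sagemaker.model_monitor.dataset_format import DatasetFormat

monitor = DefaultModelMonitor(
    role=role,  # assumed execution role
    instance_count=1,
    instance_type="ml.m5.xlarge",
    volume_size_in_gb=20,
    max_runtime_in_seconds=3600,
)

# Step 3: suggest a baseline from the training data (assumed CSV with a header).
monitor.suggest_baseline(
    baseline_dataset="s3://{}/{}/train.csv".format(bucket, baseline_prefix),
    dataset_format=DatasetFormat.csv(header=True),
    output_s3_uri="s3://{}/{}".format(bucket, baseline_results_prefix),
    wait=True,
)

# Step 4: schedule an hourly data-quality monitoring job against the endpoint.
monitor.create_monitoring_schedule(
    monitor_schedule_name="xgb-data-quality-schedule",  # placeholder
    endpoint_input=endpoint_name,
    output_s3_uri="s3://{}/{}".format(bucket, monitor_reports_prefix),
    statistics=monitor.baseline_statistics(),
    constraints=monitor.suggested_constraints(),
    schedule_cron_expression=CronExpressionGenerator.hourly(),
    enable_cloudwatch_metrics=True,
)
```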