Fetch size parameter discussion #82

Open
yymoxiaochi opened this issue Sep 10, 2020 · 11 comments

@yymoxiaochi

Hi Erdem,
As far as I know, when fetchSize > 1 but only one redo SQL statement is pending, LogMiner cannot deliver that statement to the connector because the DB fetch size has not been reached.
However, when fetchSize = 1 and the volume of SQL is particularly high, the connector captures very slowly. Is there any way to solve this problem, similar to the Kafka producer's "batch.size" and "linger.ms" configuration?
Thanks.
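
For reference, the producer settings mentioned above work roughly as follows: `batch.size` caps how many bytes are buffered per partition, and `linger.ms` bounds how long the producer waits before sending a partially filled batch. A minimal sketch (broker address and topic name are placeholders):

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ProducerBatchingSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("batch.size", "16384"); // batch up to 16 KB per partition...
        props.put("linger.ms", "100");    // ...but send a partial batch after at most 100 ms

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("example-topic", "key", "value"));
        }
    }
}
```

The request in this issue is effectively the same trade-off on the fetch side: an upper bound on batch size plus an upper bound on wait time.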

@erdemcer
Owner

Hi,
As you said, when the fetch size is greater than 1, the connector waits until the result set of the Oracle LogMiner view query returns a number of records equal to the fetch size.
As I understand it, you would like to be able to capture data with a higher fetch size within some specific duration, even when the number of records required for a fetch has not been acquired. Am I right?
Thanks
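
For context, the waiting comes from the JDBC fetch size set on the LogMiner view query: the driver requests rows from the server in blocks of that size, so `ResultSet.next()` can block until a full block is available. A minimal sketch, with connection details and the column list simplified (illustrative only, not the connector's actual code):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class LogMinerFetchSizeSketch {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//db-host:1521/ORCL", "user", "password")) {
            String sql = "SELECT scn, operation, table_name, sql_redo "
                       + "FROM v$logmnr_contents "
                       + "WHERE operation IN ('INSERT','UPDATE','DELETE')";
            try (PreparedStatement stmt = conn.prepareStatement(sql)) {
                // With a fetch size of 100 the driver waits for a block of 100 rows
                // from the server; a single pending change is therefore not seen
                // until enough further changes arrive (or the cursor ends).
                stmt.setFetchSize(100);
                try (ResultSet rs = stmt.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getLong("scn") + " " + rs.getString("sql_redo"));
                    }
                }
            }
        }
    }
}
```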

@yymoxiaochi
Author

yymoxiaochi commented Sep 11, 2020

Yes, you are right. If there are only 10 records and the fetch size is 100, I want to be able to fetch those 10 records immediately, or have the data captured so far returned once a specified time has elapsed, even if the fetch size has not been reached.

@yymoxiaochi
Author

Hi, is there any solution for this?

@erdemcer
Owner

Hi,
It is possible, but of course it requires development. I am planning out the details of how to achieve this.
Thanks.

@yymoxiaochi
Author

Is there any general direction?
In my test, if DBMS_LOGMNR.START_LOGMNR is started with an endScn, I can capture all the SQL in that range.
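
For illustration, starting LogMiner with an explicit SCN range from JDBC might look roughly like the sketch below; the SCN values and connection details are placeholders, and the options shown (`DICT_FROM_ONLINE_CATALOG`, `CONTINUOUS_MINE`) are only an assumption about how the session is configured:

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;

public class StartLogMinerWithEndScn {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//db-host:1521/ORCL", "user", "password")) {
            // Bound the mining session with both a start and an end SCN so every
            // change in the range is returned, regardless of fetch size.
            String plsql =
                "BEGIN "
              + "DBMS_LOGMNR.START_LOGMNR("
              + "  STARTSCN => ?, "
              + "  ENDSCN   => ?, "
              + "  OPTIONS  => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG "
              + "            + DBMS_LOGMNR.CONTINUOUS_MINE); "
              + "END;";
            try (CallableStatement stmt = conn.prepareCall(plsql)) {
                stmt.setLong(1, 1000000L); // placeholder start SCN
                stmt.setLong(2, 1000500L); // placeholder end SCN
                stmt.execute();
            }
        }
    }
}
```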

@erdemcer
Owner

What do you mean by a general direction?

@yymoxiaochi
Author

"I am planning some details to achieve this.", Could you tell me in what way you intend to solve this problem? Overridden some of the oralce JDBC implementation methods? :)

@erdemcer
Owner

Not exactly. I am planning to add a timer that enforces a specified duration and propagates records that have not yet been sent to the Kafka topic within the JDBC fetch process.
Thanks.
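
One possible shape for such a timer, sketched very roughly: accumulate fetched records and flush them either when a size threshold is reached or when a maximum wait elapses, whichever comes first. The names `maxBatchSize` and `maxWaitMs` are invented for illustration and are not existing connector configuration:

```java
import java.util.ArrayList;
import java.util.List;

/** Illustrative size-or-time batching, analogous to batch.size + linger.ms. */
public class TimedBatcher<T> {
    private final int maxBatchSize;   // hypothetical size limit
    private final long maxWaitMs;     // hypothetical time limit
    private final List<T> buffer = new ArrayList<>();
    private long batchStartMs = System.currentTimeMillis();

    public TimedBatcher(int maxBatchSize, long maxWaitMs) {
        this.maxBatchSize = maxBatchSize;
        this.maxWaitMs = maxWaitMs;
    }

    /** Add a record; returns a batch to send if a limit was reached, otherwise null. */
    public List<T> add(T record) {
        buffer.add(record);
        return maybeFlush();
    }

    /** Call periodically from the poll loop so a partial batch is not held forever. */
    public List<T> maybeFlush() {
        boolean full = buffer.size() >= maxBatchSize;
        boolean expired = !buffer.isEmpty()
                && System.currentTimeMillis() - batchStartMs >= maxWaitMs;
        if (!full && !expired) {
            return null;
        }
        List<T> batch = new ArrayList<>(buffer);
        buffer.clear();
        batchStartMs = System.currentTimeMillis();
        return batch;
    }
}
```

In a Kafka Connect source task, something like this would sit between the JDBC result-set loop and the records returned from poll(), so a lone record is handed to Kafka after at most maxWaitMs instead of waiting for the fetch size to fill.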

@yymoxiaochi
Author

yymoxiaochi commented Sep 15, 2020

Got it, thanks. Looking forward to your next update!

@taotaizhu-pw

> Got it, thanks. Looking forward to your next update!

Any update?

@taotaizhu-pw
Copy link

> Not exactly. I am planning to add a timer that enforces a specified duration and propagates records that have not yet been sent to the Kafka topic within the JDBC fetch process. Thanks.

Any update on this? I have found that fetching the data one row at a time performs poorly.
