
This package seems to no longer work; I'm getting these error messages #37

Open
cchorn opened this issue Oct 13, 2018 · 14 comments

@cchorn

cchorn commented Oct 13, 2018

[screenshot of the error messages]

@smorrel1

+1 on this. Any help appreciated please :)

I wonder if Dukascopy recently started requiring a logon, which broke this? The behaviour in incognito mode supports this theory, i.e. when pressing 'Download' in incognito mode for the first time you're asked to log on, but not on subsequent downloads. See this note on incognito.
`res = await loop.run_in_executor(None, lambda: requests.get(url, stream=True))` on line 24 of fetch.py doesn't send cookies. Here are the docs, but I've not found a way to send them yet.

However sometimes this code has worked for 1 day at a time, so I'm not sure....
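In case it helps anyone experimenting: a minimal sketch of what sending cookies through that call could look like. The cookie name and value are placeholders you would copy from a logged-in browser session (DevTools -> Application -> Cookies); they are not defined anywhere in duka.

```python
# Sketch only: the same pattern as fetch.py line 24, plus a cookies= argument
# so the session cookies ride along with each request.
import asyncio
import requests

# Hypothetical cookie copied from a logged-in Dukascopy browser session.
DUKASCOPY_COOKIES = {
    "JSESSIONID": "<value copied from the browser>",
}

async def fetch_with_cookies(url):
    loop = asyncio.get_running_loop()
    res = await loop.run_in_executor(
        None,
        lambda: requests.get(url, stream=True, cookies=DUKASCOPY_COOKIES),
    )
    return res
```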

@cjlv91

cjlv91 commented Oct 21, 2018

Getting the same errors, anyone have potential solutions?

Could anyone with sophisticated enough knowledge incorporate a login into the tool, i.e. bypass the errors by authenticating with an account? I just registered an account (not a trading account) solely to download the data, and downloading directly from their website works wonderfully.

@smorrel1

What about logging in manually and then including the session cookies with the `requests.get()` call?

@cjlv91

cjlv91 commented Oct 21, 2018

Well, it seems the problem might lie with Dukascopy's own servers: I cannot download data for more than 1 day back on their website, even with a proper manual login. For instance, I tried to download Brent 15-minute data for 4 months; it doesn't load, just stuck at "50%" for hours now. This seems to occur at the same time the Python tool gives those errors about no data being fetched.

Might it be something with their servers?

Is there a limit on how much data you can download from their website with a proper manual login?

@smorrel1

smorrel1 commented Oct 21, 2018

I could download data from 2003 for some pairs, and two weeks at a time from a month back, but I got asked to log in at apparently random intervals. That suggests the server is fine and logging in is the problem, provided each download is no longer than, say, 10 days. Possibly they introduced throttling as well as a logon requirement to thwart bulk downloads.

If anyone knows how to get cookies out of Chrome's SQLite cookie database, I can amend the code to send them and open a pull request.
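Not tested against Dukascopy, but one option might be the third-party browser_cookie3 package, which reads Chrome's local cookie store and returns a cookie jar that requests accepts directly. The domain filter and URL below are just assumptions/placeholders.

```python
# Sketch: pull dukascopy.com cookies out of the local Chrome profile with the
# third-party browser_cookie3 package (pip install browser-cookie3).
import browser_cookie3
import requests

cookie_jar = browser_cookie3.chrome(domain_name="dukascopy.com")

url = "https://www.dukascopy.com/"  # placeholder, not the tick-data URL duka builds
res = requests.get(url, stream=True, cookies=cookie_jar)
print(res.status_code)
```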

@nick2012

nick2012 commented Nov 5, 2018

@smorrel1 can you please implement it for Firefox cookies? Just add an argument for the cookie-file path. I've tried to do it myself, but I'm not that advanced and could not make it work. I guess this is the way to do it: https://stackoverflow.com/questions/49502254/how-to-import-firefox-cookies-to-python-requests
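For what it's worth, here is a rough sketch along the lines of that Stack Overflow answer: read Firefox's cookies.sqlite directly and build a plain dict for requests. The profile path, domain filter, and URL are examples, not duka options.

```python
# Sketch: load Dukascopy cookies from a Firefox profile's cookies.sqlite.
import sqlite3
import requests

def load_firefox_cookies(cookiefile, domain="dukascopy.com"):
    """Return a {name: value} dict of cookies whose host ends with `domain`."""
    con = sqlite3.connect(cookiefile)
    try:
        rows = con.execute(
            "SELECT name, value FROM moz_cookies WHERE host LIKE ?",
            ("%" + domain,),
        ).fetchall()
    finally:
        con.close()
    return dict(rows)

cookies = load_firefox_cookies("/path/to/firefox/profile/cookies.sqlite")
url = "https://www.dukascopy.com/"  # placeholder for the URL duka builds
res = requests.get(url, stream=True, cookies=cookies)
```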

@smorrel1

smorrel1 commented Nov 5, 2018

Hi, after some investigation I might have misdiagnosed the problem... You could try using JFOREX to download historic data quite easily instead; it's what worked best for me.

@nick2012

nick2012 commented Nov 5, 2018

@smorrel1 Can you please guide me on how to use JFOREX? I need a stock data feed.

@milo-hyben

Try limiting the number of threads; the default is 10. I am using 2 threads and it seems to work, anything above 2 throws the exception shown in the first comment.
Example of a working call:

`duka EURUSD -s 2017-01-01 -e 2017-12-31 -t 2`

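For anyone scripting this for several symbols, a small sketch that reuses the same flags with `-t 2` (assumes duka is on PATH; the symbol list is just an example):

```python
# Sketch: run duka for a few symbols one after another, each limited to
# 2 threads, using the same -s/-e/-t flags as the call above.
import subprocess

symbols = ["EURUSD", "GBPUSD", "USDJPY"]  # example list
for symbol in symbols:
    subprocess.run(
        ["duka", symbol, "-s", "2017-01-01", "-e", "2017-12-31", "-t", "2"],
        check=True,
    )
```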
@tijo45

tijo45 commented May 8, 2019

@milo-hyben thanks! that worked for me.

@silviojaeger

@nick2012 JFOREX works fine for downloading historical data. Just open a demo account on Dukascopy, download and install JFOREX, and sign in with your demo account.
In the menu bar click on "View" --> "Historical Data Manager".
In the popup window you can configure your settings and download everything; it's pretty straightforward.
I downloaded the complete Dukascopy dataset in a few hours.

@hn2

hn2 commented Aug 12, 2019

Does it still work for anyone? I get Request failed ... even with 1 thread

@alanpang1990301

not working

@ipray4

ipray4 commented Jun 14, 2021

> Try limiting the number of threads; the default is 10. I am using 2 threads and it seems to work, anything above 2 throws the exception shown in the first comment.
> Example of a working call:
>
> `duka EURUSD -s 2017-01-01 -e 2017-12-31 -t 2`

Wow, it really works... Thanks a lot!
