Quandl Documentation

Quandl: Bitcoin Prices and Charts - I find myself using this quite a bit. If you have any data, please send it over to them.

submitted by mastermind1228 to Bitcoin [link] [comments]

Anyone else notice that bitcoin is the first suggested example of data to search for on quandl.com?

submitted by fiat_sux2 to Bitcoin [link] [comments]

Just the Facts: Bitcoin Data on Quandl

submitted by BTCwarrior to Bitcoin [link] [comments]

New exportable Quandl Datasets and stats for PimpCash / Bitcoin Rates

submitted by PimpCash to PimpCash [link] [comments]

The Sunday dip

I was interested in whether the famous Sunday dip was a real thing, or a delusion based on false perception reinforced by everyone talking about it.
I chose Bitcoin as the underlying asset because of its unbroken and consistent market dominance, the availability of its price data over time, and the general consensus that Bitcoin tends to move in the direction of the whole market.
I took the historical Bitcoin data from Quandl and used R for the analysis.
Plot1 shows daily Bitcoin performance over time broken down by weekday. Interestingly, Sunday starts off in the middle of the pack performance-wise before really starting to drop around mid 2017.
Saturday and Friday, on the other hand, are the "leading days" in terms of performance. Looking at the result, I wondered if a simple "sell the Saturday dip" strategy would have resulted in significantly better performance than just holding over the observed time frame.
Plot2 shows the result of said strategy (selling at the close of each Saturday and buying at the close of each Sunday vs. simply buying and holding from day 1). It seems that this strategy is not really advisable. The amount of BTC drops over time and was only able to recover in late 2017, when the "Sunday dips" were seemingly becoming an actual thing.
I can put the code on GitHub if anyone is interested.
As requested by a user, I had a look at the performance of selling on Saturdays / Fridays and buying back on Sundays vs holding for various coins. plot_a and plot_b.
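The weekday breakdown described in the post can be sketched in Python with pandas (the original analysis was done in R; the price series here is synthetic and purely illustrative):

```python
import numpy as np
import pandas as pd

# Hypothetical price series standing in for historical BTC data from Quandl.
rng = np.random.default_rng(0)
dates = pd.date_range('2017-01-01', periods=365, freq='D')
prices = pd.Series(1000 * np.cumprod(1 + rng.normal(0, 0.02, len(dates))), index=dates)

# Daily returns broken down by weekday (Monday=0 ... Sunday=6), as in Plot1
returns = prices.pct_change().dropna()
by_weekday = returns.groupby(returns.index.dayofweek).mean()
print(by_weekday)
```

With real data, comparing `by_weekday` across rolling windows would show whether Sunday's relative performance actually deteriorated over time.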
submitted by countbase to CryptoCurrency [link] [comments]

Historical minute-by-minute prices on bitcoin, (preferably with 2nd level prices and altcoin prices too)

I am looking for high-resolution historical prices on bitcoin and possibly altcoins. I want the most complete source out there, with the most exchanges, most pairs and historical order books, if possible. I don't mind paying for the data.
Here is what I discovered so far:
High-resolution data (better than hourly):
Hourly data (free):
Daily data:
If you know any other source, please share.
submitted by johnturtle to BitcoinMarkets [link] [comments]

Lisk Highlights, April 16th 2019: Lisk Sidechain Project's tech featured on Hackernoon.com

Hello there. Here is my selection for today's highlights and interesting items within the Lisk ecosystem and beyond.....

Lisk Sidechain Project's tech featured on Hackernoon.com

As a beginner, jumping into a new machine learning project can be overwhelming. The whole process starts with picking a data set, and then studying that data set to find out which class or type of machine learning algorithm will fit it best. An article published yesterday on Hackernoon.com by Michiel Mulders dealt with this initial stumbling block and provided five example projects that teach you how to use ML algorithms, tune them, and analyze the given data. The piece was titled "Top 5 Machine Learning Projects for Beginners".
GNY, the project from the team bringing Machine Learning to Lisk in the form of the LML sidechain (Lisk Machine Learning), was featured for its ability to offer businesses an affordable way to unlock hidden value in their data while maintaining tight security. As the article stated, "the machine learning platform is actually embedded within a blockchain, so a user’s data is protected from potential hacks". The inherent structure of a blockchain helps to control data consistency and allows users to remain in control of their data.
Writing about the recent video demonstration of the GNY codebase, showcasing the first ever successful integration of a machine learning platform into a DPoS network, Michiel Mulders said that "this demo is a fun starter project for people who want to predict simple numbers, and the full platform launching this Summer should provide developers with much more power and customization." The video demonstration can be found HERE, and its accompanying step-by-step instructions for how to set up the system on your computer are HERE.
Other projects and datasets included in the Hackernoon article were.....
The MovieLens data sets for movie ratings from GroupLens, a research lab in the Department of Computer Science and Engineering at the University of Minnesota. Useful for building a recommender system in Python.
Quandl, a source for financial, economic, and alternative datasets serving investment professionals, which can be used to predict future prices using fundamental and technical indicators.
You can read the full Hackernoon article HERE. If you have the time please clap up the article to show your support for a future Lisk sidechain.
That's it for today's highlights.
These highlight posts also go out daily on the….
LISK Highlights exclusive Telegram group: https://t.me/LiskHighlights
LISK Highlights Twitter : https://twitter.com/HighlightsLisk
They are also included in my weekly roundup on the LISK Highlights Medium account and the Bitcoin talk forum's LISK thread, so keep an eye out for them on those outlets as well.
Keep the faith Liskers! 👍
submitted by John_Muck to Lisk [link] [comments]

Largest number of coin days destroyed?

Just out of curiosity, I'm wondering what the largest number of coin days destroyed in a single transaction is? I found a transaction moved by an early miner shortly after the Bitcoin Cash fork, but it was only 50 BTC, so the coin days destroyed is not that high. This chart shows 173M coin days destroyed on March 7th of 2014, but the chart stopped updating in 2016 (probably due to Blockchain.info no longer serving the data), and it doesn't identify the transaction or whether it was even a single transaction (unlikely, since that would be over 100k BTC moved from near Genesis).
So, what are the largest coin day movements we've seen by single individuals in a single transaction? Was it some exchange moving cold storage? The Gox coins?
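For reference, the coin days destroyed by a transaction is the sum, over its inputs, of the coins moved times the days since those coins last moved. A minimal sketch (values are illustrative):

```python
def coin_days_destroyed(inputs):
    # inputs: list of (btc_amount, days_since_last_moved) pairs for one transaction
    return sum(amount * days for amount, days in inputs)

# e.g. 50 BTC untouched for 3,000 days destroys 150,000 coin days when spent
print(coin_days_destroyed([(50, 3000)]))  # 150000
```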
submitted by n4ru to Bitcoin [link] [comments]

Anyone know where I can download daily bitcoin price data (ideally as a CSV) back to 2009ish?

submitted by MadBanker01 to Bitcoin [link] [comments]

What are your favorite quandl datasets?

submitted by workn00b to algotrading [link] [comments]


I've been reading this sub for a while and I've made the point that I think many of the users are in fact more delusional than the "butters" they criticize. I did not find the thoughtful or humorous criticism of bitcoin that I was looking for here. The internet creates echo chambers and this is definitely one. So this is a thread about what is actually happening. Here are a few things I've learned that I believe to be true. Feel free to disagree.
My personal conclusion from this is that most of the criticism of Bitcoin that occurs on this sub is not really that bitcoin isn't working, but that Bitcoin has not fulfilled the overly optimistic vision put forward by some of its early proponents. So, for example, it hasn't replaced credit cards, when in fact it is outperforming the expectations of its creators.
Another major criticism is that bitcoin is used for crime. I don't like drugs personally but I also know that governments that have tried to stop the drug trade have always failed and increased violence in the process. If people buy drugs online, it really is better for everyone than buying them on the street. I see the illicit use of bitcoin as part of the pioneer species of an ecosystem not the final state.
If technologies such as payment channel networks continue to develop we may see a resurgence of interest in using bitcoin for legitimate purchases.
So almost every prediction that has been touted by this sub has turned out to be wrong, and many of its most vocal members consistently resort to conspiracy theories to explain a reality that does not fit their model.
In other words: you've become what you hate.
In other words: the skullfucks will continue until morale improves.
submitted by biglambda to Buttcoin [link] [comments]

What would be the best website where I can get information like daily active address, volume, transaction fee etc. (bitcoin and other crypto)?

But I don't like bitinfocharts.com because it has some mistakes:
  1. Sometimes the number of active addresses is less than the number of "sent from" addresses (example: LTC 2016-07-29, but I see this for BTC too). This is a mistake, since in reality the number of active addresses can't be less than the number of "sent from" addresses.
  2. I don't think the bitinfocharts chart is correct: see 2018-01-04 - 2018-01-11 in https://bitinfocharts.com/comparison/activeaddresses-btc.html, https://www.quandl.com/data/BCHAIN/NADDU-Bitcoin-Number-of-Unique-Bitcoin-Addresses-Used, and https://www.blockchain.com/en/charts/n-unique-addresses?timespan=all. The second and third charts are similar to each other, but not to bitinfocharts.com.
submitted by StasEoS to Bitcoin [link] [comments]

Tom Harding: Block Size Experiment Underway

In September, 2014, a collective experiment began in the bitcoin ecosystem. Available block space (1MB) began to sometimes fall short of the space required to mine all of the transactions that would otherwise have been included.
This chart, posted earlier, shows the onset of the some-blocks-at-maximum era: http://i.imgur.com/5Gfh9CW.png
Although the average block is only about 400K, real blocks are bigger or smaller due to the random length of time between blocks (and other factors). Below, I look at how often maximum-size blocks are predicted to occur.
Recently, transactions have been confirmed at a rate of about 100000/day*. The average transaction size for the past 6000 blocks has been 545 bytes. Using these values,
txesPerMinute = 100000 / 24 / 60 = 69.4
txesInMaxBlock = 999977 / 545 = 1834
minutesToFillBlock = txesInMaxBlock / txesPerMinute = 26.4
Using the theoretical formula for the time before an inter-block interval of at least a given length **
blockChickenMinutes[x] := 10 (exp(x/10) - x/10 - 1)
we obtain
minutesBetweenFullBlocks = blockChickenMinutes[minutesToFillBlock] = 104
We currently expect a maximum-size block every 1 hour + 44 minutes, on average. If the transaction rate doubles, we should expect a maximum-size block every 14 minutes, on average. The non-linearity makes sense, because doubling the average without raising the maximum requires disproportionately more maximum-size blocks.
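The arithmetic above can be reproduced with a short Python sketch (the input values are the post's own; the formula is the author's theoretical one for a 10-minute average block time):

```python
import math

def block_chicken_minutes(x):
    # expected minutes between inter-block intervals of at least x minutes,
    # assuming a 10-minute average block time: 10 * (exp(x/10) - x/10 - 1)
    return 10 * (math.exp(x / 10) - x / 10 - 1)

txes_per_minute = 100000 / 24 / 60            # ~69.4 tx/min
txes_in_max_block = 999977 // 545             # 1834 txes
minutes_to_fill_block = txes_in_max_block / txes_per_minute  # ~26.4

print(round(block_chicken_minutes(minutes_to_fill_block)))   # ~104 minutes
```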
This estimate is understated because transaction size and submission rate have their own distributions. Using the averages of 545 bytes and 100000/day ignores the fact that for some blocks, there are unusually big and/or numerous transactions, which increases the block size variance and causes blocks over the threshold to be encountered more frequently.
These calculations are confirmed by empirical observation of the most recent 6000 blocks:
In many cases, the miner chose to create a 750KB block, which is unusually likely to be followed by another 750KB or 1MB block, because the next interval starts off with a 250KB backlog. Some backlog transactions may experience more than 1 block delay in these cases.
** This is a chicken-crossing-the-road problem. Wait time = (exp(λx) − λx − 1) / λ. Some discussion at https://github.com/nanotube/supybot-bitcoin-marketmonitopull/68.
submitted by finway to Bitcoin [link] [comments]

Some perspective on the current pessimism

The dark feeling toward the bitcoin price at the moment seems to me to lack a bit of perspective. I just ran the log linear regression based on bitstamp data, and we've only just crossed the trend line.
I'm not using the trend line as a predictive indicator. Regressions for price data are not predictive (as per basic statistics). But the trend line does give you an indication of the rate of growth. That's a log scale, so staying above the trend line for an extended period of time implies a greater than exponential growth rate. Even for a network-effect tech like bitcoin, this would be a mind-boggling amount of growth. As far as I'm aware, that sort of growth doesn't exist in a sustained way anywhere in the natural world!
If price to any degree maps to adoption then you would expect the price to remain below this trend. And that's historically been the case. After every bubble hitherto (look at the mt gox data for the earlier bubbles) it has always dropped sharply back below the line, before it starts rising again.
I'm personally looking to get back in very soon - now that we are back to reality - as soon as I see a trend pattern back up sustain itself for a few days, mebbe a week. But I wouldn't advise the same to the rest of you unless you still believe in the likelihood of further bitcoin adoption by the mainstream. I still do. :)
btw - here is some python code to create the regression in case folks be interested.
import json, requests
import pandas as pd
import numpy as np
import datetime as dt
import matplotlib.pyplot as plt
import matplotlib.dates as mdates

url = 'http://www.quandl.com/api/v1/datasets/BITCOIN/BITSTAMPUSD.json'
r = requests.get(url)
raw_data = r.json()
data = pd.DataFrame(raw_data['data'], columns=raw_data['column_names'])
data = data.reindex(index=data.index[::-1])
data['i'] = range(0, len(data))
data = data.set_index('i')
data.loc[(data['Open'] == 1.7e+308), 'Open':] = np.nan
data = data.fillna(method='pad')
data['Date'] = pd.to_datetime(data['Date'])

X = pd.DataFrame(index=data.index)
X['0'] = 1
X['1'] = data.index
Y = pd.DataFrame(index=data.index)
Y['0'] = np.log10(data.loc[:, 'Weighted Price'])
X = X.as_matrix()
Y = Y.as_matrix()
X_T = X.transpose()
theta = np.linalg.inv(X_T.dot(X)).dot(X_T).dot(Y)
regress = X.dot(theta)

plt.gca().xaxis.set_major_formatter(mdates.DateFormatter('%Y-%m-%d'))
plt.gca().xaxis.set_major_locator(mdates.AutoDateLocator())
plt.plot(data['Date'], data['Weighted Price'])
plt.plot(data['Date'], 10 ** regress)
plt.axes().set_yscale('log')
plt.xlabel('Date')
plt.ylabel('Price (USD)')
plt.grid()
plt.show()
submitted by grovulent to BitcoinMarkets [link] [comments]

Historical Data graph bitcoin (10 Sec)

Hi, I am looking for a data sheet showing last trade prices in USD. Preferably with 10-second intervals, and from the beginning of bitcoin until now. And to top it off, in raw data ;). Is there anyone that could help me out or guide me in the right direction? Thanks!
submitted by RecLeKon to BitcoinMarkets [link] [comments]

Python and cryptocurrency trading !

Bitcoin is now mainstream. While the debate continues over whether it's a speculative bubble or the best thing since the internet, one thing that is undeniable is that it has attracted a great deal of investor interest in cryptocurrency and digital assets.
Those serious about investing in digital assets know that there are many considerations beyond Bitcoin. With Python and some data analysis, investors can arm themselves with the essential steps to build a portfolio optimized for their individual risk profile. Investing in this emerging asset class is never a sure bet, but this method (based on modern portfolio theory) will help you more confidently maximize potential returns while reducing your portfolio's volatility.
The views contained in this post are my own and do not represent investment advice, the views of my employer, or those of anyone else. This content is intended to be used, and should be used, for informational purposes only. It is important to do your own research before making any investment based on your personal circumstances. You should take independent financial advice from a professional, or independently research and verify, any information that you find in this post and wish to rely upon, whether to make an investment decision or otherwise.
This article is intended for those comfortable using Python for scripting and data analysis. If you are looking to learn Python, I suggest checking out the tutorials on Codecademy and the book Python for Data Analysis by Wes McKinney.
Stage 1 — Get the historical data
Good data is vital! Without data, this exercise would not be possible. There are numerous places where you can find historical crypto data, depending on what you plan to hold. Some examples include Coindesk, Quandl, Coinmetrics.io, Coinmarketcap, and a good number of the exchanges.
The example in this article uses historical data such as that found on the Bitfinex or Poloniex exchanges. Using the public API commands they document, you can download historical data in 4-hour blocks with High, Low, Open, Close, Volume and Weighted Price for the many pairs traded on each site. The emphasis here is on optimizing the portfolio's returns against the US Dollar. With that in mind, this uses data for pairs that trade directly against USD or USD Tether (Tether has a complicated definition that I won't attempt to dive into here).
One way to accomplish this is to use the Python library urllib or requests to call the URL and extract the data for the desired currencies. Save your data extracts locally to avoid needing to fetch them again, ideally as CSV.
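A minimal sketch of this download step. The endpoint pattern, parameter names, and `save_extract` helper below are illustrative assumptions, not the exact API of any exchange; consult the exchange's own API documentation for the real URL and fields:

```python
import pandas as pd

# Hypothetical endpoint pattern; not a real exchange URL.
BASE_URL = 'https://api.example-exchange.com/public?command=returnChartData&currencyPair={pair}&period={seconds}'

def chart_data_url(pair, period_seconds=14400):
    # 14400 seconds corresponds to the 4-hour candles mentioned above
    return BASE_URL.format(pair=pair, seconds=period_seconds)

def save_extract(df, pair):
    # cache each extract locally as CSV so it doesn't have to be fetched again
    df.to_csv('{}.csv'.format(pair.replace('/', '-')), index=False)

print(chart_data_url('USDT_BTC'))
```

In practice you would pass `chart_data_url(...)` to `requests.get`, build a dataframe from the JSON response, and call `save_extract` on it.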
Some lovely HLOCV data
Stage 2 — Wrangle that data
The next step is to consolidate your raw data extracts into a usable format. Pandas is a great way to organize your data. A straightforward way to join the data extracts is by looping through the directory where you store them and appending them all to a central dataframe. See the illustrative snippet below.
Example of a loop to join raw extracts
Use this point as an opportunity to clean your data. This means sorting by timestamp and checking for NaN or null values to fill or remove. You will also want to remove unnecessary data and change the format so that it can easily be split into arrays based on timestamp, currency and price.
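A sketch of such a loop, under the assumption that each cached extract is a CSV named after its pair; the `date` column name and the fill/drop cleaning steps are illustrative and depend on your actual extracts:

```python
import os
import pandas as pd

def load_extracts(directory):
    # Loop through the cached CSV extracts and append them all to one central dataframe.
    frames = []
    for name in sorted(os.listdir(directory)):
        if name.endswith('.csv'):
            df = pd.read_csv(os.path.join(directory, name))
            df['pair'] = name[:-4]           # tag each row with its currency pair
            frames.append(df)
    combined = pd.concat(frames, ignore_index=True)
    combined = combined.sort_values('date')  # order by timestamp
    combined = combined.ffill().dropna()     # fill, then drop, NaN/null values
    return combined
```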
Stage 3 — Optimize!
Now for the fun part — running a Monte Carlo style simulation to balance the ideal portfolio weightings! The idea here is to use random number generation to simulate more than 30,000 portfolios with various holding configurations. This example uses Bitcoin, Ethereum, Ethereum Classic, Litecoin, Stellar, Monero and ZCash. The period used was January 1 to November 1, 2017.
Example of the optimization process
The goal is to find the portfolio configuration with the highest Sharpe ratio, a measure of a portfolio's return relative to its volatility.
Plotting the results using matplotlib should give you something like this:
Results of an optimization simulation, with a red star on the simulation with the highest Sharpe ratio and a green star on the simulation with the lowest volatility. The color bar is relative Sharpe ratio.
The results from this simulation can inform your next buy or sell decision. Looking at a visualization of the output, the simulation with the highest Sharpe ratio is highlighted with a red star and the simulation with the lowest volatility is highlighted with a green star. For those interested, the simulation with the best historical Sharpe ratio had the following weights:
ETH 48.0210%
LTC 37.8341%
ETC 6.9683%
STR 4.2331%
BTC 1.4463%
ZEC 0.8631%
XMR 0.6341%
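The random-weight simulation described in Stage 3 can be sketched as follows. Synthetic returns stand in for the real price history, the asset count and simulation count are reduced for brevity, and the risk-free rate is assumed to be zero:

```python
import numpy as np

rng = np.random.default_rng(42)
n_assets, n_days, n_sims = 4, 300, 5000

# Synthetic daily returns standing in for real historical data.
returns = rng.normal(0.001, 0.02, size=(n_days, n_assets))
mean_ret = returns.mean(axis=0)
cov = np.cov(returns, rowvar=False)

best_sharpe, best_weights = -np.inf, None
for _ in range(n_sims):
    w = rng.random(n_assets)
    w /= w.sum()                     # random weights summing to 1
    port_ret = w @ mean_ret
    port_vol = np.sqrt(w @ cov @ w)
    sharpe = port_ret / port_vol     # risk-free rate assumed 0
    if sharpe > best_sharpe:
        best_sharpe, best_weights = sharpe, w

print(best_weights.round(4), round(best_sharpe, 4))
```

With real per-coin returns in `returns`, the winning `best_weights` plays the role of the ETH/LTC/ETC/... allocation listed above.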
These results don't reflect the current market! The model shown was limited to data up to late October and would therefore likely look quite different if computed today, given the recent BTC rally.
How might you use this in practice? Rebalance regularly!
Going forward, this method can be used on a regular basis to rebalance your portfolio. While backtesting this approach, it is important to consider how often you should rebalance. More frequent rebalancing may lead to a more finely tuned approach, but it also means higher trading fees. Also consider how much of the historical data is still relevant to current trends. For example, Bitcoin data from 2013 may not be that relevant in the current market environment.
Thanks for taking the time to read this article. I hope to post more in the future about using Python to improve the investing process.
Happy trading!
submitted by GrayFox6 to CryptoCurrency [link] [comments]

Another way to look at the bubbles

This is a follow up to a post I made a couple of days ago looking at the log-linear regression of bitcoin prices:
This post is going to look at this regression again, but this time using it to make comparisons between Bitcoin's bubbles.
One way to compare Bitcoin's bubble periods is by categorizing periods on the basis of how far away they are from the regression line. The set of these distance measures is known as the 'residuals'. What I've done in this chart is colour code each price depending on where on a set of ranges its residual value falls.
How did I select the ranges I did? Semi-arbitrarily (if anyone knows of a less-arbitrary way to do this let me know). The residuals seem to cluster to some degree around certain ranges when you look at them in a histogram. So that's why I chose the ranges that I did.
What I think a chart like this is representing is how far away bitcoin is from its exponential growth trend. So a bubble is interpreted relative to that growth trend and not its absolute growth in value. This gives different results from when you just look at absolute price growth.
For example - if you compare the 2012/13 bubble with the 2013/14 bubble on the basis of the absolute amount of price growth, then the former is the larger bubble, with about 13x price growth compared to about 8x. But from the perspective of the price relative to the exponential growth trend, the latter is clearly larger. How is this possible? You can see it easily from the chart. The 2012/13 bubble came off a much larger slump relative to the trend (a blue zone), whereas the 2013/14 bubble had less ground to make up. It started in a green zone. Furthermore, while the 2012/13 bubble makes it into the orange zone, it only stays there briefly. It's really mostly a yellow-level event. The 2013/14 bubble is definitively an orange-level event and even gets into the red zone for a day.
The 2011 bubble is less interesting from this perspective since it is big in all respects - both price growth and its residuals. Notice also how some quite large price increases (3x in jun-aug 2012) don't even count as bubbles on this analysis even though they have the same triangle shape as the others.
In terms of the current downtrend it provides an extra rationalisation as to why things aren't that grim. We are currently in a green period - which is the most common colour by far. Green is not a bad place to be if you care about whether or not Bitcoin is maintaining its long term trend. Blue is where things start to get concerning. If that chart ever registers a significant period of purple - that's when I'm going to start to freak out for bitcoin's long term future.
As always - take this stuff with large pinches of salt. If you look at this stuff yourself, I recommend trying to come up with alternative (principled) ways of choosing your colour regions. You can also experiment with using different time frames. If you start your regression at the beginning of the 2012/13 bubble then it becomes much larger than the 2013/14 bubble. I personally don't think this is appropriate if you are looking at the long term exponential growth - but you gotta bear it in mind.
One other thing - this data swaps from the mtgox to bitstamp data. I feel mtgox data is generally to be avoided, because of the way it skewed the market - but it's all we have from the early periods.
Here is the code for those who want to play. About to hit the town for some booze n ladies... so won't respond to comments questions (if any) until tomorrow. :)
import json, requests
import pandas as pd
import numpy as np
import datetime as dt
import operator
import matplotlib.pyplot as plt
import matplotlib.dates as mdates

def get_data(api_name):
    ''' pulls data from the quandl api
        api_name either 'BITSTAMPUSD' or 'MTGOXUSD' '''
    url = 'http://www.quandl.com/api/v1/datasets/BITCOIN/{0}.json'.format(api_name)
    r = requests.get(url)
    raw_data = r.json()
    # put the data in a Pandas DataFrame object
    data = pd.DataFrame(raw_data['data'], columns=raw_data['column_names'])
    # change the order so the earliest records are first
    data = data.reindex(index=data.index[::-1])
    # reset the index order increasing from 0
    data['i'] = range(0, len(data))
    data = data.set_index('i')
    return data

bitstamp_data = get_data('BITSTAMPUSD')
mtgox_data = get_data('MTGOXUSD')

# change the dates to python datetime objects
bitstamp_data['Date'] = pd.to_datetime(bitstamp_data['Date'])
mtgox_data['Date'] = pd.to_datetime(mtgox_data['Date'])

# select the mt gox data prior to bitstamp
pre_bitstamp = mtgox_data[(mtgox_data['Date'] < bitstamp_data['Date'][0])]

# whack the mtgox and bitstamp data together
data = pre_bitstamp.append(bitstamp_data)
data['i'] = range(0, len(data))
data = data.set_index('i')

# Data has some bad values. Replace them with previous days data
# First replace the bad values with a NaN value
data.loc[(data['Open'] == 1.7e+308), 'Open':] = np.nan
# now replace the NaN values with the previous days values
data = data.fillna(method='pad')

# create X and Y for the regression
X = pd.DataFrame(index=data.index)
X['0'] = 1
X['1'] = data.index

# convert the price data to a log10 scale
Y = pd.DataFrame(index=data.index)
Y['0'] = np.log10(data.loc[:, 'Weighted Price'])

# convert to numpy matrices for use in the normal equation
X = X.as_matrix()
Y = Y.as_matrix()

# Normal equation (works much better than gradient descent in this case)
X_T = X.transpose()
theta = np.linalg.inv(X_T.dot(X)).dot(X_T).dot(Y)

# use theta to plot the regression
regress = X.dot(theta)

# create a histogram of the residuals (difference between actual and predicted values)
diff = Y - regress
bins = np.linspace(diff.min() - 0.2, diff.max() + 0.2, 80)
plt.hist(diff, bins)
plt.show()

# select groups of residuals for colouring
# ranges are selected on the (semi-arbitrary) basis of how they appear to be grouped in the
# histogram above.
purple = ((diff > -1.0) & (diff < -0.65)).flatten()
blue = ((diff > -0.65) & (diff < -0.31)).flatten()
green = ((diff > -0.31) & (diff < 0.0)).flatten()
yellow = ((diff > 0.0) & (diff < 0.27)).flatten()
orange = ((diff > 0.27) & (diff < 0.6)).flatten()
red = ((diff > 0.6) & (diff < 1)).flatten()
pink = (diff > 1.0).flatten()

# Create the chart
# Let matplotlib do the work of selecting date ticks
plt.gca().xaxis.set_major_formatter(mdates.DateFormatter('%Y-%m-%d'))
plt.gca().xaxis.set_major_locator(mdates.AutoDateLocator())
# plotting the raw price data and then log scaling the chart
plt.plot(data['Date'], data['Weighted Price'])
# regression data is already in log scale, so scale it back up.
plt.plot(data['Date'], 10 ** regress)
# apply the log scale to the chart
plt.axes().set_yscale('log')
plt.xlabel('Date')
plt.ylabel('Price (USD)')
plt.grid()

# Plot the residuals
plt.scatter(data['Date'][purple], data['Weighted Price'][purple], marker='x', c='#2E0854', label="-1.0 < res < -0.65")
plt.scatter(data['Date'][blue], data['Weighted Price'][blue], marker='x', c='b', label="-0.65 < res < -0.31")
plt.scatter(data['Date'][green], data['Weighted Price'][green], marker='x', c='g', label="-0.31 < res < 0.0")
plt.scatter(data['Date'][yellow], data['Weighted Price'][yellow], marker='x', c='y', label="0.0 < res < 0.27")
plt.scatter(data['Date'][orange], data['Weighted Price'][orange], marker='x', c='#FF6600', label="0.27 < res < 0.6")
plt.scatter(data['Date'][red], data['Weighted Price'][red], marker='x', c='r', label="0.6 < res < 1.0")
plt.scatter(data['Date'][pink], data['Weighted Price'][pink], marker='x', c='#ff69b4', label="1.0 < res")

# add the legend
handles, labels = plt.axes().get_legend_handles_labels()
plt.legend(handles, labels, loc=2)
plt.show()
submitted by grovulent to BitcoinMarkets [link] [comments]

Thoughts on Bitcoin Cash and why it actually might succeed.

I honestly thought Bitcoin Cash (BCC) was a completely stupid idea, since it completely disregards the "hash rate is king" rule. The reason UASF never had a chance to succeed was exactly this.
BCC is a little different though, they aren't trying to take over the Bitcoin main chain, which UASF and SegWit goal is to do. They are literally forking off to a cloned separate alt-coin, changing a few rules to make on-chain scaling priority, and letting the users ultimately decide which chain to use (by exchanging the corresponding tokens and mining on the chain).
So, under normal circumstances over the past 8 years, this would be a horrible idea, as there are thousands of alt-coins and clone coins and ICO's everyday. "Bitcoin" is king on the block so to speak, and other coins are unimportant. The BCC alt-coin would literally be dead in the water, or if anything support a small market cap and low volume and be insignificant forever.
However, things changed in February 2016. Bitcoin began hitting the 1MB block size cap, and transactions went from fast and cheap to slow and expensive. Bitcoin Core and Blockstream, the maintainers of Bitcoin in recent years, had failed to enact a scaling plan, for two reasons (take your pick, or probably both): 1) they failed to see a problem with full blocks and high tx fees/slow confirmations; 2) they wanted to force through their "fix", which bundled in massive new changes to how Bitcoin functions, despite it being very unpopular and barely earning over 30% of hash rate support.
Blockstream and Bitcoin Core essentially thought Bitcoin's success was guaranteed, and that they could do almost anything to it without negative consequence. Then came reality, which as we all know has in the past year resulted in massive market share loss for Bitcoin compared to other coins and "ICO's."
So, rewinding a bit: at pretty much any other time in Bitcoin's history, BCC would have been another doomed altcoin, lucky to even see a fraction of a percent of market share. However, 2016/2017 is a new world of ICO's and altcoins, ushered in by the failure of Bitcoin's management. So, this is exactly why BCC might actually be very successful: the market and community are open to ICO's and alts, and Blockstream/Core have essentially paved the way for this to be a success.
1) People and the market are familiar with BCC (it's basically Bitcoin with bigger blocks). 2) There is no "scammy" premine as a way for people to get rich quick. 3) Actual Bitcoin holders and users are given a choice of which way to vote on the future direction of Bitcoin.
So, BCC may be successful, or may fail horribly. However, at no other point would there have been a better time for it to actually be possible, and that opportunity was definitely given by the failure of the Bitcoin Core/Blockstream team.
submitted by squarepush3r to btc [link] [comments]

AttributeError: 'datetime.datetime' object has no attribute 'timestamp'

Please help - I keep receiving the following traceback error. Currently running Python 2.
I'm attempting to use Python's Plotly library to display an infographic illustrating bitcoin prices. I've tried importing datetime at the top of my code, but this doesn't appear to solve the problem.
Traceback (most recent call last): File "project_one.py", line 165, in crypto_price_df = get_crypto_data(coinpair) File "project_one.py", line 155, in get_crypto_data json_url = base_polo_url.format(poloniex_pair, start_date.timestamp(), end_date.timestamp(), pediod) AttributeError: 'datetime.datetime' object has no attribute 'timestamp'
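The cause is that `datetime.datetime.timestamp()` was only added in Python 3.3, so it does not exist under Python 2; the POSIX timestamp has to be computed by hand. A minimal sketch of a replacement (the helper name `to_timestamp` is my own; it treats naive datetimes as UTC, which is fine for the Poloniex API):

```python
import calendar
import datetime

def to_timestamp(dt):
    """POSIX timestamp for a naive UTC datetime; works on Python 2 and 3."""
    return calendar.timegm(dt.utctimetuple()) + dt.microsecond / 1e6

start_date = datetime.datetime.strptime('2015-01-01', '%Y-%m-%d')
print(to_timestamp(start_date))  # 1420070400.0
```

In the script below, replacing `start_date.timestamp()` and `end_date.timestamp()` with `to_timestamp(start_date)` and `to_timestamp(end_date)` avoids the AttributeError without changing anything else.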
import numpy as np
import pandas as pd
from pandas import Series, DataFrame, Panel
import matplotlib.pyplot as plt
plt.style.use('fivethirtyeight')
import seaborn as sns
import sklearn as sk
import scipy as sp
import os
import pickle
import quandl
import datetime
import plotly.plotly as py
import plotly.graph_objs as go
import plotly.figure_factory as ff
from plotly import tools
from plotly.offline import iplot, init_notebook_mode
from IPython.display import display, HTML

init_notebook_mode(connected=True)

# Download a Quandl dataset, caching it locally as a pickle.
def get_quandl_data(quandl_id):
    cache_path = '{}.pkl'.format(quandl_id).replace('/', '-')
    try:
        f = open(cache_path, 'rb')
        df = pickle.load(f)
        print('Loaded {} from cache'.format(quandl_id))
    except (OSError, IOError) as e:
        print('Downloading {} from Quandl'.format(quandl_id))
        df = quandl.get(quandl_id, returns="pandas")
        df.to_pickle(cache_path)
        print('Cached {} at {}'.format(quandl_id, cache_path))
    return df

btc_usd_price_kraken = get_quandl_data('BCHARTS/KRAKENUSD')

# Pull BTC/USD prices from several exchanges.
exchanges = ['COINBASE', 'BITSTAMP', 'ITBIT']
exchange_data = {}
exchange_data['KRAKEN'] = btc_usd_price_kraken
for exchange in exchanges:
    exchange_code = 'BCHARTS/{}USD'.format(exchange)
    btc_exchange_df = get_quandl_data(exchange_code)
    exchange_data[exchange] = btc_exchange_df

# Merge a single column from each dataframe into one dataframe.
def merge_dfs_on_column(dataframes, labels, col):
    series_dict = {}
    for index in range(len(dataframes)):
        series_dict[labels[index]] = dataframes[index][col]
    return pd.DataFrame(series_dict)

btc_usd_datasets = merge_dfs_on_column(list(exchange_data.values()),
                                       list(exchange_data.keys()),
                                       'Weighted Price')

# Plot each column of a dataframe as a Plotly scatter trace.
def df_scatter(df, title, seperate_y_axis=False, y_axis_label='',
               scale='linear', initial_hide=False):
    label_arr = list(df)
    series_arr = list(map(lambda col: df[col], label_arr))
    layout = go.Layout(
        title=title,
        legend=dict(orientation="h"),
        xaxis=dict(type='date'),
        yaxis=dict(
            title=y_axis_label,
            showticklabels=not seperate_y_axis,
            type=scale
        )
    )
    y_axis_config = dict(
        overlaying='y',
        showticklabels=False,
        type=scale
    )
    visibility = 'visible'
    if initial_hide:
        visibility = 'legendonly'
    trace_arr = []
    for index, series in enumerate(series_arr):
        trace = go.Scatter(
            x=series.index,
            y=series,
            name=label_arr[index],
            visible=visibility
        )
        if seperate_y_axis:
            trace['yaxis'] = 'y{}'.format(index + 1)
            layout['yaxis{}'.format(index + 1)] = y_axis_config
        trace_arr.append(trace)
    fig = go.Figure(data=trace_arr, layout=layout)
    py.plot(fig)

df_scatter(btc_usd_datasets, 'Bitcoin Price (USD) By Exchange')

# Remove zero prices, re-plot, then compute the average price series.
btc_usd_datasets.replace(0, np.nan, inplace=True)
df_scatter(btc_usd_datasets, 'Bitcoin Price (USD) By Exchange')
btc_usd_datasets['avg_btc_price_usd'] = btc_usd_datasets.mean(axis=1)
btc_trace = go.Scatter(x=btc_usd_datasets.index,
                       y=btc_usd_datasets['avg_btc_price_usd'])
py.plot([btc_trace])

# Download JSON data, caching it locally as a pickle.
def get_json_data(json_url, cache_path):
    try:
        f = open(cache_path, 'rb')
        df = pickle.load(f)
        print('Loaded {} from cache'.format(json_url))
    except (OSError, IOError) as e:
        print('Downloading {}'.format(json_url))
        df = pd.read_json(json_url)
        df.to_pickle(cache_path)
        print('Cached {} at {}'.format(json_url, cache_path))
    return df

base_polo_url = ('https://poloniex.com/public?command=returnChartData'
                 '&currencyPair={}&start={}&end={}&period={}')
start_date = datetime.datetime.strptime('2015-01-01', '%Y-%m-%d')
end_date = datetime.datetime.now()
pediod = 86400  # pull daily data (86,400 seconds per day)

# Fetch one currency pair's chart data from Poloniex.
def get_crypto_data(poloniex_pair):
    json_url = base_polo_url.format(poloniex_pair, start_date.timestamp(),
                                    end_date.timestamp(), pediod)
    data_df = get_json_data(json_url, poloniex_pair)
    data_df = data_df.set_index('date')
    return data_df

altcoins = ['ETH', 'LTC', 'XRP', 'ETC', 'STR', 'DASH', 'SC', 'XMR', 'XEM']
altcoin_data = {}
for altcoin in altcoins:
    coinpair = 'BTC_{}'.format(altcoin)
    crypto_price_df = get_crypto_data(coinpair)
    altcoin_data[altcoin] = crypto_price_df
submitted by bullybear17 to learnpython [link] [comments]

Cost of flooding the network with transactions to demonstrate necessity of increasing block size (back of the envelope)

There's a lot of discussion nowadays on the max block size limit of 1 MB.
If you look at the average total transaction fees per day (http://www.quandl.com/BCHAIN/TRFEE-Bitcoin-Total-Transaction-Fees), they come to around 12 BTC.
The average block size nowadays is about 1/5 of the 1 MB maximum: http://www.quandl.com/BCHAIN/AVBLS-Bitcoin-Average-Block-Size
We talk a lot about the prohibitively high cost of attacking the network in terms of mining hardware and processing power, but it seems to me that with the current max block size of 1 MB, a much cheaper way of wreaking havoc would be to flood the network with transactions and crowd out most of the normal payment traffic.
The transaction-fee cost of sending five times the average transaction volume would be 60 BTC/day. Doubling your transaction fee in order to outcompete most of the other traffic brings that to 120 BTC for a day.
Most clients have transaction fees hardcoded and would not be able to respond quickly. 120 BTC is not the kind of money an individual would want to spend to prove a point, but it is within reach of any organization that wants to demonstrate that Bitcoin cannot scale.
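The arithmetic above, spelled out (all figures are the post's rough estimates, not live data):

```python
# Back-of-the-envelope flooding cost, using the figures cited above.
avg_daily_fees_btc = 12.0  # average total tx fees per day (BCHAIN/TRFEE)
fill_factor = 5            # blocks are ~1/5 full, so 5x volume saturates them
fee_multiplier = 2         # pay double the going fee to outbid normal traffic

flood_cost_per_day = avg_daily_fees_btc * fill_factor      # 60 BTC/day
outbid_cost_per_day = flood_cost_per_day * fee_multiplier  # 120 BTC/day
print(flood_cost_per_day, outbid_cost_per_day)  # 60.0 120.0
```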
submitted by cryptopascal to Bitcoin [link] [comments]

Knowing When to Buy Bitcoin | Bitcoin Analysis: This Site Tells You When to Buy Bitcoin | Quandl and Excel | How to Invest in Bitcoin | When Do Whales Buy Bitcoin?

Our research arm publishes a variety of reports and reference papers, including weekly and monthly Bitcoin & Blockchain market, industry and technology reports.
More data feeds from Brave New Coin: the BNC Liquid Index. Updated daily, this database contains the first true historical price for bitcoin, built specifically for institutional use as derived by the robust BNC Bitcoin ...
Search Quandl's full set of data products by sector, country, data type, vendor, and keyword. On Quandl you'll find financial, economic, demographic, psychographic, and geographic datasets from reputable sources.
Using the Quandl API for Bitcoin Data: this document is a comprehensive guide to using the Quandl API to access our free bitcoin data. If you haven't already done so, we recommend reading Quandl's general API documentation; the functionality will be a lot clearer if you do so. Quandl offers a free, unlimited API for Bitcoin data, with exchange rates for 30+ currencies from a variety ...
Using Quandl Bitcoin Data to Build a Time Series Forecast in Python (Ronnie Fecso, Dec 3, 2017). Anyone interested in trading Bitcoin obviously would love to know future ...
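As a sketch of what a call against this API looks like (the v3 dataset URL format below is my assumption based on Quandl's public documentation, and `BCHARTS/KRAKENUSD` is one of the free bitcoin series):

```python
# Build the request URL for a free Quandl bitcoin dataset.
BASE = 'https://www.quandl.com/api/v3/datasets/{}.json'

def quandl_url(dataset_code, **params):
    """URL for a Quandl dataset such as 'BCHARTS/KRAKENUSD'."""
    url = BASE.format(dataset_code)
    if params:
        url += '?' + '&'.join('{}={}'.format(k, v)
                              for k, v in sorted(params.items()))
    return url

print(quandl_url('BCHARTS/KRAKENUSD', start_date='2015-01-01'))
# https://www.quandl.com/api/v3/datasets/BCHARTS/KRAKENUSD.json?start_date=2015-01-01
```

The resulting URL can be fetched with `requests.get` or `pandas.read_json`; adding an `api_key` parameter raises the free-tier rate limits.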


Knowing When to Buy Bitcoin

How to invest in bitcoin? Why this video: bitcoin, or BTC, is a digital currency, a peer-to-peer payment system created by a software developer known ... Quandl Data - Drawing Tool Updates: In this video, we share some updates from the past few weeks from TradingView. We discuss... Bitcoin - the outcome is near! Plus Ethereum & Cardano pump! Investing/trading in cryptocurrency. This video demonstrates how to connect ArthaChitra with Quandl. Quandl provides both free and premium EOD data from exchanges across the globe, including NSE, BSE, MCX etc. Bitcoin 2019: this signal says when to buy and when to sell. Bitcoin Trading for Beginners (A Guide in Plain English) ...