Bitcoin (BTC), Tron (TRX) and NEO (NEO) Price Analysis ...

Why the increased buying at Grayscale does not translate into higher BTC prices. An opinion piece.

BREAKING: Grayscale Bitcoin Trust just bought another 134,000 BTC in the last hour!!!!! Why is the price NOT going up????????
The above headline is what everyone else is thinking. I mean come on - with all of the buying that Grayscale is doing....why isn’t BTC 20,000 bucks by now?
The reason is simple: although Grayscale did add X amount of Bitcoin to their Bitcoin fund, it does not add NET NEW BUYERS of BITCOIN as assumed.
Let me elaborate.
The Grayscale Bitcoin Trust (GBTC) allows investors to fund their shares of GBTC in cash....AND BITCOIN! What that means is people can give GBTC their BTC and receive common shares of GBTC in return. GBTC's common shares trade on the OTCQX over-the-counter market in the US....at a premium.
What it means to trade at a premium: the price per common share of GBTC is higher than the value of the BTC actually held in the fund per share.
Premium Illustration with fake numbers:
The entire GBTC fund has 100 bitcoin at $1,000 per Bitcoin meaning the entire fund should be worth 100 BTC* 1000 = $100,000!
As there are 100 bitcoin in the entire fund and bitcoin is trading at a thousand bucks - obviously the whole fund is worth $100k.
The common equity of GBTC would be trading at a premium when the equity value is higher than $100k. For simplicity let’s say the total market cap (total value of public equity) is $110k - this indicates that the equity value of GBTC is trading at a 10% premium to market because the fund is clearly ONLY worth $100k yet the equity value is trading at $110k - the premium portion would be $10k.
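To make the arithmetic concrete, here is a minimal R sketch using the fake numbers from the illustration above (these are the example's figures, not real GBTC data):

# Illustrative GBTC premium calculation (fake numbers from the example above)
btc_in_fund <- 100                        # BTC held by the trust
btc_price   <- 1000                       # USD per BTC
nav         <- btc_in_fund * btc_price    # net asset value = $100,000
market_cap  <- 110000                     # total value of the listed common shares

premium_usd <- market_cap - nav              # $10,000
premium_pct <- (market_cap / nav - 1) * 100  # 10%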
Why does this happen: it is easier for investors to buy shares in GBTC with the click of a button in their brokerage account than it is to buy BTC the old fashioned way on an exchange like Gemini.
Now that we understand what a premium is and how it relates to GBTC: how does this relate to the bitcoin price not going up? It means that original holders of Bitcoin can arbitrage the GBTC premium quite easily by funding their GBTC accounts IN BTC and then selling their common equity on the open market at the premium. The extra value gained is the premium. As the investor funded their account in BTC - no new Bitcoin buying actually occurred; someone GAVE Grayscale BTC, got equity back, then sold it for cash on the open market. People are doing this because the GBTC premium is so high that it’s way more lucrative selling through GBTC than going on Gemini or some other exchange.
Savvy traders can lucratively exploit the GBTC premium, but it’s important to note that new private investors in GBTC are subject to a 6-month hold on their shares. They’d be illiquid for a while. This allows savvy BTC investors to trade the GBTC premium at most twice per year.
Sample series of Trades:
Step 1: I have 1 Bitcoin. Step 2: I choose to give it to GBTC and get shares back. Step 3: I give GBTC my BTC and they give me shares in the fund, which I cannot sell for 6 months. Step 4: 6 months later my shares trade at a 15% premium. Step 5: I sell my GBTC common shares for cash. Step 6: I take the cash.
The above analysis ignores GBTC investment fund fees and trading fees.
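A rough sketch of that series of trades in R. The price and premium figures are assumptions for illustration (and, as the text notes, fees are ignored and the BTC price is assumed unchanged over the 6-month lockup):

# Sketch of the BTC-in / shares-out arbitrage (illustrative numbers only; fees ignored)
btc_contributed <- 1        # Steps 1-3: hand 1 BTC to the trust, receive locked shares
btc_price       <- 10000    # assumed USD price of BTC when the lockup ends
premium         <- 0.15     # Step 4: assumed premium on the shares after 6 months

cash_from_shares <- btc_contributed * btc_price * (1 + premium)  # Step 5: sell the shares
cash_from_spot   <- btc_contributed * btc_price                  # selling the BTC directly instead
cash_from_shares - cash_from_spot   # extra cash captured = the premium, ~$1,500 here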
See what I am saying people? Bitcoin traders can exploit the premium in the market for GBTC’s common shares by funding their accounts in BTC and selling the premium. People contributing BTC to GBTC is NOT the same as GBTC raising cash and buying BTC - effectively taking circulating BTC out of the market into a permanent HODL.
submitted by Aggravating_Boss_515 to Bitcoin [link] [comments]

A Physicist's Bitcoin Trading Strategy. No leverage, no going short, just spot trading. Total cumulative outperformance 2011-2020: 13,000,000%.

https://www.tradingview.com/script/4J5psNDo-A-Physicist-s-Bitcoin-Trading-Strategy/
3. Backtest Results
Backtest results demonstrate significant outperformance over buy-and-hold. The default parameters of the strategy/indicator have been set by the author to achieve maximum (or, close to maximum) outperformance on backtests executed on the BTCUSD (Bitcoin) chart. However, significant outperformance over buy-and-hold is still easily achievable using non-default parameters. Basically, as long as the parameters are set to adequately capture the full character of the market, significant outperformance on backtests is achievable and is quite easy. In fact, after some experimentation, it seems as if underperformance is hardly achievable and requires deliberately setting the parameters illogically (e.g. setting one parameter of the slow indicator faster than the fast indicator). In the interest of providing a quality product to the user, suggestions and guidelines for parameter settings are provided in section (6). Finally, some metrics of the strategy's outperformance on the BTCUSD chart are listed below, both for the default (optimal) parameters as well as for a random sample of parameter settings that adhere to the guidelines set forth in section (6).
Using the default parameters, relative to buy-and-hold strategy, backtested from August 2011 to August 2020,
Using the default parameters, relative to buy-and-hold strategy, during specific periods,
Using a random sample (n=20) of combinations of parameter settings that adhere to the guidelines outlined in section (6), relative to buy-and-hold strategy, backtested from August 2011 to August 2020,
EDIT (because apparently not everybody bothers to read the strategy's description):
7. General Remarks About the Indicator
Other than some exponential moving averages, no traditional technical indicators or technical analysis tools are employed in this strategy. No MACD, no RSI, no CMF, no Bollinger bands, parabolic SARs, Ichimoku clouds, hoosawatsits, XYZs, ABCs, whatarethese. No tea leaves can be found in this strategy, only mathematics. It is in the nature of the underlying math formula, from which the indicator is produced, to quickly identify trend changes.
8. Remarks About Expectations of Future Results and About Backtesting
8.1. In General As it's been stated in many prospectuses and marketing literature, "past performance is no guarantee of future results." Backtest results are retrospective, and hindsight is 20/20. Therefore, no guarantee can, nor should, be expressed by me or anybody else who is selling a financial product (unless you have a money printer, like the Federal Reserve does).
8.2. Regarding This Strategy No guarantee of future results using this strategy is expressed by the author, not now nor at any time in the future.
With that written, the author is free to express his own expectations and opinions based on his intimate knowledge of how the indicator works, and the author will take that liberty by writing the following: As described in section (7), this trading strategy does not include any traditional technical indicators or TA tools (other than smoothing EMAs). Instead, this strategy is based on a principle that does not change, it employs a complex indicator that is based on a math formula that does not change, and it places trades based on five simple rules that do not change. And, as described in section (2.1), the indicator is designed to capture the full character of the market, from a macro/global scope down to a micro/local scope. Additionally, as described in section (3), outperformance of the market for which this strategy was intended during backtesting does not depend on luckily setting the parameters "just right." In fact, all random combinations of parameter settings that followed the guidelines outperformed the intended market in backtests. Additionally, no parameters are included within the underlying math formula from which the indicator is produced; it is not as if the formula contains a "5" and future outperformance would depend on that "5" being a "6" instead. And, again as described, it is in the nature of the formula to quickly identify trend changes. Therefore, it is the opinion of the author that the outperformance of this strategy in backtesting is directly attributable to the fundamental nature of the math formula from which the indicator is produced. As such, it is also the opinion of the author that continued outperformance by using this strategy, applied to the crypto ( Bitcoin ) market, is likely, given that the parameter settings are set reasonably and in accordance with the guidelines. The author does not, however, expect future outperformance of this strategy to match or exceed the outperformance observed in backtests using the default parameters, i.e. it probably won't outperform by anything close to 13,000,000% during the next 9 years.
Additionally, based on the rolling 1-month outperformance data listed in section (3), expectations of short-term outperformance should be kept low; the median 1-month outperformance was -2%, so it's basically a 50/50 chance that any significant outperformance is seen in any given month. The true strength of this strategy is to be out of the market during large, sharp declines and capitalizing on the opportunities presented at the bottom of those declines by buying the dip. Given that such price action does not happen every month, outperformance in the initial months of use is approximately as likely as underperformance.
submitted by anon2414691 to BitcoinMarkets [link] [comments]

Testing the Tide | Monthly FIRE Portfolio Update - June 2020

We would rather be ruined than changed.
-W H Auden, The Age of Anxiety
This is my forty-third portfolio update. I complete this update monthly to check my progress against my goal.
Portfolio goal
My objective is to reach a portfolio of $2 180 000 by 1 July 2021. This would produce a real annual income of about $87 000 (in 2020 dollars).
This portfolio objective is based on an expected average real return of 3.99 per cent, or a nominal return of 6.49 per cent.
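The income figure follows directly from applying the assumed real return to the target portfolio; a quick check in R:

target_portfolio <- 2180000
real_return      <- 0.0399
target_portfolio * real_return   # ≈ $86,982, i.e. about $87,000 in 2020 dollars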
Portfolio summary
Vanguard Lifestrategy High Growth Fund – $726 306
Vanguard Lifestrategy Growth Fund – $42 118
Vanguard Lifestrategy Balanced Fund – $78 730
Vanguard Diversified Bonds Fund – $111 691
Vanguard Australian Shares ETF (VAS) – $201 745
Vanguard International Shares ETF (VGS) – $39 357
Betashares Australia 200 ETF (A200) – $231 269
Telstra shares (TLS) – $1 668
Insurance Australia Group shares (IAG) – $7 310
NIB Holdings shares (NHF) – $5 532
Gold ETF (GOLD.ASX) – $117 757
Secured physical gold – $18 913
Ratesetter (P2P lending) – $10 479
Bitcoin – $148 990
Raiz app (Aggressive portfolio) – $16 841
Spaceship Voyager app (Index portfolio) – $2 553
BrickX (P2P rental real estate) – $4 484
Total portfolio value: $1 765 743 (+$8 485 or 0.5%)
Asset allocation
Australian shares – 42.2% (2.8% under)
Global shares – 22.0%
Emerging markets shares – 2.3%
International small companies – 3.0%
Total international shares – 27.3% (2.7% under)
Total shares – 69.5% (5.5% under)
Total property securities – 0.3% (0.3% over)
Australian bonds – 4.7%
International bonds – 9.4%
Total bonds – 14.0% (1.0% under)
Gold – 7.7%
Bitcoin – 8.4%
Gold and alternatives – 16.2% (6.2% over)
Presented visually, below is a high-level view of the current asset allocation of the portfolio.
[Chart]
Comments
The overall portfolio increased slightly over the month. This has continued to move the portfolio beyond the lows seen in late March.
The modest portfolio growth of $8 000, or 0.5 per cent, maintains its value at around that achieved at the beginning of the year.
[Chart]
The limited growth this month largely reflects an increase in the value of my current equity holdings, in VAS and A200 and the Vanguard retail funds. This has outweighed a small decline in the value of Bitcoin and global shares. The value of the bond holdings also increased modestly, pushing them to their highest value since around early 2017.
[Chart]
There still appears to be an air of unreality around recent asset price increases and the broader economic context. Britain's Bank of England has on some indicators shown that the aftermath of the pandemic and lockdown represent the most challenging financial crisis in around 300 years. What is clear is that investor perceptions and fear around the coronavirus pandemic are a substantial ongoing force driving volatility in equity markets (pdf).
A somewhat optimistic view is provided here that the recovery could look more like the recovery from a natural disaster, rather than a traditional recession. Yet there are few certainties on offer. Negative oil prices, and effective offers by US equity investors to bail out Hertz creditors at no cost appear to be signs of a financial system under significant strains.
As this Reserve Bank article highlights, while some Australian households are well-placed to weather the storm ahead, the timing and severity of what lies ahead is an important unknown that will itself feed into changes in household wealth from here.
Investments this month have been exclusively in the Australian shares exchange-traded fund (VAS) using Selfwealth.* This has been to bring my actual asset allocation more closely in line with the target split between Australian and global shares.
A moving azimuth: falling spending continues
Monthly expenses on the credit card have continued their downward trajectory across the past month.
[Chart]
The rolling average of monthly credit card spending is now at its lowest point over the period of the journey. This is despite the end of lockdown, and a slow resumption of some more normal aspects of spending.
This has continued the brief period, since April, of achieving a notional and contingent kind of financial independence.
The below chart illustrates this temporary state, setting out the degree to which portfolio distributions cover estimated total expenses, measured month to month.
[Chart]
There are two sources of volatility underlying its movement. The first is the level of expenses, which can vary, and the second is the fact that it is based on financial year distributions, which are themselves volatile.
Importantly, the distributions over the last twelve months of this chart are only an estimate - and hence the next few weeks will affect the precision of this analysis across its last 12 observations.
Estimating 2019-20 financial year portfolio distributions
Since the beginning of the journey, this time of year usually has a sense of waiting for events to unfold - in particular, finding out the level of half-year distributions to June.
These represent the bulk of distributions, usually averaging 60-65 per cent of total distributions received. They are an important and tangible signpost of progress on the financial independence journey.
This is no simple task, as distributions have varied in size considerably.
A part of this variation has been the important role of sometimes large and lumpy capital distributions - which have made up between 30 and 48 per cent of total distributions in recent years, and an average of around 15 per cent across the last two decades.
I have experimented with many different approaches, most of which have relied on averaging over multi-year periods to even out the 'peaks and troughs' of how market movements may have affected distributions. The main approaches have been:
Each of these has its particular simplifications, advantages and drawbacks.
Developing new navigation tools
Over the past month I have also developed more fully an alternate 'model' for estimating returns.
This simply derives a median value across a set of historical 'cents per unit' distribution data for June and December payouts for the Vanguard funds and exchange traded funds. These make up over 96 per cent of income producing portfolio assets.
In other words, this model essentially assumes that each Vanguard fund and ETF owned pays out the 'average' level of distributions this half-year, with the average being based on distribution records that typically go back between 5 to 10 years.
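A minimal sketch of that model in R. The fund tickers, unit counts and cents-per-unit histories below are made-up placeholders, not the actual portfolio's data:

# Placeholder holdings: units currently held in each fund/ETF
units_held <- c(VAS = 2500, A200 = 2000, VGS = 500)

# Placeholder history of half-year distributions, in cents per unit
dist_history <- list(
  VAS  = c(66, 72, 81, 59, 70),
  A200 = c(210, 245, 198, 230, 221),
  VGS  = c(55, 61, 48, 67, 52)
)

# Model: each holding pays out its median historical cents-per-unit this half-year
est_cents   <- sapply(dist_history, median)
est_dollars <- sum(est_cents / 100 * units_held[names(est_cents)])
est_dollars   # estimated June distribution, in dollars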
Mapping the distribution estimates
The chart below sets out the estimate produced by each approach for the June distributions that are to come.
[Chart]
Some observations on these findings can be made.
The lowest estimate is the 'adjusted GFC income' observation, which essentially assumes that the income for this period is as low as experienced by the equity and bond portfolio during the Global Financial Crisis. Just due to timing differences of the period observed, this seems to be a 'worst case' lower bound estimate, which I do not currently place significant weight on.
Similarly, at the highest end, the 'average distribution rate' approach simply assumes June distributions deliver a distribution equal to the median that the entire portfolio has delivered since 1999. With higher interest rates, and larger fixed income holdings across much of that time, this seems an objectively unlikely outcome.
Similarly, the delivery of exactly the income suggested by long-term averages measured across decades and even centuries would be a matter of chance, rather than the basis for rational expectations.
Central estimates of the line of position
This leaves the estimates towards the centre of the chart - estimates of between around $28 000 to $43 000 as representing the more likely range.
I attach less weight to the historical three-year average due to the high contribution of distributed capital gains over that period of growth, where, at least across equities, capital losses are likely to be more prevalent.
My preferred central estimate is the model estimate (green), as it is based on historical data directly from the investment vehicles rather than my own evolving portfolio. The data it is based on in some cases goes back to the Global Financial Crisis. This estimate is also quite close to the raw average of all the alternative approaches (red). It sits a little above the 'adjusted income' measure.
None of these estimates, it should be noted, contain any explicit adjustment for the earnings and dividend reductions or delays arising from COVID-19. They may, therefore, represent a modest over-estimate of likely June distributions, to the extent that these effects are more negative than those experienced on average across the period of the underlying data.
These are difficult to estimate, but dividend reductions could easily be in the order of 20-30 per cent, plausibly lowering distributions to the $23 000 to $27 000 range. The recently announced forecast dividend for the Vanguard Australian Shares ETF (VAS) is, for example, the lowest in four years.
As seen from chart above, there is a wide band of estimates, which grow wider still should capital gains be unexpectedly distributed from the Vanguard retail funds. These have represented a source of considerable volatility. Given this, it may seem fruitless to seek to estimate these forthcoming distributions, compared to just waiting for them to arrive.
Yet this exercise helps by setting out reasoning and positions, before hindsight bias urgently arrives to inform me that I knew the right answer all along. It also potentially helps clearly 'reject' some models over time, if the predictions they make prove to be systematically incorrect.
Progress
Progress against the objective, and the additional measures I have reached, is set out below.
Measure Portfolio All Assets
Portfolio objective – $2 180 000 (or $87 000 pa) 81.0% 109.4%
Credit card purchases – $71 000 pa 98.8% 133.5%
Total expenses – $89 000 pa 79.2% 106.9%
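For the 'Portfolio' column, these percentages are essentially the current portfolio value measured against the capital needed to support each spending level at the assumed 3.99 per cent real return; a rough check in R (the credit card figure lands slightly off the quoted value, presumably due to timing or rounding differences in the underlying calculation):

portfolio_value <- 1765743
real_return     <- 0.0399

progress <- function(annual_spend) portfolio_value / (annual_spend / real_return)

progress(87000)   # portfolio objective   -> ~0.81 (81.0%)
progress(71000)   # credit card purchases -> ~0.99
progress(89000)   # total expenses        -> ~0.79 (79.2%)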
Summary
The current coronavirus conditions are affecting all aspects of the journey to financial independence - changing spending habits, leading to volatility in equity markets and sequencing risks, and perhaps dramatically altering the expected pattern of portfolio distributions.
Although history can provide some guidance, there is simply no definitive way to know whether any or all of these changes will be fundamental and permanent alterations, or simply data points on a post-natural disaster path to a different post-pandemic set of conditions. There is the temptation to fit past crises imperfectly into the modern picture, as this Of Dollars and Data post illustrates well.
Taking a longer 100 year view, this piece 'The Allegory of the Hawk and Serpent' is a reminder that our entire set of received truths about constructing a portfolio to survive for the long-term can be a product of a sample size of one - actual past history - and subject to recency bias.
This month has felt like one of quiet routines, muted events compared to the past few months, and waiting to understand more fully the shape of the new. Nonetheless, with each new investment, or week of lower expenditure than implied in my FI target, the nature of the journey is incrementally changing - beneath the surface.
Small milestones are being passed - such as over 40 per cent of my equity holdings being outside of the Vanguard retail funds. Or these retail funds - which once formed over 95 per cent of the portfolio - now making up less than half.
With a significant part of the financial independence journey being about repeated small actions producing outsized results with time, the issue of maintaining good routines while exploring beneficial changes is real.
Adding to the complexity is that embarking on the financial journey itself is likely to change who one is. This idea, of the difficulty or impossibility of knowing the preferences of a future self, is explored in a fascinating way in this Econtalk podcast episode with a philosophical thought experiment about vampires. It poses the question: perhaps we can never know ourselves at the destination? And yet, who would rationally choose ruin over any change?
The post, links and full charts can be seen here.
submitted by thefiexpl to fiaustralia [link] [comments]

Murmurs of the Sea | Monthly Portfolio Update - March 2020

Only the sea, murmurous behind the dingy checkerboard of houses, told of the unrest, the precariousness, of all things in this world.
-Albert Camus, The Plague
This is my fortieth portfolio update. I complete this update monthly to check my progress against my goal.
Portfolio goal
My objective is to reach a portfolio of $2 180 000 by 1 July 2021. This would produce a real annual income of about $87 000 (in 2020 dollars).
This portfolio objective is based on an expected average real return of 3.99 per cent, or a nominal return of 6.49 per cent.
Portfolio summary
Vanguard Lifestrategy High Growth Fund – $662 776
Vanguard Lifestrategy Growth Fund – $39 044
Vanguard Lifestrategy Balanced Fund – $74 099
Vanguard Diversified Bonds Fund – $109 500
Vanguard Australian Shares ETF (VAS) – $150 095
Vanguard International Shares ETF (VGS) – $29 852
Betashares Australia 200 ETF (A200) – $197 149
Telstra shares (TLS) – $1 630
Insurance Australia Group shares (IAG) – $7 855
NIB Holdings shares (NHF) – $6 156
Gold ETF (GOLD.ASX) – $119 254
Secured physical gold – $19 211
Ratesetter (P2P lending) – $13 106
Bitcoin – $115 330
Raiz* app (Aggressive portfolio) – $15 094
Spaceship Voyager* app (Index portfolio) – $2 303
BrickX (P2P rental real estate) – $4 492
Total portfolio value: $1 566 946 (-$236 479 or -13.1%)
Asset allocation
Australian shares – 40.6% (4.4% under)
Global shares – 22.3%
Emerging markets shares – 2.3%
International small companies – 3.0%
Total international shares – 27.6% (2.4% under)
Total shares – 68.3% (6.7% under)
Total property securities – 0.2% (0.2% over)
Australian bonds – 4.8%
International bonds – 10.4%
Total bonds – 15.2% (0.2% over)
Gold – 8.8%
Bitcoin – 7.4%
Gold and alternatives – 16.2% (6.2% over)
Presented visually, below is a high-level view of the current asset allocation of the portfolio.
Comments
This month saw an extremely rapid collapse in market prices for a broad range of assets across the world, driven by the acceleration of the Coronavirus pandemic.
Broad and simultaneous market falls have resulted in the single largest monthly fall in portfolio value to date of around $236 000.
This represents a fall of 13 per cent across the month, and an overall reduction of more than 16 per cent since the portfolio peak of January.
[Chart]
The monthly fall is over three times more severe than any other fall experienced to date on the journey. The sharpest losses have occurred in Australian equities; however, international shares and bonds have also fallen.
A substantial fall in the Australian dollar has provided some buffer to international equity losses - limiting these to around 8 per cent. Bitcoin has also fallen by 23 per cent. In short, in the period of acute market adjustment - as often occurs - the benefits of diversification have been temporarily muted.
[Chart]
The last monthly update reported results of some initial simplified modelling on the impact of a hypothetical large fall in equity markets on the portfolio.
Currently, the actual asset price falls look to register in between the normal 'bear market', and the more extreme 'Global Financial Crisis Mark II' scenarios modelled. Absent, at least for the immediate phase, is a significant diversification offset - outside of a small (4 per cent) increase in the value of gold.
The continued sharp equity market losses have left the portfolio below its target Australian equity weighting, so contributions this month have been made to Vanguard's Australian shares ETF (VAS). This coming month will see quarterly distributions paid for the A200, VGS and VAS exchange traded funds - totalling around $2700 - meaning a further small opportunity to reinvest following sizeable market falls.
Reviewing the evidence on the history of stock market falls
Vladimir Lenin once remarked that there are decades where nothing happens, and then there are weeks in which decades happen. This month has been four such weeks in a row, from initial market responses to the coronavirus pandemic, to unprecedented fiscal and monetary policy responses aimed at lessening the impact.
Given this, it would be foolish to rule out the potential for other extreme steps that governments have undertaken on multiple occasions before. These could include underwriting of banks and other debt liabilities, effective nationalisation or rescues of critical industries or providers, or even temporary closure of some financial or equity markets.
There is a strong appeal for comforting narratives in this highly fluid investment environment, including concepts such as buying while distress selling appears to be occurring, or delaying investing until issues become 'more clear'.
Nobody can guarantee that investments made now will not be made into cruel short-lived bear market rallies, and no formulas exist that will safely and certainly minimise either further losses, or opportunities forgone. Much financial independence focused advice in the early stages of recent market falls focused on investment commonplaces, with a strong flavour of enthusiasm at the potential for 'buying the dip'.
Yet such commonly repeated truths turn out to be imperfect and conditional in practice. One of the most influential studies of a large sample of historical market falls turns out to provide mixed evidence that buying following a fall reliably pays off. This study (pdf) examines 101 stock market declines across four centuries of data, and finds that:
Even these findings should be viewed as simply indicative. Each crisis and economic phase has its unique character, usually only discernible in retrospect. History, in these cases, should inform around the potential outlines of events that can be considered possible. As the saying goes, risk is what remains after you believe you have thought of everything.
Position fixing - alternative perspectives of progress
In challenging times it can help to keep a steady view of progress from a range of perspectives. Extreme market volatility and large falls can be disquieting for both recent investors and those closer to the end of the journey.
One perspective on what has occurred is that the portfolio has effectively been pushed backwards in time. That is, the portfolio now sits at levels it last occupied in April 2019. Even this perspective has some benefit, highlighting that by this metric all that has been lost is the strong forward progress made in a relatively short time.
Yet each perspective can hide and distort key underlying truths.
As an example, while the overall portfolio is currently valued at around the same dollar value as a year ago, it is not the same portfolio. Through new purchases and reinvestments in this period, many more actual securities (mostly units in ETFs) have been purchased.
The chart below sets out the growth in total units held from January 2019 to this month, across the three major exchange trade funds holdings in the portfolio.
[Chart]
From this it can be seen that the number of securities held - effectively, individual claims on the future earnings of the firms in these indexes - has more than doubled over the past fifteen months. Through this perspective, the accumulation of valuable assets shows a far more constant path.
Though this can help illuminate progress, as a measure it also has limitations. The realities of falls in market values cannot be elided by such devices, and some proportion of those market falls represent initial reassessments of the likely course of future earnings, and therefore the fundamental value of each of those ETF units.
With significant uncertainty over the course of global lock-downs, trade and growth, these reassessments may prove accurate, or not. For anyone to discount all of these reassessments as wholly the temporary result of irrational panic is to show a remarkable confidence in one's own analytical capacities.
Similarly, it would be equally wrong to extrapolate from market falls to a permanent constraining of the impulse of humanity to innovate, adjust to changed conditions, seek out opportunities and serve others for profit.
Lines of position - Trends in expenditure
A further longer-term perspective regularly reviewed is monthly expenses compared to average distributions.
Monthly expenditure continues to be below average, and is likely to fall further next month as a natural result of a virus-induced reduction of shopping trips, events and outings.
[Chart]
As occurred last month, as a function of some previous high distributions gradually falling outside of the data 'window' for the rolling three-year comparison of distributions and expenditure, a downward slope in distributions continues.
Progress
Progress against the objective, and the additional measures I have reached, is set out below.
Measure Portfolio All Assets
Portfolio objective – $2 180 000 (or $87 000 pa) 71.9% 97.7%
Credit card purchases – $71 000 pa 87.7% 119.2%
Total expenses – $89 000 pa 70.2% 95.5%
Summary
This month has been one of the most surprising and volatile of the entire journey, with significant daily movements in portfolio value and historic market developments. There has been more to watch and observe than at any time in living memory.
The dominant sensation has been that of travelling backwards through time, and revisiting a stage of the journey already passed. The progress of the last few months has actually been so rapid, that this backwards travel has felt less like a set back, but rather more like a temporary revisitation of days past.
It is unclear how temporary a revisitation current conditions will enforce, or exactly how this will affect the rest of the journey. In early January I estimated that if equity markets fell by 33 per cent through early 2020 with no offsetting gains in other portfolio elements, this could push out the achievement of the target to January 2023.
Even so, experiencing these markets and with more volatility likely, I don't feel there is much value in seeking to rapidly recalculate the path from here, or immediately alter the targeted timeframe. Moving past the portfolio target from here in around a year looks almost impossibly challenging, but time exists to allow this fact to settle. Too many other, more important, human and historical events are still playing out.
In such times, taking diverse perspectives on the same facts is important. This Next Life recently produced this interesting meditation on the future of FIRE during this phase of economic hardship. In addition, the Animal Spirits podcast also provided a thoughtful perspective on current market falls compared to 2008, as does this article by Early Retirement Now. Such analysis, and each passing day, highlights that the murmurs of the sea are louder than ever before, reminding us of the precariousness of all things.
The post, links and full charts can be seen here.
submitted by thefiexpl to fiaustralia [link] [comments]

[FOR HIRE] Search Engine Optimization (SEO) EXPERT • Fast Loading Wordpress Sites Done by Web Designer With 12 Years Experience

Here is what I offer (WEB DESIGN)
Here is what I offer for SEO
Please enter:
Nashville gates
On Google, (or use this link: https://www.google.com/search?q=Nashville+gates), and you will see the site "accessgatesystems.com" at the top spot, first page, perhaps second place. If you know how to look at the code of a web page (Ctrl+U in your web browser), you will find my name (Steve B) in the code of the home page of accessgatesystems.com.
Prices for affordable website design are as follows:
Price for SEO
Portfolio:
submitted by Llolaila to forhire [link] [comments]

[Q] VAR and Impulse Response Analysis correctly applied to Cryptocurrency Trading Rates? I have several questions that came up during the process

Hey everyone,
I want to look deeper into how cryptocurrencies influence each other and set up a VAR Model with R including the daily closing prices of Bitcoin, Ethereum, Ripple and CRIX (Crypto Index).
https://imgur.com/a/DZxfnYM
I use diff() and log() to make them stationary, which is confirmed by ADF and KPSS tests.
https://imgur.com/a/QtjGOLy
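For reference, a minimal sketch of that transformation and the two tests. The price object name (all.prices) is an assumption about the workspace; adf.test() and kpss.test() come from the tseries package:

library(tseries)

# assumed: 'all.prices' is a numeric matrix / mts of daily closing prices (btc, eth, xrp, crix)
all.ts.s <- diff(log(all.prices))   # log returns: first difference of the log prices

# stationarity checks, column by column
apply(all.ts.s, 2, function(x) adf.test(x)$p.value)    # small p-values -> reject a unit root
apply(all.ts.s, 2, function(x) kpss.test(x)$p.value)   # large p-values -> cannot reject stationarity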
Now I need to select the lag p.
library(vars)  # VARselect(), VAR() and irf() used below come from the 'vars' package
lag.p <- VARselect(all.ts.s, lag.max = 100, type = "none", season = NULL, exogen = NULL)
which gives me:
$selection 
AIC(n)  HQ(n)  SC(n) FPE(n) 
     5      3      2      5 
Since SC performs better on large sample sizes, I first go for p=2 and compute the model:
all.ts.s.model <- VAR(all.ts.s, p = 2, type = "none")
This gives me plots like this one for Bitcoin:
https://imgur.com/a/xMem4C7
[Q1] Now here is my first question. How does the VAR function select the lag parameters? Is it based on the PACF residuals or does it just go for t-1 and t-2 if we have p=2?
And the following summary:
VAR Estimation Results: 
========================= 
Endogenous variables: btc, eth, xrp, crix 
Deterministic variables: none 
Sample size: 1743 
Log Likelihood: 12001.507 
Roots of the characteristic polynomial: 
0.4198 0.4198 0.359 0.3513 0.1483 0.1483 0.1307 0.109 
Call: 
VAR(y = all.ts.s, p = 2, type = "none") 

Estimation results for equation btc: 
==================================== 
btc = btc.l1 + eth.l1 + xrp.l1 + crix.l1 + btc.l2 + eth.l2 + xrp.l2 + crix.l2 

          Estimate  Std. Error  t value  Pr(>|t|) 
btc.l1   -0.0170752  0.0316917   -0.539   0.59010 
eth.l1   -0.0242814  0.0179725   -1.351   0.17686 
xrp.l1   -0.0149919  0.0158113   -0.948   0.34317 
crix.l1   0.0689235  0.0349270    1.973   0.04861 * 
btc.l2   -0.0472973  0.0359361   -1.316   0.18830 
eth.l2    0.0008513  0.0158264    0.054   0.95711 
xrp.l2    0.0412428  0.0158348    2.605   0.00928 ** 
crix.l2   0.0150603  0.0280623    0.537   0.59156 
--- 
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1 

Residual standard error: 0.04045 on 1735 degrees of freedom 
Multiple R-Squared: 0.0101, Adjusted R-squared: 0.005539 
F-statistic: 2.213 on 8 and 1735 DF, p-value: 0.02402 

Estimation results for equation eth: 
==================================== 
eth = btc.l1 + eth.l1 + xrp.l1 + crix.l1 + btc.l2 + eth.l2 + xrp.l2 + crix.l2 

          Estimate  Std. Error  t value  Pr(>|t|) 
btc.l1   -0.144872   0.049997    -2.898   0.00381 ** 
eth.l1    0.067479   0.028354     2.380   0.01742 * 
xrp.l1   -0.040615   0.024944    -1.628   0.10365 
crix.l1   0.169604   0.055101     3.078   0.00212 ** 
btc.l2   -0.081417   0.056693    -1.436   0.15116 
eth.l2    0.007989   0.024968     0.320   0.74904 
xrp.l2    0.007146   0.024981     0.286   0.77486 
crix.l2   0.066981   0.044271     1.513   0.13047 
--- 
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1 

Residual standard error: 0.06381 on 1735 degrees of freedom 
Multiple R-Squared: 0.01235, Adjusted R-squared: 0.007797 
F-statistic: 2.712 on 8 and 1735 DF, p-value: 0.005714 

Estimation results for equation xrp: 
==================================== 
xrp = btc.l1 + eth.l1 + xrp.l1 + crix.l1 + btc.l2 + eth.l2 + xrp.l2 + crix.l2 

          Estimate  Std. Error  t value  Pr(>|t|) 
btc.l1   -0.153670   0.052720    -2.915   0.0036 ** 
eth.l1    0.060731   0.029898     2.031   0.0424 * 
xrp.l1   -0.013039   0.026303    -0.496   0.6201 
crix.l1   0.099222   0.058102     1.708   0.0879 . 
btc.l2   -0.081178   0.059781    -1.358   0.1747 
eth.l2    0.004083   0.026328     0.155   0.8768 
xrp.l2    0.116093   0.026342     4.407  1.11e-05 *** 
crix.l2   0.047678   0.046683     1.021   0.3072 
--- 
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1 

Residual standard error: 0.06729 on 1735 degrees of freedom 
Multiple R-Squared: 0.02133, Adjusted R-squared: 0.01682 
F-statistic: 4.728 on 8 and 1735 DF, p-value: 9.388e-06 

Estimation results for equation crix: 
===================================== 
crix = btc.l1 + eth.l1 + xrp.l1 + crix.l1 + btc.l2 + eth.l2 + xrp.l2 + crix.l2 

          Estimate  Std. Error  t value  Pr(>|t|) 
btc.l1    0.63017    0.02426     25.976   < 2e-16 *** 
eth.l1    0.08161    0.01376      5.932  3.60e-09 *** 
xrp.l1    0.06747    0.01210      5.575  2.87e-08 *** 
crix.l1  -0.50161    0.02674    -18.762   < 2e-16 *** 
btc.l2    0.21440    0.02751      7.794  1.11e-14 *** 
eth.l2    0.03057    0.01211      2.524    0.0117 * 
xrp.l2    0.09026    0.01212      7.447  1.50e-13 *** 
crix.l2  -0.12353    0.02148     -5.751  1.05e-08 *** 
--- 
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1 

Residual standard error: 0.03096 on 1735 degrees of freedom 
Multiple R-Squared: 0.4247, Adjusted R-squared: 0.4221 
F-statistic: 160.1 on 8 and 1735 DF, p-value: < 2.2e-16 

Covariance matrix of residuals: 
          btc       eth       xrp      crix 
btc  0.0016321 0.0013076 0.0010312 0.0006704 
eth  0.0013076 0.0040632 0.0014506 0.0006398 
xrp  0.0010312 0.0014506 0.0045256 0.0005204 
crix 0.0006704 0.0006398 0.0005204 0.0009570 

Correlation matrix of residuals: 
        btc    eth    xrp   crix 
btc  1.0000 0.5078 0.3794 0.5364 
eth  0.5078 1.0000 0.3383 0.3245 
xrp  0.3794 0.3383 1.0000 0.2500 
crix 0.5364 0.3245 0.2500 1.0000 
As we can see only the crix estimation has a 'good' fit.
[Q2] Can I remove parameters that are not significant from this to improve the estimation? Can you think of something else for further improvements?
Now I start with the Impulse Response Analysis.
all.ir <- irf(all.ts.s.model, n.ahead = 8, ortho = FALSE, runs = 1000)
This gives us the following for Bitcoin:
https://imgur.com/a/UHqlinI
[Q3] How do I interpret this correctly? Since we "diff-logged" the time series in the beginning, do we have to undo this step? As I see it: a positive change of one diff-logged standard deviation of bitcoin would result in a change of 0.6 diff-logged standard deviations of the crix index after one day. Is that correct?
[Q4] The following is something I apparently need to do, but I haven't really understood why, and whether the impulse response we did previously is invalid?
Variance-covariance matrix:
          btc          eth          xrp         crix 
btc  0.0016320844 0.001307606 0.0010312328 0.0006703629 
eth  0.0013076059 0.004063220 0.0014505695 0.0006398030 
xrp  0.0010312328 0.001450569 0.0045256047 0.0005203659 
crix 0.0006703629 0.000639803 0.0005203659 0.0009569768 
Since the off-diagonal elements of the estimated variance-covariance matrix are not zero, we can assume that there is contemporaneous correlation between the variables in the VAR model. This is confirmed by the correlation matrix, which corresponds to:
          btc       eth       xrp      crix 
btc  1.0000000 0.5077740 0.3794435 0.5363991 
eth  0.5077740 1.0000000 0.3382712 0.3244595 
xrp  0.3794435 0.3382712 1.0000000 0.2500460 
crix 0.5363991 0.3244595 0.2500460 1.0000000 
Therefore we need to decompose the variance-covariance matrix into a lower triangular matrix with positive diagonal elements:
> t(chol(all.ts.s.model_summary$covres)) 
            btc         eth         xrp       crix 
btc  0.04039906 0.000000000 0.000000000 0.00000000 
eth  0.03236723 0.054914313 0.000000000 0.00000000 
xrp  0.02552615 0.011369686 0.061194366 0.00000000 
crix 0.01659353 0.001870487 0.001234266 0.02601172 
[Q5] This is done automatically if we set ortho=TRUE within the irf function, is that correct?
all.oir <- irf(all.ts.s.model, n.ahead = 8, ortho = TRUE, runs = 1000, seed = 12345)
which gives us for bitcoin:
https://imgur.com/a/jCEeIBP
[Q6] Now the effect on the crix index after one day is just 0.02. That's a massive difference and I wonder how to deal with this.

What do you guys think? Is my method overall correct? I'm grateful for any advice! Cheers and all the best
Anton
submitted by anton_b_j to rstats [link] [comments]

[For Hire] Hire a Professional Web Designer with Wordpress & Html Skills

I'm a professional web designer with knowledge in Wordpress, HTML, PHP, JavaScript, Python (bot creator) & SEO Expert
Get a professional Website Setup and configured starting @ $299 only.
Price depends on the project and functionality of the site or service you need.
We make:
*. Website & Bot Samples Available! * Paypal & Bitcoin Acceptable
Add me on Skype at kanonig let's discuss your next project.
submitted by kanonig to Jobs4Bitcoins [link] [comments]

Staking in Ethereum 2.0: when will it appear and how much can you earn on it?

Why coin staking will be added in Ethereum 2.0

A brief educational program for those who do not follow the updates to Vitalik Buterin's project. Ethereum has long been in need of updating, and the main problem of the network is scalability: the blockchain is overloaded, transactions are slowing down, and the cost of “gas” (transaction fees) is growing. If the consensus algorithm is not updated, the network will someday cease to be operational. To avoid this, developers have been working for several years on moving the network from PoW to Ethereum 2.0, running on PoS. This should make the network more scalable, faster and cheaper. In December last year, the first upgrade phase, Istanbul, was implemented in the network, and in April of this year the Topaz test network with the possibility of staking was launched - the first users already earned 1%. In the PoS algorithm that Ethereum is switching to, there is no mining; validation happens through users delegating their coins to validating nodes (masternodes). For the duration of the delegation, these coins are frozen, and for providing their funds for block validation, users receive a portion of the reward. This is staking - a crypto analogue of a bank deposit. There are several types of staking - with income from dividends or from masternodes - but in all of them what matters is not the device's computing power, as in PoW algorithms, but the number of coins held. The more coins, the higher the income. For crypto investors, staking is an opportunity to receive passive income from blocked coins. It is assumed that the launch of staking:
  • Will make ETH mining more affordable, but less resource intensive;
  • Will make the network more secure and secure - attacks will become too expensive;
  • Will create an entirely new sector of steak infrastructure around the platform;
  • Provides increased scalability, which will create the opportunity for wider implementation of DeFi protocols;
  • And, most importantly, it will show that Ethereum is a developing project.

The first payments to stakeholders will be one to two years after the launch of the update

The minimum validator stake will be 32 ETH (≈$6092 for today). This is the minimum number of coins that an ETH holder must freeze in order to qualify for payments. Another prerequisite is not to disconnect your wallet from the network. If the user disconnects and goes into automatic mode, he loses his daily income. If at some point the stake drops below 16 ETH, the user will be deprived of the right to be a validator.
The Ethereum network has to go through many more important stages before coin holders can make money on its storage. Collin Myers, the product strategy lead at the Ethereum developer startup ConsenSys, said that the genesis block of the new network will not be mined until the total amount of frozen funds reaches 524,000 ETH ($99.76 million at the time of publication). That many coins would have to be held by 16,375 validators with a minimum deposit of 32 ETH each. Until this moment, none of them will receive a percentage profit. Myers noted that this event is not tied to a specific time and depends on the activity of the community.
All validators will have to freeze a rather significant amount for an indefinite period in the new network, without confidence in the growth of the coin's price. It's hard to say how many people will do this. The developers believe that it will take 12-18 or even 24 months. According to the latest ConsenSys Codefi report, more than 65% of the 300 ETH holders surveyed plan to use the staking opportunity. This sample, of course, is not representative, but it can be assumed that most major coin holders will still be willing to take the chance.
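The genesis threshold quoted above is simply the validator count times the minimum deposit; a quick check in R (the ETH price is the ~$190 figure used later in this article, so the dollar amount comes out slightly below the $99.76 million quoted, which implies a marginally higher price at the time of writing):

min_stake  <- 32      # ETH per validator
validators <- 16375
eth_price  <- 190     # USD, approximate price used elsewhere in this article

genesis_eth <- validators * min_stake    # 524,000 ETH
genesis_usd <- genesis_eth * eth_price   # ≈ $99.6 million at that price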

How much can you earn on Ethereum staking

Developers have been arguing for a long time about what the profitability for validators of the Ethereum 2.0 network should be. The economic model of the network keeps the inflation rate below 1% and dynamically adjusts the reward scale for validators. The difficulty is not to overpay, but also not to pay too little. Profitability will be variable, as it depends on the number and size of stakes, as well as other parameters. The fewer frozen coins and validators, the higher the yield, and vice versa. This is an easy way to motivate users to freeze ETH.
According to Collin Myers' October calculations, after the launch of Ethereum 2.0, validators will be able to receive from 4.6% to 10.3% per annum as a reward for their stake. At the summit he clarified that in the first period after the genesis block it could even reach 20.3%. But as the number of stakes grows, profitability will decline. So, with five million ETH staked, it drops to about 6.6%.
The above numbers are not net returns. They do not include equipment and electricity costs. According to Myers, after the genesis block, the costs of maintaining a validator node will be about 4.75% of the remuneration. They will continue to increase as the number of blocked coins increases, and with five million ETH staked they will grow to about 14.7%. Myers emphasized that profitability will be higher for those who run their own equipment rather than relying on cloud services. The latter, according to his calculations, at current prices can produce a loss of up to minus 15% per year. This, he believes, promotes true decentralization.
At the end of April, Vitalik Buterin said that validators will be able to earn 5% per annum with a minimum stake of 32 ETH - 1.6 ETH per year, or $304 at the time of publication. However, given the cost of freezing funds, the real return will be around 0.8%.

How to calculate profitability from ETH staking

The easiest way to calculate the estimated return from Ethereum staking is to use a dedicated calculator - for example, the online services EthereumPrice or Stakingrewards. The service takes into account the latest network profitability indicators, as well as additional characteristics: the node's uptime on the network, the price of the coin, the share of blocked ETH and so on. Depending on these values, the validator's profit can vary greatly. For example, say you stake 32 ETH at today's coin price of $190, 1% of all coins are blocked, and the node is online 99% of the time. According to the EthereumPrice calculator, in this case your yield would be 14.25% per annum, or 4.56 ETH.
Validator earnings from the example above for 10 years according to EthereumPrice.
If we change the inputs - the same stake, but with 10% of coins blocked - the annual yield is only 4.51%, or 1.44 ETH.
Validator earnings from the second example over 10 years according to EthereumPrice.
It is important to note that this is profitability excluding expenses. Real returns will be significantly lower, and in the second case may be negative. In addition, you must consider exchange-rate fluctuations: even with a yield of 14% per annum in ETH, dollar-denominated returns may be negative in a bear market.
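A simplified version of what such a calculator does, using the figures from the two examples above (gross yield only; node costs and price swings are ignored, as noted):

stake <- 32   # ETH

annual_reward <- function(stake, gross_yield) stake * gross_yield

annual_reward(stake, 0.1425)   # example 1: 1% of supply staked  -> ~4.56 ETH per year
annual_reward(stake, 0.0451)   # example 2: 10% of supply staked -> ~1.44 ETH per year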

When will the transition to Ethereum 2.0 start

Ben Edgington from Teku, the operator of Ethereum 2.0, said at the last summit that the transition to PoS could be launched in July this year. These deadlines, barring new delays, were also mentioned by experts of the BitMEX crypto exchange in their recent report on the transition of the Ethereum ecosystem to stage 2.0. However, on May 12, Vitalik Buterin denied the possibility of launching Ethereum 2.0 in July. The network is not yet ready and is unlikely to be launched before the end of the year. July 30 marks the 5th anniversary of the launch of Ethereum. Unfortunately, it seems that the update will not be ready in time for the anniversary either.
Full deployment of the update will consist of several stages.
Phase 0. Beacon chain. The "zero" phase, which could be launched in July this year. In fact, it will only be a network test and PoS testing without economic activity, but it will use the new ETH coins and the possibility of staking will appear. The "zero" phase will test the first layer of the Ethereum 2.0 architecture - Lighthouse. This is the Ethereum 2.0 client in Rust, developed back in 2018.
Phase 1. Sharding - the rejection of full nodes in favour of load balancing between all network nodes (shards). This should increase network bandwidth and solve the scalability problem. This is the first full phase of Ethereum 2.0. It will initially be deployed with 64 shards. It is because of sharding that the transition of the network to its new state is so complicated - existing smart contracts cannot be transferred to the new network. Therefore, at first, perhaps for several years, both networks will exist simultaneously.
Phase 2. State execution. In this phase, various applications will work, and it will be possible to conclude smart contracts. This is a full-fledged working Ethereum 2.0 network.
After the second phase, two networks will work in parallel - Ethereum and Ethereum 2.0. Coin holders will be able to transfer ETH from the first to the second, without the ability to transfer them back. To stimulate network support, coin emissions in both networks will increase until they merge. Read more about the phases of the transition to state 2.0 in the aforementioned BitMEX report.

How the upgrade to Ethereum 2.0 will affect the staking market and coin price

The transition of the second-largest coin to PoS will dramatically expand the staking market. A 32 ETH deposit is too large for most users, so we should expect more staking offers from exchanges. Bitcoin Suisse, the largest Swiss crypto exchange, has already announced the launch of such a service in November; it will have no minimum deposit and will charge a 15% commission. According to October estimates by Binance Research analysts, Ethereum's transition to 2.0 could double the price of the coin and the share of staking in the market, and would make ETH the most popular PoS currency. Adam Cochran, partner at MetaCartel Ventures DAO and a developer at DuckDuckGo, argued in his blog that Ethereum's transition to 2.0 would be the "biggest event" of the cryptocurrency market. He believes that a 3-5% return will attract capital from large investors, while fear of missing out (FOMO) among retail investors will push them to buy coins actively. The planned mechanism of burning coins with each transaction will reduce the potential oversupply.

However, the BitMEX analysts in the report mentioned above believe that the network upgrade will not be as important an event as many expect, and will not have a significant impact on the coin price or the staking market. Initially it will be more of a test of the PoS system than a full-fledged network: there will be no economic activity or smart contracts, and staking interest will not be paid out immediately. Therefore most economic activity will remain on the original Ethereum network, which will run in parallel with the new one. The exchange's analysts stressed that, thanks to the addition of staking, a large number of ETH will at first (briefly, in their view) be locked up on the network. Most likely this will limit the supply of coins and push the price up. However, it could also release some of the ETH locked in smart contracts, in which case the price will not rise. Moreover, the authors of the report are not sure that demand for the coin will be long-term and stable. For that to happen, PoS and sharding must prove that they work reliably and deliver the benefits for which the upgrade was started. If they do, a wave of smart-contract and DeFi protocol developers awaits the network.

In any case, quick changes should not be expected. A full transition to Ethereum 2.0 will take years and won't be smooth - network failures are inevitable. We also believe that Ethereum staking should not be treated as yet another panacea for all the problems of the coin and the market. Most likely, the network's transition to PoS will not significantly affect the staking market, but it may have a positive effect on the coin's price. Counting on an ETH rally in anticipation of it, however, is too optimistic.
Subscribe to our Telegram channel
submitted by Smart_Smell to Robopay [link] [comments]

Bull Bitcoin’s Dollar-Cost Averaging tool for Canadians: a detailed overview

Hello fellow Canadian Bitcoiners!
I'm Francis Pouliot, CEO and founder of Bull Bitcoin (previously known as Bitcoin Outlet) and Bylls.
I haven't been active on Reddit for a while but I thought I'd pop back here to let the community know about our new dollar-cost averaging feature, "Recurring Buy"
This post is a copy of my most recent medium article which you can read here if you want to see the screenshots. https://medium.com/bull-bitcoin/bull-bitcoins-dollar-cost-averaging-tool-for-canadians-the-right-time-to-buy-bitcoin-is-every-day-82a992ca22c1
Thanks in advance for any feedback and suggestions!
[Post starts here]
The Bull Bitcoin team is constantly trying to reduce the frictions ordinary people face when investing in Bitcoin and propose innovative features which ensure our users follow Bitcoin best practices and minimize their risks.
We are particularly excited and proud about our latest feature: an automated Bitcoin dollar-cost averaging tool which we dubbed “Recurring Buy”.
The Recurring Buy feature lets Bull Bitcoin users create an automated schedule that will buy Bitcoin every day using the funds in their account balance and send the Bitcoin directly to their Bitcoin wallet straight away.
We put a lot of thought in the implementation details and striking the right trade-offs for a simple and elegant solution. Our hope is that it will become a standard other Bitcoin exchanges will emulate for the benefit of their users. This standard will certainly evolve over time as we accumulate feedback and operational experience.
In this article, I cover:
The problem that we are trying to solve
Recurring Buy feature details, processes and instructions
The rationale (and tradeoffs) behind the main feature design choices
Bull Bitcoin is only available to Canadians, but non-Canadians who wish to see how it works are welcome to make a Bull Bitcoin account and check it out for themselves. You will be able to go through the process of creating a schedule for testing purposes, but you won't be able to fund your account and actually purchase Bitcoin.
What problems does Dollar-Cost Averaging solve?
The most common concern of Bitcoin investors is, not surprisingly, “when is the right time to buy Bitcoin?”. Bitcoin is indeed a very volatile asset. A quick glance at a Bitcoin price chart shows there are without a doubt “worse times” and “better times” to invest in Bitcoin. But is that the same as the “right” time?
Gurus, analysts and journalists continuously offer their theories explaining what affects the Bitcoin price, supported by fancy trading charts and geopolitical analysis, further reinforcing the false notion that it is possible to predict the price of Bitcoin.
Newbies are constantly bombarded with mainstream media headlines of spectacular gains and devastating losses. For some, this grows into an irresistible temptation to get rich quick. Others become crippled with the fear of becoming “the sucker” on which early adopters dump their bags.
Veterans are haunted by past Bitcoin purchases which were quickly followed by a crash in the price. “I should have waited to buy the dip…”
Many Bitcoin veterans and long-term investors often shrug off the question of when is the right time to buy with the philosophy: “just hodl”. But even those holding until their death will recognize that buying more Bitcoin for the same price is a better outcome.
Given the very high daily volatility of Bitcoin, a hodler can find himself, many years later, with significantly less wealth just because he once bought Bitcoin on a Monday instead of a Wednesday. His options are either to leave it up to chance or to attempt to "time the market" and "buy the dip", which can turn into a stressful trading obsession, irrational decisions (which have a negative impact on budget, income and expenses) and severe psychological trauma. In addition, trying to "buy the dip" is often synonymous with keeping large amounts of fiat on an exchange to be ready for "when the time comes".
There must be a better way.
Bitcoin investors should be rewarded for having understood Bitcoin’s long-term value proposition early on, for having taken the risk to invest accordingly and for having followed best practices. Not for being lucky.
Overview of features and rules
In this section I go into every detail of the Recurring Buy feature. In the following section, I focus on explaining why we chose this particular user experience.
The user first decides his target investment amount. Ideally, this is a monthly budget or yearly budget he allocates to investing in Bitcoin based on his projected income and expenses.
The user then chooses either the duration of the Recurring Buy schedule or the daily purchase amount. The longer the better.
The frequency is each day and cannot be modified.
The user must submit a Bitcoin address before activating a Recurring Buy schedule. By default, every transaction will be sent to that Bitcoin address. It’s the fallback address in case they don’t provide multiple addresses later.
Once the user has filled the form with target amount, the duration and the Bitcoin address, he can activate the Recurring Buy Schedule.
The user is not required to already have funds in his account balance to activate the schedule.
We will randomly select a time of day at which his transaction will be processed (every hour, so 24 possible times). If the user insists on another time of day, he can cancel his Recurring Buy schedule and try again.


The Recurring Buy feature as displayed on bullbitcoin.com/recurring-buys
The schedule is then displayed to the user, showing the time and date at which future transactions will take place. The user will also be able to see how long his current balance will last.
He can follow the progress of the dollar-cost averaging schedule, monitor in real time his average acquisition cost, and audit each transaction individually.
At this point, the user can and should change the Bitcoin address of his next transactions to avoid address re-use. Address re-use is not forbidden, but it is highly discouraged.
After having modified the Bitcoin addresses, there is nothing left for the user to do except watch the bitcoins appear in his Bitcoin wallet every day!
The Bitcoins are sent right away at the time of purchase.
Bitcoin transactions using the Recurring Buy feature will have the lowest possible Bitcoin network transaction fee to avoid creating upwards pressure on the fee market and impacting other network users.


What users see after first activating a schedule
The Recurring Buy schedule will be cancelled automatically at the time of the next purchase if the balance is insufficient. He can add more funds to his balance whenever he wants.
The Recurring Buy schedule will continue until the target amount is reached or until the account balance runs out.
The user can cancel his Recurring Buy schedule whenever he wants.
If the user wants to change the amount or duration of the schedule, he can simply cancel his current schedule and create a new one.
Each schedule has a unique identifier so that users can keep track of various schedules they perform over time.
Once a schedule is completed, either fully or partially, a summary will be provided which shows the number of transactions completed, the average acquisition cost, the total amount of Bitcoin purchased and the total amount of fiat spent. Useful for accounting!


A partially completed Recurring Buy schedule cancelled after 9 days due to insufficient funds
Thought process behind our design choices
Recurring Bitcoin Purchases vs. Recurring Payment/Funding
The first and most important design choice was to separate the processes of funding the account balance with fiat (the payment) from the process of buying Bitcoin (the purchase). Users do not need to make a bank transaction every time they do a Bitcoin purchase. They first fund their account manually on their own terms, and the recurring purchases are debited from their pre-funded account balance.
Another approach would have been to automatically withdraw fiat from the user’s bank account (e.g. a direct debit or subscription billing) for each transaction (like our friends at Amber) or to instruct the user to set-up recurring payments to Bull Bitcoin from their bank account (like our friends at Bittr). The downside of these strategies is that they require numerous bank transactions which increases transaction fees and the likelihood of triggering fraud and compliance flags at the user’s bank. However, this does remove the user’s need to keep larger amounts of fiat on the exchange and reduces the friction of having to make manual bank payments.
Bull Bitcoin is currently working on a separate “Recurring Funding” feature that will automatically debit fiat from the user’s bank accounts using a separate recurring schedule with a minimum frequency of once a week, with a target of once every two weeks or once a month to match the user’s income frequency. This can, and will, be used in combination from the “Recurring Buy” feature, but both can be used separately.
The ultimate experience that we wish to achieve is that users will automatically set aside, each paycheck (two weeks), a small budget to invest in Bitcoin using the “Recurring Funding” feature which is sufficient to refill their account balance for the next two weeks of daily recurring purchases.
Frequency of transactions
The second important decision was about customizing the frequency of the schedule. We decided to make it "each day" only. This is specifically to ensure users have a large enough sample size and remain consistent, which are the two key components of a successful dollar-cost averaging strategy.
A higher amount of recurring transactions (larger sample size) will result in the user’s average acquisition being closer to the actual average Bitcoin price over that period of time. Weekly or monthly recurring purchases can provide the same effectiveness if they are performed over a duration of time which is 7x longer (weekly) or 30x longer (monthly).
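To illustrate the sample-size point, here is a small Python sketch comparing the average acquisition cost of daily purchases against a single lump-sum buy over a hypothetical 90-day price series. The prices are randomly generated and purely illustrative; the point is simply that a daily schedule tracks the period's average price rather than the luck of a single day.

```python
import random

def dca_average_cost(prices, daily_fiat):
    """Average acquisition cost (fiat per BTC) when buying a fixed fiat amount each day."""
    btc_bought = sum(daily_fiat / p for p in prices)
    return (daily_fiat * len(prices)) / btc_bought

# Hypothetical daily CAD prices over 90 days -- illustrative numbers only.
random.seed(1)
prices = [12_000 * (1 + random.uniform(-0.10, 0.10)) for _ in range(90)]

print(f"DCA average cost:   {dca_average_cost(prices, daily_fiat=20):,.0f}")
print(f"Lump sum on day 1:  {prices[0]:,.0f}")
print(f"Period average:     {sum(prices) / len(prices):,.0f}")
```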
It is our belief that the longer the duration of the schedule, the more likely the user is to cancel the recurring buy schedule in order to “buy the dip”. Dollar-cost averaging is boring, and watching sats appear in the wallet every day is a good way to reduce the temptation of breaking the consistency.
We do not force this on users: they can still cancel the schedule if they want and go all-in. We consider it more of a gentle nudge in the right direction.
Frequency of withdrawals (one purchase = one bitcoin transaction)
This is one of the most interesting design choices because it is a trade-off between scalability (costs), privacy and custody. Ultimately, we decided that trust-minimization (no custody) and privacy were the most important at the expense of long-term scalability and costs.
Realistically, Bitcoin network fees are currently low and we expect them to remain low for the near future, although they will certainly increase massively over the long-term. One of the ways we mitigated this problem was to select the smallest possible transaction fee for transactions done in the context of Recurring Buy, separate from regular transaction fees on regular Bitcoin purchases (which, at Bull Bitcoin, are very generous).
Note: users must merge their UTXOs periodically to avoid being stuck with a large amount of small UTXOs in the future when fees become more expensive. This is what makes me most uncomfortable about our solution. I hope to also solve this problem, but it is ultimately something Bitcoin wallets need to address as well. Perhaps an automated tool in Bitcoin wallets which merges UTXOs periodically when the fees are low? Food for thought.
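To give a rough sense of the numbers behind that recommendation, here is a back-of-the-envelope Python sketch. The vbyte sizes are approximate figures for native segwit (P2WPKH) inputs and outputs, and the feerates are hypothetical; it is a sketch of the trade-off, not a fee estimator.

```python
# Approximate vbyte sizes for a native segwit (P2WPKH) transaction --
# illustrative figures only.
INPUT_VBYTES = 68
OUTPUT_VBYTES = 31
TX_OVERHEAD_VBYTES = 11

def spend_cost_sats(n_inputs: int, n_outputs: int, feerate_sat_vb: float) -> float:
    """Fee in sats to spend n_inputs and create n_outputs at a given feerate."""
    vbytes = TX_OVERHEAD_VBYTES + n_inputs * INPUT_VBYTES + n_outputs * OUTPUT_VBYTES
    return vbytes * feerate_sat_vb

n_utxos = 365                   # e.g. one year of daily recurring buys
low_fee, high_fee = 1, 100      # sat/vB now vs. a hypothetical future fee spike

print(f"Consolidate 365 UTXOs now at 1 sat/vB:     {spend_cost_sats(n_utxos, 1, low_fee):>9,.0f} sats")
print(f"Spend them unconsolidated at 100 sat/vB:   {spend_cost_sats(n_utxos, 2, high_fee):>9,.0f} sats")
print(f"Spend one consolidated UTXO at 100 sat/vB: {spend_cost_sats(1, 2, high_fee):>9,.0f} sats")
```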
When transaction fees and scalability becomes a problem for us, it will have become a problem for all other small payments on the Bitcoin network, and we will use whatever solution is most appropriate at that time.
It is possible that Lightning Network ends up being the scalability solution, although currently it is logistically very difficult to perform automated payouts to users using Lightning, particularly recurring payouts, which require users to create Bolt11 invoices and to convince other peers in the network to open channels and fund channels with them for inbound capacity.
These are the general trade-offs:
Send a Bitcoin transaction for every purchase (what we do) - Most expensive for the exchange - Most expensive for the user (many UTXOs) - Increases Bitcoin Network UTXOs set - Inefficient usage of block space - Most private - Zero custody risk
Keep custody of the Bitcoin until the schedule is over or when the user requests a withdrawal (what Coinbase does) - No additional costs - No blockchain bloating - Same level of privacy - High custody risk
Batch user transactions together at fixed intervals (e.g. every day) - Slightly lower transaction costs for the exchange - Same costs for the user - Slightly more efficient use of block space - Same level of UTXO set bloating - Much lower level of privacy - Slightly higher custody risk
Single address vs multiple addresses vs HD keys (xpubs)
The final decision we had to make was preventing address re-use and allowing users to provide an HD key (xpub) rather than a Bitcoin address.
Address re-use generally decreases privacy because it becomes possible for third-party blockchain snoops to figure out that multiple Bitcoin transactions are going to the same user. But we must also consider that even when transactions are sent to multiple addresses, particularly if the amounts are small, it is highly likely that the user will "merge" the coins into a single transaction when spending from his wallet. Users can always prevent this with Coinjoin, in which case there is a large privacy gain from not re-using addresses compared to using a single address.
It is important to note that this does not decrease privacy compared to regular Bitcoin purchases on Bull Bitcoin outside of "Recurring Buy". Whether a user has one transaction of $1000 going to a Bitcoin address or 10 x $100 going to that same Bitcoin address doesn't reveal any new information about the user other than the fact that he is likely using a dollar-cost averaging mechanism. It is rather a missed opportunity to gain more privacy.
Another smaller decision was whether or not we should ask the user to provide all his addresses upfront before being able to activate the schedule, which would completely remove the possibility of address re-use. We ultimately decided that because this process can take a very long time (imagine doing Recurring Buy every day for 365 days) it is better to let the user do this at his own pace, particularly because he may eventually change his Bitcoin wallet and forget to change the addresses in the schedule.
There are also various legitimate use-cases where users have no choice but to re-use the same address. A discussion for another day!
Asking the user to provide an XPUB is a great solution to address re-use. The exchange must dynamically derive a new Bitcoin address for the user at each transaction, which is not really a technical challenge. As far as I can tell, Bittr is the only Bitcoin exchange which has implemented this technique. Kudos!
It is however important that the user doesn’t reuse this XPUB for anything else, otherwise the exchange can track his entire wallet balance and transaction history.
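For readers curious what this looks like on the exchange side, here is a minimal Python sketch of the bookkeeping involved: one derivation index per schedule, incremented after every payout so no address is ever reused. The derive_address function is a stand-in for real BIP32 derivation and the class is a hypothetical illustration, not a description of Bull Bitcoin's or Bittr's actual code.

```python
def derive_address(xpub: str, index: int) -> str:
    """Stand-in for real BIP32 derivation of the child address at path 0/<index>."""
    return f"addr({xpub}/0/{index})"

class RecurringBuySchedule:
    """Hypothetical exchange-side bookkeeping: a fresh address for every payout."""

    def __init__(self, xpub: str):
        self.xpub = xpub
        self.next_index = 0

    def next_payout_address(self) -> str:
        address = derive_address(self.xpub, self.next_index)
        self.next_index += 1  # never hand out the same address twice
        return address

schedule = RecurringBuySchedule(xpub="xpub-provided-by-user")
for day in range(3):
    print(f"day {day}: send BTC to {schedule.next_payout_address()}")
```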
It is worth noting that not all wallets support HD keys or have HD keys by default (e.g. Bitcoin Core). So it is imperative that we offer the option to give Bitcoin addresses. We believe there is a lot of potential to create wallet coordination mechanisms between senders and recipients which would make this process a lot more streamlined.
In the future, we will certainly allow users to submit an XPUB instead of having to manually input a different address. But for now, we wanted to reduce the complexity to a minimum.
Conclusion: personal thoughts
I have a somewhat unique perspective on Bitcoin users due to the fact that I worked at the Bitcoin Embassy for almost 4 years. During this time, I had the opportunity to discuss face-to-face with thousands of Bitcoin investors. One of my favourite anecdotes is a nocoiner showing up at our office in December 2013 with a bag full of cash attempting to buy Bitcoin, “I know how to read a chart”, furious after being turned away. Many people who went “all-in” for short-term gains (usually altcoins) would show up to the Bitcoin Embassy office months later with heart-breaking stories.
This isn’t what I signed up for. My goal is to help people opt-out of fiat and, ultimately, to destroy the fiat currency system entirely.
This instilled in me a deep-rooted concern for gambling addiction and strong aversion to “trading”. I do not believe that Bitcoin exchanges should blindly follow “what the market dictates”. More often than not, what dictates the market is bad habits users formed because of the other Bitcoin services they used in the past, what other people are used to, and what feels familiar. Running a Bitcoin company should be inseparable from educating users on the best practices, and embedding these best practices into the user experience is the best way for them to learn.
Another important anecdote which motivated me to build a dollar-cost averaging tool is a person very close to me that had made the decision to buy Bitcoin, but was so stressed out about when was the right time to buy that they ended up not buying Bitcoin for a whole 6 months after funding their Bull Bitcoin account. That person eventually gave up and ultimately invested a large amount all at once. In hindsight, it turned out to be one of the worst possible times to invest in Bitcoin during that year.
Investing in Bitcoin can, and should be, a positive and rewarding experience.
Buying Bitcoin every day is the right strategy, but it does not necessarily lead to the best outcome.
The reality is that the best time to buy Bitcoin is when the market hits rock bottom (obviously). Sometimes, the upside from buying the dip can be much bigger than the risk (e.g. when the price dropped below $200 in 2015). But these are exceptions rather than the rule. And the cost of chasing dips is very high: stress, investing time and mental energy, and the very real psychological trauma which results from making bad trading decisions. Ultimately, it's better to do the right thing than to be lucky, but it's not always a bad idea to cheat on your dollar-cost averaging from time to time if you can live with the costs and consequences.
Yours truly,
Francis
submitted by FrancisPouliot to BitcoinCA [link] [comments]

Quant Network: Token valuation dynamics and fundamentals

Quant Network: Token valuation dynamics and fundamentals
This post intends to illustrate the dynamics and fundamentals related to the mechanics and use of the Quant Network Utility Token (QNT), in order to provide the community with greater clarity around what holding the token actually means.
This is a follow-up on two articles David W previously wrote about Quant Network's prospects and potential, which you can find listed in the appendix at the end of this post.
For holders not intending to use Overledger for business reasons, the primary goal of holding the QNT token is to benefit from price appreciation. Some are happy to believe that speculation will take the QNT price to much higher levels if and when large-scale adoption/implementation news comes out, whilst others may actually prefer to assess the token’s utility and analyse how it would react to various scenarios to justify a price increase based on fundamentals. The latter is precisely what I aim to look into in this article.
On that note, I have noticed that many wish to see institutional investors getting involved in the crypto space for their purchase power, but the one thing they would bring and that is most needed in my opinion is fundamental analysis and valuation expectations based on facts. Indeed, equity investors can probably access 20 or 30 reports that are 15 pages long and updated on a quarterly basis about any blue chip stock they are invested in, but how many of such (professional) analyst reports can you consult for your favorite crypto coins? Let me have a guess: none. This is unfortunate, and it is a further reason to look into the situation in more details.
To be clear, this article is not about providing figures on the expected valuation of the token, but rather about providing the community with a deeper analysis to better understand its meaning and valuation context. This includes going through the (vast) differences between a Utility Token and a Company Share since I understand it is still blurry in some people’s mind. I will incorporate my thoughts and perspective on these matters, which should not be regarded as a single source of truth but rather as an attempt to “dig deeper”.
In order to share these thoughts with you in the most pertinent manner, I have actually entirely modelled the Quant Treasury function and analysed how the QNT token would react to various scenarios based on a number of different factors. That does not mean there is any universal truth to be told, but it did help in clarifying how things work (with my understanding of the current ruleset at least, which may also evolve over time). This is an important safety net: if the intensity of speculation in crypto markets was to go lower from here, what would happen to the token price? How would Quant Treasury help support it? If the market can feel comfortable with such situation and the underlying demand for the token, then it can feel comfortable to take it higher based on future growth expectations — and that’s how it should be.
Finally, to help shed light on different areas, I must confess that I will have to go through some technicalities on how this all works and what a Utility Token actually is. That is the price to pay to gain that further, necessary knowledge and be in a position to assess the situation more thoroughly — but I will make it as readable as I possibly can, so… if you are ready, let's start!

A Utility Token vs. a Company Share: what is the difference?

It is probably fair to say that many people involved in the crypto space are unfamiliar with certain key financial terms or concepts, simply because finance is not necessarily everyone’s background (and that is absolutely fine!). In addition, Digital Assets bring some very novel concepts, which means that everyone has to adapt in any case.
Therefore, I suggest we start with a comparison of the characteristics underpinning the QNT Utility Token and a Quant Network Company Share (as you may know, the Company Shares are currently privately held by the Quant Network founders). I believe it is important to look at this comparison for two reasons:
  1. Most people are familiar with regular Company Shares because they have been traded for decades, and it is often asked how Utility Tokens compare.
  2. Quant Network have announced a plan to raise capital to grow their business further (in the September 2019 Forbes article which you can find here). Therefore, regardless of whether the Share Offering is made public or private, I presume the community will want to better understand how things compare and the different dynamics behind each instrument.
So where does the QNT Utility Token sit in Quant Network company and how does it compare to a Quant Network Company Share? This is how it looks:
https://preview.redd.it/zgidz8ed74y31.png?width=1698&format=png&auto=webp&s=54acd2def0713b67ac7c41dae6c9ab225e5639fa
What is on the right hand side of a balance sheet is the money a company has, and what is on the left hand side is how it uses it. Broadly speaking, the money the company has may come from the owners (Equity) or from the creditors (Debt). If I were to apply these concepts to an individual (you!), “Equity” is your net worth, “Debt” is your mortgage and other debt, and “Assets” is your house, car, savings, investments, crypto, etc.
As you can see, a Company Share and a Utility Token are found in different parts of the balance sheet — and that, in itself, is a major difference! They indeed serve two very different purposes:
  • Company Shares: they represent a share of a company’s ownership, meaning that you actually own [X]% of the company ([X]% = Number of shares you possess / Total number of shares) and hence [X]% of the company’s assets on the left hand side of the balance sheet.
  • Utility Tokens: they are keys to access a given platform (in our case, Quant Network’s Operating System: Overledger) and they can serve multiple purposes as defined by their Utility Document (in QNT’s case, the latest V0.3 version can be found here).
As a consequence, as a Company Shareholder, you are entitled to receive part or all of the profits generated by the company (as the case may arise) and you can also take part in the management decisions (indeed, with 0.00000001% of Apple shares, you have the corresponding right to vote to kick the CEO out if you want to!).
On the other hand, as a Utility Token holder, you have no such rights related to the company’s profits or management, BUT any usage of the platform has to go through the token you hold — and that has novel, interesting facets.

A Utility Token vs. a Company Share: what happens in practice?

Before we dig further, let’s now remind ourselves of the economic utilities of the QNT token (i.e. in addition to signing and encrypting transactions):
  1. Licences: a licence is mandatory for anyone who wishes to develop on the Overledger platform. Enterprises and Developers pay Quant Network in fiat money and Quant Treasury subsequently sets aside QNT tokens for the same amount (a diagram on how market purchases are performed can be found on the Overledger Treasury page here). The tokens are locked for 12 months, and the current understanding is that the amount of tokens locked is readjusted at each renewal date to the prevailing market price of QNT at the time (this information is not part of the Utility Token document as of now, but it was given in a previous Telegram AMA so I will assume it is correct pending further developments). A small sketch of this locking mechanic follows this list.
  2. Usage: this relates to the amount of Overledger read and write activity performed by clients on an ongoing basis, and also to the transfer of Digital Assets from one chain to another, and it follows a similar principle: fiat money is received by Quant Network, and subsequently converted in QNT tokens (these tokens are not locked, however).
  3. Gateways: information about Gateways has been released through the Overledger Network initiative (see dedicated website here), and we now know that the annual cost for running a Gateway will be 500 QNT whilst Gateway holders will receive a percentage of transaction fees going through their setup.
  4. Minimum holding amounts: the team has stated that there will be a minimum QNT holding amount put in place for every participant of the Overledger ecosystem, although the details have not been released yet.
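To make point 1 above more tangible, here is a tiny Python sketch of the locking mechanic as I understand it: the fiat licence fee is converted into QNT at the prevailing price and locked for 12 months, and the locked amount is readjusted to the new prevailing price at renewal. The fee and prices are entirely hypothetical.

```python
def qnt_locked_for_licence(licence_fee_fiat: float, qnt_price: float) -> float:
    """QNT set aside by Quant Treasury for one licence year at the prevailing price."""
    return licence_fee_fiat / qnt_price

fee = 10_000  # hypothetical annual licence fee in fiat
print(qnt_locked_for_licence(fee, qnt_price=5.0))   # 2000.0 QNT locked in year 1
# At renewal the locked amount is readjusted to the new prevailing price:
print(qnt_locked_for_licence(fee, qnt_price=20.0))  # 500.0 QNT locked in year 2
```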
That being said, it now becomes interesting to illustrate with indicative figures what actually happens as Licences, Usage and Gateways are paid for and Quant Network company operates. The following diagram may help in this respect:
Arbitrary figures from myself (i.e. no currency, no unit), based on an indicative 20% Net Income Ratio and a 40% Dividend yield
We have now two different perspectives:
  • On the right hand side, you see the simplified Profit & Loss account (“P&L”) which incorporates Total Revenues, from which costs and taxes are deducted, to give a Net Income for the company. A share of this Net Income may be distributed to Shareholders in the form of a Dividend, whilst the remainder is accounted as retained profits and goes back to the balance sheet as Equity to fund further growth for instance. Importantly, the Dividend (if any) is usually a portion of the Net Income so, using an indicative 40% Dividend yield policy, shareholders receive here for a given year 80 out of total company revenues of 1,000.
  • On the left hand side, you see the QNT requirements arising from the Overledger-related business activity which equal 700 here. Note that this is only a portion of the Total Revenues (1,000) you can see on the right hand side, as the team generates income from other sources as well (e.g. consultancy fees) — but I assume Overledger will represent the bulk of it since it is Quant Network’s flagship product and focus. In this case, the equivalent fiat amount of QNT tokens represents 700 (i.e. 100% of Overledger-related revenues) out of the company’s Total Revenues of 1,000. It is to be noted that excess reserves of QNT may be sold and generate additional revenues for the company, which would be outside of the Overledger Revenues mentioned above (i.e. they would fall in the “Other Revenues” category).
A way to summarise the situation from a very high level is: as a Company Shareholder you take a view on the company’s total profits whereas as a Utility Token holder you take a view on the company’s revenues (albeit Overledger-related).
It is however too early to reach any conclusion, so we now need to dig one level deeper again.

More considerations around Company Shares

As we discussed, with a Company Share, you possess a fraction of the company’s ownership and hence you have access to profits (and losses!). So how do typical Net Income results look in the technology industry? What sort of Dividend is usually paid? What sort of market valuations are subsequently achieved?
Let’s find out:
https://preview.redd.it/eua9sqlt74y31.png?width=2904&format=png&auto=webp&s=3500669942abf62a0ea1c983ab3cea40552c40d1
As you can see, the typical Net Income Ratio varies between around 10% and 20% in the technology/software industry (using the above illustrated peer group). The ratio illustrates the proportion of Net Income extracted from Revenues.
In addition, money is returned to Company Shareholders in the form of a Dividend (i.e. a portion of the Net Income) and in the form of Share repurchases (whereby the company uses its excess cash position to buy back shares from Shareholders and hence diminish the number of Shares available). A company may however prefer to not redistribute any of the profits, and retain them instead to fund further business growth — Alphabet (Google) is a good example in this respect.
Interestingly, as you can see on the far right of the table, the market capitalisations of these companies reflect high multiples of their Net Income as investors expect the companies to prosper in the future and generate larger profits. If you wished to explore these ideas further, I recommend also looking into the Return on Equity ratio which takes into account the amount of resources (i.e. Capital/Equity) put to work to generate the companies’ profits.
It is also to be noted that the number of Company Shares outstanding may vary over time. Indeed, aside from Share repurchases that diminish the number of Shares available to the market, additional Shares may be issued to raise additional funds from the market hence diluting the ownership of existing Shareholders.
Finally, (regular) Company Shares are structured in the same way across companies and industries, which brings a key benefit of having them easily comparable/benchmarkable against one another for investors. That is not the case for Utility Tokens, but they come with the benefit of having a lot more flexible use cases.

More considerations around the QNT token

As discussed, the Utility Token model is quite novel and each token has unique functions designed for the system it is associated with. That does not make value assessment easy, since all Utility Tokens are different, and this is a further reason to have a detailed look into the QNT case.
https://preview.redd.it/b0xe0ogw74y31.png?width=1512&format=png&auto=webp&s=cece522cd7919125e199b012af41850df6d9e9fd
As a start, all assets that are used in a speculative way embed two components into their price:
A) one that represents what the asset is worth today, and
B) one that represents what it may be worth in the future.
Depending on whether the future looks bright or not, a price premium or a price discount may be attached to the asset price.
This is similar to what we just saw with Company Shares valuation multiples, and it is valid across markets. For instance, Microsoft generates around USD 21bn in annual Net Income these days, but the cost of acquiring it entirely is USD 1,094bn (!). This speculative effect is particularly visible in the crypto sector since valuation levels are usually high whilst usage/adoption levels are usually low for now.
So what about QNT? As mentioned, the QNT Utility model has novel, interesting facets. Since QNT is required to access and use the Overledger system, it is important to appreciate that Quant Network company has three means of action regarding the QNT token:
  1. MANAGING their QNT reserves on an ongoing basis (i.e. buying or selling tokens is not always automatic, they can allocate tokens from their own reserves depending on their liquidity position at any given time),
  2. BUYING/RECEIVING QNT from the market/clients on the back of business activity, and
  3. SELLING QNT when they deem their reserves sufficient and/or wish to sell tokens to cover for operational costs.
Broadly speaking, the above actions will vary depending on business performance, the QNT token price and the Quant Network company’s liquidity position.
We also have to appreciate what the QNT distribution will always look like; it can be broken down as follows:
https://preview.redd.it/f20h7hvz74y31.png?width=1106&format=png&auto=webp&s=f2f5b63272f5ed6e3f977ce08d7bae043851edd1
A) QNT tokens held by the QNT Community
B) QNT tokens held by Quant Network that are locked (i.e. those related to Licences)
C) QNT tokens held by Quant Network that are unlocked (i.e. those related to other usage, such as consumption fees and Gateways)
D) the minimum QNT amount held by all users of the platform (more information on this front soon)
So now that the situation is set, how would we assess Quant Network’s business activity effect on the QNT token?
STEP 1: We would need to define the range of minimum/maximum amounts of QNT which Quant Network would want to keep as liquid reserves (i.e. unlocked) on an ongoing basis. This affects key variables such as the proportion of market purchases vs. the use of their own reserves, and the amount of QNT sold back to the market. Also, interestingly, if Quant Network never wanted to keep less than, for instance, 1 million QNT tokens as liquid reserves, these 1 million tokens would have a similar effect on the market as the locked tokens because they would never be sold.
STEP 2: We would need to define the amount of revenues that are related to QNT. As we know, Overledger Licences, Usage and Gateways generate revenues converted into QNT (or in QNT directly). So the correlation is strong between revenues and QNT needs. Interestingly, the cost of a licence is probably relatively low today in order to facilitate adoption and testing, but it will surely increase over time. The same goes for usage fees, especially as we move from testing/pilot phases to mass implementation. The number of clients will also increase. The Community version of Overledger is also set to officially launch next year. More information on revenue potential can be found later in this article.
STEP 3: We would need to define an evolution of the QNT token price over time and see how things develop with regards to Quant Network’s net purchase/sale of tokens every month (i.e. tokens required - tokens sold = net purchased/sold tokens).
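Below is a minimal Python sketch of the monthly mechanics the three steps describe. Every figure in it is hypothetical (revenues, prices, reserve band, unlocking tokens) and the real Quant Treasury ruleset is more involved; the sketch only illustrates the direction of the flows: the lower the QNT price relative to revenues, the more tokens have to be bought from the market, while excess reserves can be sold back.

```python
def treasury_month(revenue_fiat, qnt_price, unlocking_qnt,
                   reserves, min_reserves, max_reserves):
    """One month of simplified, hypothetical Quant Treasury mechanics.

    Tokens needed for the month's Overledger business are served from liquid
    reserves first; tokens unlocking from year-old licences flow back in.
    Returns (net_market_flow, new_reserves): positive = bought from the
    market, negative = excess sold back.
    """
    qnt_needed = revenue_fiat / qnt_price
    reserves += unlocking_qnt - qnt_needed
    net_flow = 0.0
    if reserves < min_reserves:        # reserves depleted -> buy support for QNT
        net_flow = min_reserves - reserves
    elif reserves > max_reserves:      # reserves inflated -> excess may be sold
        net_flow = max_reserves - reserves
    reserves += net_flow
    return net_flow, reserves

reserves = 1_500_000.0
months = [
    (2_000_000, 2.0, 100_000),    # low QNT price: many tokens needed
    (2_000_000, 10.0, 100_000),   # high QNT price: few tokens needed
    (500_000, 20.0, 1_200_000),   # slow month plus a big unlock: excess sold
]
for revenue, price, unlocking in months:
    flow, reserves = treasury_month(revenue, price, unlocking,
                                    reserves, 1_000_000, 2_000_000)
    side = "buys" if flow >= 0 else "sells"
    print(f"QNT at {price:>4}: treasury {side} {abs(flow):>9,.0f} QNT, reserves {reserves:,.0f}")
```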
Once assumptions are made, what do we observe?
In an undistorted environment, there is a positive correlation between Quant Network’s QNT-related revenues and the market capitalisation they occupy (i.e. the Quant Network share of the token distribution multiplied by the QNT price). However, this correlation can get heavily twisted as the speculative market prices a premium to the QNT price (i.e. anticipating higher revenues). As we will see, a persistent discount is not really possible as Quant Treasury would mechanically have to step in with large market purchases, which would provide strong support to the QNT price.
In addition, volatility is to be added to the equation since QNT volatility is likely to be (much) higher than that of revenues which can create important year-on-year disparities. For instance, Quant Treasury may lock a lot of tokens at a low price one year, and be well in excess of required tokens the next year if the QNT token price has significantly increased (and vice versa). This is not an issue per se, but this would impact the amount of tokens bought/sold on an ongoing basis by Quant Treasury as reserves inflate/deflate.
If we put aside the distortions created by speculation on the QNT price, and the subsequent impact on the excess/deficiency of Quant Network token reserves (whose level is also pro-actively managed by the company, as previously discussed), the economic system works as follows:
High QNT price vs. Revenue levels: The value of reserves is inflated, fewer tokens need to be bought for the level of revenues generated, Quant Treasury provides low support to the QNT price, its share of the token distribution diminishes.
Low QNT price vs. Revenue levels: Reserves run out, a higher number of tokens needs to be bought for the level of revenues generated, Quant Treasury provides higher support to the QNT price, its share of the token distribution increases.
Summary table:
https://preview.redd.it/q7wgzpv384y31.png?width=2312&format=png&auto=webp&s=d8c0480cb34caf2e59615ec21ea220d81d79b153
The key here is that, whatever speculation on future revenue levels does to the token in the first place, if the QNT price were falling and reaching a level that does not reflect the prevailing revenue levels of Overledger at a given time, then Quant Treasury would require a larger amount of tokens to cover the business needs, which would mean the depletion of their reserves, larger purchases from the market and strong support for the QNT price from there. This is the safety net we want to see, coming from usage! In other words, if the QNT price went very high very quickly, Quant Treasury may not be seen buying many tokens since their reserves would be inflated, BUT that fallback mechanism, purely based on usage, would be there to safeguard QNT holders from the QNT price falling below a certain level.
I would assume this makes sense for most, and you might now wonder why have I highlighted the bottom part about the token distribution in red? That is because there is an ongoing battle between the QNT community and Quant Treasury — and this is very interesting.
The ecosystem will show how big a share is the community willing to let Quant Network represent. The community actually sets the price for the purchases, and the token distribution fluctuates depending on the metrics we discussed. An equilibrium will be formed based on the confidence the market has in Quant Network’s future revenue generation. Moreover, the QNT community could perceive the token as a Store of Value and be happy to hold 80/90% of all tokens for instance, or it could perceive QNT as more dynamic or risky and be happy to only represent 60/70% of the distribution. Needless to say that, considering my previous articles on the potential of Overledger, I think we will tend more towards the former scenario. Indeed, if you wished to store wealth with a technology-agnostic, future proof, globally adopted, revenue-providing (through Gateways) Network of Networks on which most of the digitalised value is flowing through — wouldn’t you see QNT as an appealing value proposition?
In a nutshell, it all comes down to the Overledger revenue levels and the QNT holders' resistance to the buy pressure from Quant Treasury. Therefore, if you are confident in the Overledger revenue generation and wish to see the QNT token price go up, more than ever, do not sell your tokens!
What about the locked tokens? There will always be a certain amount of tokens that are entirely taken out of circulation, but Quant Network company will always keep additional unlocked tokens on top of that (those they receive and manage as buffer) and that means that locked tokens will always be a subset of what Quant Network possesses. I do not know whether fees will primarily be concentrated on the licencing side vs. the usage side, but if that were to be the case then it would be even better as a higher amount of tokens would be taken out of circulation for good.
Finally, as long as the company operates, the revenues will always represent a certain amount of money, whereas this is not the case for profits, which may not appear for years (e.g. during the first years, during an economic/business downturn, etc.). As an illustration, a company like Uber has seen vast increases in revenues since it launched but has never made any profit! Therefore, the demand for the QNT token benefits from good resilience from that perspective.
Quant Network vs. QNT community — What proportion of the QNT distribution will each represent?

How much revenues can Overledger generate?

I suggest we start with the basis of what the Quant Network business is about: connecting networks together, building new-generation hyper-decentralised apps on top (called “mApps”), and creating network effects.
Network effects are best defined by Metcalfe’s law which states: “the effect of a telecommunications network is proportional to the square of the number of connected users of the system” (Source: Wikipedia). This is illustrated by the picture below, which demonstrates the increasing number of possible connections for each new user added to the network. This was also recently discussed in a YouTube podcast by QNT community members “Luke” and “Ghost of St. Miklos” which you can watch here.
Source: applicoinc.com
This means that, as Overledger continues to connect more and more DLTs of all types between themselves and also with legacy systems, the number of users (humans or machines) connected to this Network of Networks will grow substantially — and the number of possible connections between participants will in turn grow exponentially. This will increase the value of the network, and hence the level of fees associated with getting access to it. This forms the basis of expected, future revenue generation and especially in a context where Overledger remains unique as of today and embraced by many of the largest institutions in the world (see the detailed summary on the matter from community member “Seq” here).
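As a quick illustration of why this matters, here is a short Python sketch of Metcalfe's point: user counts grow linearly while the number of possible connections grows roughly with the square of the number of users.

```python
def possible_connections(n_users: int) -> int:
    """Distinct pairwise connections in a network of n users: n * (n - 1) / 2."""
    return n_users * (n_users - 1) // 2

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} users -> {possible_connections(n):>12,} possible connections")
```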
On top of this network, multi-chain hyper-decentralised applications (‘mApps’) can be built — which are an upgrade to existing dApps that use only one chain at a time and hence only benefit from the user base and functionalities of the given chain. Overledger mApps can leverage on the users and abilities of all connected chains at the same time, horizontal scaling, the ability to write/move code in any language across chains as required, write smart contracts on blockchains that do not support them (e.g. Bitcoin), and provide easier connection to other systems. dApps have barely had any success so far, as discussed in my first article, but mApps could provide the market with the necessary tools to build applications that can complement or rival what can be found on the Apple or Google Play store.
Also, the flexibility of Overledger enables Quant Network to target a large number of industries and to connect them all together. A sample of use cases can be found in the following illustration:
https://preview.redd.it/th8edz5b84y31.png?width=2664&format=png&auto=webp&s=105dd4546f8f9ab2c66d1a5a8e9f669cef0e0614
It is to be noted that one of the use cases, namely the tokenisation of the entire world’s assets, represents a market worth hundreds of trillions of USD and that is not even including the huge amount of illiquid assets not currently traded on traditional Capital Markets which could benefit from the tokenisation process. More information on the topic can be found in my previous article fully focused on the potential of Overledger to capture value from the structural shift in the world’s assets and machine-related data/value transfers.
Finally, we can look at what well established companies with a similar technology profile have been able to achieve. Overledger is an Operating System for DLTs and legacy systems on top of which applications can be built. The comparison to Microsoft Windows and the suite of Microsoft Software running on top (e.g. Microsoft Office) is an obvious one from that perspective to gauge the longer term potential.
As you can see below, Microsoft’s flagship softwares such as Windows and Office each generate tens of billions of USD of revenues every year:
Source: Geekwire
We can also look at Oracle, the second largest Enterprise software company in the world:
Source: Statista
We can finally look at what the Apple store and the Google Play store generate, since the Quant Network “mApp store” for the community side of Overledger will look to replicate a similar business model with hyper-decentralised applications:
Source: Worldwide total revenue by app store, 2018 ($bn)
The above means total revenues of around USD 70bn in 2018 for the Apple store and Google Play store combined, and the market is getting bigger year-on-year! Also, again, these (indicative!) reference points for Overledger come in the context of the Community version of the system only, since the Enterprise version represents a separate set of verticals more comparable to the likes of Microsoft and Oracle which we just looked at.

Conclusion

I hope this article helped shed further light on the QNT token and how the various market and business parameters will influence its behavior over time, as the Quant Network business is expected to grow exponentially in the coming years.
In the recent Forbes interview, Quant Network’s CEO (Gilbert Verdian) stated : “Our potential to grow is uncapped as we change and transform industries by creating a secure layer between them at speed. Our vision is to build a mass version of what I call an internet of trust, where value can be securely transferred between global partners not relying on defunct internet security but rather that of blockchain.”.
This is highly encouraging with regards to business prospects and also in comparison to what other companies have been able to achieve since the Web as we know it today emerged (e.g. Microsoft, Google, Apple, etc.). The Internet is now entering a new phase, with DLT technology at its core, and Overledger is set to be at the forefront of this new paradigm which will surely offer a vast array of new opportunities across sectors.
I believe it is an exciting time for all of us to be part of the journey, as long as any financial commitment is made with a good sense of responsibility and understanding of what success comes down to. “Crypto” is still immature in many respects, and the emergence of a dedicated regulatory framework combined with the expected gradual, selective entrance of institutional money managers will hopefully help shed further light and protect retail token holders from the misunderstandings, misinformation and misconduct which too many have suffered from in the last years.
Thanks for your time and interest.
Appendix:
First article: “The reasons why Quant Network (QNT) will rise to the Top of the crypto sphere in the coming months”
Second article: “The potential of Quant Network’s technology to capture value from the structural shift in the World’s assets and machine-related data/value transfers”
October 2019 City AM interview of Gilbert Verdian (CEO): Click here
October 2019 Blockchain Brad interview of Gilbert Verdian (CEO): Click here
July 2019 Blockchain Brad interview of Gilbert Verdian (CEO): Click here
February 2019 Blockchain Brad interview of Gilbert Verdian (CEO): Click here
----
About the original author of the article:
My name is David and I spent years in the Investment Banking industry in London. I hold QNT tokens and the above views are based on my own thoughts and research only. I am not affiliated with the Quant Network team in any way. This is not investment advice, please do your own research and understand what you are buying before doing so. It is also my belief that more than 90% of all other crypto projects will fail because what matters is what is getting adopted; please do not put more money at risk than you can afford to lose.
submitted by mr_sonic to CryptoCurrency [link] [comments]

The exponential growth in the crypto market and our vision

The exponential growth in the crypto market and our vision
Starting in 2009 from absolutely zero, Bitcoin went through and survived a baptism of fire. It was attacked and declared dead hundreds of times, yet each time it rose from the ashes and reached new highs, peaking at tens of thousands of US dollars in 2017. This unprecedented growth, combined with the greed rooted deeply in human nature, induced fickle speculation, short-termism and pump-and-dumps, which effectively impose leverage on the entire market and magnify all price movements. This is largely the reason all cryptos suffer from extreme volatility.
Nevertheless, we as a professional quantitative team in the stock and crypto markets saw abundant investment opportunities amid noisy data with conventional research tools, including but not limited to factor analysis and fundamental valuation. Quantitative research methodologies continue to offer stable returns and strong predictability in our out-of-sample tests and live trading, which is consistent with our backtest result that dates back to 2015.

https://preview.redd.it/49z9z28vhua41.png?width=1200&format=png&auto=webp&s=5dfa30fee0293ab27167d50fb3262e31c268baa7
submitted by CryptoSmartBeta to CryptoCommonwealth [link] [comments]

Why you should invest in OCEAN Protocol

Why I am investing in Ocean Protocol
tl;dr
Unlocking data for AI
Partnered with; Unilever, Roche, Johnson&Johnson, Aviva, MOBI (BMW, Ford, GM)
Currently at $0.03, IEO price $0.12, ICO price $0.2.
Staking coming Q2.
THE PROBLEM
The world has a data problem. The more we create, the more we are forced to entrust it all to fewer data monopolies to profit from.
Data is also siloed, and generally hosted on proprietary databases across vast systems, geographies and business units. Whilst there have been fixes and APIs that have helped improve the sharing of corporate and public data, fundamentally this doesn’t change the fact that client-server architecture and corporate IT networks are inherently designed to prevent data sharing.
Regulation and privacy laws combine to make organisations concerned about sharing data both internally and publicly unless forced to do so. The Health Insurance Portability and Accountability Act (HIPAA) in the US or the Data Protection Act in the UK explicitly state how and what data can and cannot be shared. But these are complicated policies. The technical difficulty of implementing them, combined with bad UX means people err on the side of caution when approaching these issues. There is simply no incentive to outweigh the risk and hassle of sharing data.
Even where sharing is encouraged, current infrastructure makes monetising data through open source licensing complex and equally difficult to enforce. So ultimately, you are left with two options: give your data away for free (which what most individuals do) or hoard it and see if you can make sense of it at some time in the future (which is what most companies do). Neither is very efficient or effective.
The consequence is a few increasingly powerful companies get the vast majority of data at little cost, and large amounts of valuable data are sat dormant in siloed databases.
Simply put, there is no economic incentive to share data. This is a massive issue in the AI market (expected to be worth $70 billion in 2020 according to BoA Merrill).
The best AI techniques today, such as deep learning, need lots (and lots) of quality and relevant datasets to deliver any kind of meaningful value. Starving most new entrants (such as startups and SMEs) of the ability to compete.
AI expertise and talent is expensive and hard to come by, typically concentrating within organisations that already have the data to play with or promise to generate vast quantities of it in the future. Companies like Google, Facebook, Microsoft and Baidu swallow up almost all the best talent and computer science and AI PhDs before they even come onto the jobs market.
This creates a self-propagating cycle, increasingly benefiting a few established organisations who are able to go on to dominate their respective markets, extracting a premium for the priviledge. Think of Facebook & Google in the Ad Market, Amazon for Retail, now imagine that happening across every single industry vertical. Data leads to data network effects, and subsequent AI advantages which are extremely hard to catch up with once the flywheel starts. The way things are going, the driver-less car market will likely consolidate around one single software provider. As old industries like education, healthcare and utilities digitize their operations and start utilizing data, the same will likely happen there too.
The benefits of the 4th Industrial Revolution are in the hands of fewer and fewer organisations.
Currently the expectation is that companies, rather than trying to compete (if they want to stay in business), will concede their data to one of the big tech clouds like Amazon or Microsoft to be able to extract value from it, further extending the suppliers' unfair advantage and increasing their own dependency. Look at autonomous vehicles: German manufacturers unable to compete with Silicon Valley's AIs for self-driving cars could be left simply making the low-value hardware whilst conceding the higher-value (and higher-margin) software to the companies that provide the intelligence controlling the cars.
I’ve always argued companies don’t want Big Data. They want actionable intelligence. But currently most large organisations have vast dumb data in silos that they simply don’t know what to do with.
But what if…
they could securely allow AI developers to run algorithms on it whilst keeping it stored encrypted, on-premise.
And open up every database at a ‘planetary level’ and turn them into a single data marketplace.
Who would own or control it? To be frank, it would require unprecedented levels of trust. Data is generally very sensitive, revealing and something you typically would not want to share with your competitors. In a domain like consumer health, especially, how could that be possible given complex privacy laws?
What’s needed is a decentralised data marketplace to connect AI developers to data owners in a compliant, secure and affordable way. Welcome to Ocean Protocol.
Why decentralised and tokenised?
Primarily because of the need for provenance of IP, affordable payment channels, and the assurance that no single entity becomes a gatekeeper to a hoard of valuable data. Gatekeeper in the sense that it could arbitrarily ban or censor participants; decentralisation also avoids the honeypot hacking problems we encounter in today's centralised world.
But aren’t there already decentralised data market projects?
The Ocean team have focused their design on enabling 'exchange protocols', which creates massive potential for partnerships with other players in the domain. As investors in IOTA, we find how this could work with their Data Marketplace an interesting case in point.
INNOVATIONS
What we like most about Ocean is they have been deploying many of the constituent parts that underpin this marketplace over the last 4 years via a number of initiatives which they are now bringing together into one unified solution:
- digital ownership and attribution
- a high-throughput distributed database to support a high volume of transactions
- scalability, built on proven BigchainDB / IPDB technology for "planetary scale"
- a blockchain-ready, community-driven protocol for intellectual property licensing
What is being added is a protocol and token designed to incentivise and program rules and behaviours into the marketplace, ensuring that relevant, good-quality data is committed, made available and fairly remunerated. The design allows confidential data to be processed for machine learning and aggregated analysis without exposing the raw data itself. Ocean will facilitate bringing the processing algorithms to the data through on-premise compute and, eventually, more advanced techniques such as homomorphic encryption as they mature.
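To make "bringing the algorithm to the data" concrete, here is a minimal sketch in Python of how an on-premise compute step could look: the raw records never leave the data owner's environment, and only the result of an approved, aggregate-only function is returned. The class and function names are illustrative assumptions, not part of the Ocean specification.

```python
# Minimal compute-to-data sketch (illustrative only): raw records stay on-premise;
# only whitelisted, aggregate-only algorithms may run against them, and only the
# aggregate result ever leaves.
from statistics import mean

APPROVED_ALGORITHMS = {"average_systolic_bp"}  # hypothetical whitelist

def average_systolic_bp(records):
    return mean(r["systolic_bp"] for r in records)

class OnPremiseDataset:
    def __init__(self, records):
        self._records = records                # private; never returned to callers

    def run(self, algorithm_name, algorithm):
        if algorithm_name not in APPROVED_ALGORITHMS:
            raise PermissionError("algorithm not approved for this dataset")
        return algorithm(self._records)        # only the aggregate leaves

trial_data = OnPremiseDataset([{"systolic_bp": 128},
                               {"systolic_bp": 142},
                               {"systolic_bp": 117}])
print(trial_data.run("average_systolic_bp", average_systolic_bp))  # 129
```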
OCEAN Token
Think of the Ocean Token as the 'crypto asset' that serves as the commodity of the data economy: it incentivises the mass coordination of resources needed to secure and scale the network and to turn data into actionable intelligence.
If Ocean is about trading data, can’t it use an existing cryptocurrency as its token, like Bitcoin or Ether?
While existing tokens might serve as a means of exchange, the Ocean protocol requires a token of its own because it relies on a specific form of monetary policy and rewards. Users are rewarded with newly minted tokens for providing high-quality, relevant data and keeping it available. This means the protocol requires control over the money supply, which rules out using any existing general-purpose protocol or token. Furthermore, from the perspective of Ocean users, volatility in an uncorrelated token would disrupt the orderly value exchange they want between the various stakeholders in the marketplace.
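As a toy illustration of why protocol-controlled issuance matters, the sketch below mints new tokens directly to contributors as rewards, something a protocol cannot do with a currency whose supply rules it does not control. The account names and amounts are made up.

```python
# Toy ledger with protocol-controlled issuance (illustrative, not Ocean's actual
# implementation): rewards are paid by minting new tokens to contributors.
class TokenLedger:
    def __init__(self):
        self.balances = {}      # account -> token balance
        self.total_supply = 0

    def mint(self, account, amount):
        """Create new tokens and credit them to an account."""
        self.balances[account] = self.balances.get(account, 0) + amount
        self.total_supply += amount

ledger = TokenLedger()
ledger.mint("data_provider_alice", 10)  # reward for keeping a dataset available
ledger.mint("curator_bob", 5)           # reward for a correct quality signal
print(ledger.total_supply, ledger.balances)
```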
OCEAN Data Providers (Supplying Data)
Actors who have data and want to monetise it can make it available through Ocean for a price. When their data is used by Data Consumers, Data Providers receive tokens in return.
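A minimal sketch of that provider/consumer exchange, assuming a simple listing with a fixed price (the dataset ID, account names and amounts are hypothetical):

```python
# Illustrative sketch: a provider lists a dataset at a price, and each purchase
# transfers tokens from the consumer to the provider in exchange for access.
balances = {"provider_charlie": 0, "consumer_dana": 50}
listings = {"trial_results_2020": {"owner": "provider_charlie", "price": 8}}

def purchase(dataset_id, buyer):
    listing = listings[dataset_id]
    if balances[buyer] < listing["price"]:
        raise ValueError("insufficient tokens")
    balances[buyer] -= listing["price"]
    balances[listing["owner"]] += listing["price"]
    return f"access granted to {dataset_id}"

print(purchase("trial_results_2020", "consumer_dana"))
print(balances)  # {'provider_charlie': 8, 'consumer_dana': 42}
```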
OCEAN Data Curators (Quality Control)
An interesting aspect of Ocean is its application of curation markets. Someone needs to decide which data on Ocean is good and which is bad. As Ocean is a decentralised system, there can't be a central committee to do this. Instead, anyone with domain expertise can participate as a Data Curator and earn newly minted tokens by separating the wheat from the chaff. Data Curators put an amount of tokens at stake to signal that a certain dataset is of high quality. Every time they do this correctly, they receive newly minted tokens in return.
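A rough sketch of that curation mechanic, assuming rewards are split pro rata to stake and scale with usage (the exact reward formula here is my own assumption, not a published Ocean rule):

```python
# Illustrative curation-market reward split: curators stake tokens on a dataset
# and, as the dataset gets consumed, newly minted rewards are divided pro rata
# to each curator's stake.
stakes = {"curator_bob": 10, "curator_carol": 30}   # tokens staked on one dataset

def distribute_curation_reward(stakes, reward_per_download, downloads):
    total_stake = sum(stakes.values())
    reward_pool = reward_per_download * downloads
    return {curator: reward_pool * stake / total_stake
            for curator, stake in stakes.items()}

print(distribute_curation_reward(stakes, reward_per_download=2, downloads=5))
# {'curator_bob': 2.5, 'curator_carol': 7.5}
```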
OCEAN Registry of Actors (Keeping Bad Actors Out)
Because Ocean is an open protocol, not only does it need mechanisms to curate data, it needs a mechanism to curate the participants themselves. For this reason a Registry of Actors is part of Ocean, again applying staking of tokens to make good behaviour more economically attractive than bad behaviour.
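A minimal sketch of how such a registry could use staking and slashing, under the assumption that a successful challenge removes the actor and forfeits their stake (the minimum-stake figure and method names are invented for illustration):

```python
# Hypothetical Registry of Actors: joining requires a stake, and a successfully
# challenged (malicious) actor is removed and forfeits that stake.
class RegistryOfActors:
    def __init__(self, minimum_stake):
        self.minimum_stake = minimum_stake
        self.members = {}                       # actor -> staked tokens

    def join(self, actor, stake):
        if stake < self.minimum_stake:
            raise ValueError("stake below the required minimum")
        self.members[actor] = stake

    def slash(self, actor):
        return self.members.pop(actor)          # removed; stake forfeited

registry = RegistryOfActors(minimum_stake=100)
registry.join("eve", 100)
print(registry.slash("eve"))                    # 100 tokens lost by the bad actor
print("eve" in registry.members)                # False
```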
OCEAN Keepers (Making Data Available)
The nodes in the Ocean network are called Keepers. They run the Ocean software and make datasets available to the network. Keepers receive newly minted tokens to perform their function. Data Providers need to use one or more Keepers to offer data to the network.
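One simple way to check that a Keeper still holds a dataset is a hash-based challenge-response, sketched below. Real proof-of-retrievability schemes are considerably more sophisticated; this toy version (where the verifier also needs the data) is only meant to show the shape of the interaction and is not Ocean's actual proof scheme.

```python
# Toy availability challenge: the network sends a random nonce and the Keeper
# must return the hash of (nonce + data), which it can only do if it still
# holds the bytes.
import hashlib, os

def issue_challenge():
    return os.urandom(16)                       # random nonce from the network

def keeper_response(dataset_bytes, nonce):
    return hashlib.sha256(nonce + dataset_bytes).hexdigest()

def verify(expected_dataset_bytes, nonce, response):
    return response == hashlib.sha256(nonce + expected_dataset_bytes).hexdigest()

data = b"clinical-trial-records-v1"
nonce = issue_challenge()
answer = keeper_response(data, nonce)
print(verify(data, nonce, answer))              # True -> Keeper earns its reward
```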
BRINGING IT ALL TOGETHER
Ocean is building a platform to enable a 'global data commons': a platform where anyone can share, and be rewarded for, the data they contribute, with a token and protocol designed specifically to incentivise data sharing and remuneration.
So let’s see that in the context of a single use-case: Clinical Trial Data
Note: this use-case is provided for illustrative purposes only, to give a feel for how Ocean could work in practice. Some of the specifics of the Ocean protocol have yet to be finalised and published in the white paper, and may turn out differently from what is described here.
Bob is a clinical physician with a data science background who uses Ocean. He knows his industry well and has experience judging which types of clinical data are useful in trials. Charlie works at a company that regularly runs medical trials. He has collected a large amount of data for a very specific trial which has now concluded, and he believes it could be valuable to others, but he doesn't know exactly how.
1. Charlie publishes the dataset through Ocean. Judging its value (based on the cost to produce, and therefore to replicate, it) and his confidence in its overall quality, he stakes 5 tokens on it, asserting it as his IP which others must pay to use. He uses one of the Keeper nodes maintained by his company's IT department.
2. Bob, as a Data Curator of clinical trial data on Ocean, is notified of the submission and sees that no one has challenged its ownership. By looking at a sample he decides the data is of good quality, and based on how broad its utility could be he stakes 10 Ocean tokens to back his judgement. Bob is not alone: a number of other Data Curators with good reputations quickly evaluate the data and stake on it as well.
3. Seeing that Charlie's dataset is becoming popular, a number of AI developers purchase it through Ocean. Charlie, Bob and the other curators are rewarded in newly minted tokens, proportional to the amount they staked and the number of downloads.
4. The Keeper node at Charlie's company regularly receives a request to cryptographically prove it still has the data available. Each time it answers correctly, it also receives some newly minted tokens.
5. When Bob and Charlie signed up to Ocean, they staked some tokens to be added to the Registry of Actors. Eve also wants to join and stakes 100 tokens to be added to the Registry.
6. Eve is in fact a malicious actor. She purchases Charlie's dataset through Ocean, claims it is hers, and republishes it under her own account at a slightly lower price. She also creates several "sock puppet" accounts, each with tokens staked to join, to act as Data Curators and vouch for her copy of the dataset.
7. Bob and Charlie discover Eve's malice and successfully challenge her and her sock puppet accounts in the Registry of Actors. Eve and her sock puppets are removed from the Registry, and she loses all of her staked tokens.
APPROACH, TRACTION & TEAM
I am greatly encouraged by the fact that Ocean is aligned with building what we term a Community Token Economy (CTE), in which multiple stakeholders partner early on to bring together complementary skills and assets.
As two existing companies (one already VC backed) they are committing real code and IP already worth several million in value*.
*This is an important point to remember when considering the valuation and token distribution of the offering.
The open, inclusive, transparent nature of the IPDB Foundation bodes well for how Ocean will be run and for how it will solve complex governance issues as the network grows.
I am also impressed by the team's understanding of the importance of building a community. They understand that networks are only as powerful as the communities that support them. This is why they have already signed key partnerships with the XPrize Foundation, SingularityNet, Mattereum, Integration Alpha and the ixo Foundation, as well as agreeing an MOU with the Government of Singapore to provide coverage and indemnification for data-sharing sandboxes.
The team understands that the decentralisation movement is still in its early stages and that collaboration and partnership are a more effective model than competition and going it alone.
PLACE IN THE CONVERGENCE ECOSYSTEM STACK
Ocean Protocol is a fundamental requirement for the Convergence Ecosystem Stack: a protocol that enables a thriving AI data marketplace. It is complementary to our other investments in IOTA and SEED, which provide marketplaces for machine data and bots respectively.
Marketplaces are critical to the development of the Convergence Ecosystem because they enable data-based and tokenised business models that have never before been possible, unlocking new value. Distributed ledgers, blockchains and other decentralisation technologies are powerful tools for authenticating, validating, securing and transporting data; but it is marketplaces that will enable companies to build sustainable businesses and crack open the incumbent data monopolies. IOTA, SEED and now Ocean are unlocking data for more equitable outcomes for users.
submitted by Econcrypt to CryptoMoonShots [link] [comments]

Cryptocurrency Software Market Top-Vendor and Industry Analysis by End-User Segments 2026

The analysis report titled "Cryptocurrency Software Market 2026" presents the current Cryptocurrency Software market scenario, impending future opportunities, revenue growth, pricing and profitability of the industry.
The growth analysis report on "Cryptocurrency Software Market size | Industry Segment by Applications, by Type, Regional Outlook, Market Demand, Latest Trends, Cryptocurrency Software Industry Share & Revenue by Manufacturers, Company Profiles, Growth Forecasts – 2026" analyzes the industry's current market size and its growth in the coming years.
Key Companies Analysis: Binance, Coinbase, Poloniex, LocalBitcoins, BTCC, Bittrex, Kucoin, Bitfinex, Kraken, Cryptopia, and Electroneum
CLICK TO GET SAMPLE REPORT OF CRYPTOCURRENCY SOFTWARE
New vendors in the market are facing tough competition from established international vendors as they struggle with technological innovations, reliability and quality issues. The report will answer questions about the current market developments and the scope of competition, opportunity cost and more.
This report focuses on the global Cryptocurrency Software status, future forecast, growth opportunity, key market and key players. The study objectives are to present the Cryptocurrency Software development in United States, Europe, China, Japan, Southeast Asia, India, Central & South America.
The Cryptocurrency Software market is a comprehensive report which offers a meticulous overview of the market share, size, trends, demand, product analysis, application analysis, regional outlook, competitive strategies, forecasts, and strategies impacting the Cryptocurrency Software Industry. The report includes a detailed analysis of the market competitive landscape, with the help of detailed business profiles, SWOT analysis, project feasibility analysis, and several other details about the key companies operating in the market.
Reasons for Buying this Report:
This Cryptocurrency Software Market report provides pin-point analysis of changing competitive dynamics
It provides a forward-looking perspective on the different factors driving or restraining market growth
It provides a six-year forecast based on how the market is predicted to grow
It helps in understanding the key product segments and their future
It provides pin-point analysis of changing competition dynamics and keeps you ahead of competitors
It helps in making informed business decisions through complete insights into the market and in-depth analysis of market segments
CLICK TO GET MORE ENQUIRY ABOUT THE REPORT (WITH FULL TOC & DESCRIPTION)
Table of Contents:
Global Cryptocurrency Software Market Size, Status and Forecast 2020-2026
Chapter One: Report Overview
Chapter Two: Global Growth Trends
Chapter Three: Market Share by Key Players
Chapter Four: Breakdown Data by Type and Application
Chapter Five: United States
Chapter Six: Europe
Chapter Seven: China
Chapter Eight: Japan
Chapter Nine: India
Chapter Ten: Central & South America
Chapter Eleven: International Players Profiles
Chapter Twelve: Market Forecast 2020-2026
Chapter Thirteen: Analyst’s Viewpoints/Conclusions
submitted by techreport12 to u/techreport12 [link] [comments]

Bitcoin & Cryptocurrency Tax Software Market | Analytical Overview, Growth Factors, Demand and Trends by 2026

The analysis report titled "Bitcoin & Cryptocurrency Tax Software Market 2026" presents the current Bitcoin & Cryptocurrency Tax Software market scenario, impending future opportunities, revenue growth, pricing and profitability of the industry.
The growth analysis report on "Bitcoin & Cryptocurrency Tax Software Market size | Industry Segment by Applications, by Type, Regional Outlook, Market Demand, Latest Trends, Bitcoin & Cryptocurrency Tax Software Industry Share & Revenue by Manufacturers, Company Profiles, Growth Forecasts – 2026" analyzes the industry's current market size and its growth in the coming years.
Key Companies Analysis: CoinTracking, TokenTax, BearTax, CryptoTrader.tax, ZenLedger, Bittax, Node40, HappyTax & More.
CLICK TO GET SAMPLE REPORT OF BITCOIN & CRYPTOCURRENCY TAX SOFTWARE
New vendors in the market are facing tough competition from established international vendors as they struggle with technological innovations, reliability and quality issues. The report will answer questions about the current market developments and the scope of competition, opportunity cost and more.
This report focuses on the global Bitcoin & Cryptocurrency Tax Software status, future forecast, growth opportunity, key market and key players. The study objectives are to present the Bitcoin & Cryptocurrency Tax Software development in United States, Europe, China, Japan, Southeast Asia, India, Central & South America.
The Bitcoin & Cryptocurrency Tax Software market is a comprehensive report which offers a meticulous overview of the market share, size, trends, demand, product analysis, application analysis, regional outlook, competitive strategies, forecasts, and strategies impacting the Bitcoin & Cryptocurrency Tax Software Industry. The report includes a detailed analysis of the market competitive landscape, with the help of detailed business profiles, SWOT analysis, project feasibility analysis, and several other details about the key companies operating in the market.
Reasons for Buying this Report:
This Bitcoin & Cryptocurrency Tax Software Market report provides pin-point analysis of changing competitive dynamics
It provides a forward-looking perspective on the different factors driving or restraining market growth
It provides a six-year forecast based on how the market is predicted to grow
It helps in understanding the key product segments and their future
It provides pin-point analysis of changing competition dynamics and keeps you ahead of competitors
It helps in making informed business decisions through complete insights into the market and in-depth analysis of market segments
CLICK TO GET MORE ENQUIRY ABOUT THE REPORT (WITH FULL TOC & DESCRIPTION)
Table of Contents:
Global Bitcoin & Cryptocurrency Tax Software Market Size, Status and Forecast 2020-2026
Chapter One: Report Overview
Chapter Two: Global Growth Trends
Chapter Three: Market Share by Key Players
Chapter Four: Breakdown Data by Type and Application
Chapter Five: United States
Chapter Six: Europe
Chapter Seven: China
Chapter Eight: Japan
Chapter Nine: India
Chapter Ten: Central & South America
Chapter Eleven: International Players Profiles
Chapter Twelve: Market Forecast 2020-2026
Chapter Thirteen: Analyst’s Viewpoints/Conclusions
submitted by techreport12 to u/techreport12 [link] [comments]

BITCOIN WILL DUMP BEFORE MASSIVE PUMP!!!!? + Secret ...
IS THIS A MASSIVE BEARISH DIVERGENCE OR NOT???? Bitcoin ...
Bitcoin To 13.000$ NEXT Week!!? - ATH Start Of 2021 ...
Bitcoin & Chainlink Price Analysis!!! - (Has The DUMP ...
bitcoin price analysis WILL DO SOMETHING INSANE RIGHT SOON ...

For in-depth analysis, Table 9 shows long-run relations between Bitcoin returns and gold price returns for the whole sample period and for the most and the least volatile sub-periods. The bounds F-statistics show long-run relations, that is cointegration, between Bitcoin returns and changes in gold price returns in some cases. In the whole sample period, cointegration between Bitcoin and gold ...

Bitcoin's TAAR (transaction amount to active addresses ratio) is hovering around 7-month highs, potentially adding a fundamental validation to bitcoin's latest price growth.

Latest Bitcoin price and analysis (BTC to USD) – Oliver Knight, Coin Rivet, 20 October 2020. Bitcoin has enjoyed a barnstorming rally on Tuesday having surged to the $12,000 mark for the first time since September 1. The world's largest cryptocurrency now has a long-awaited bull market within its grasp, it just needs to close a daily candle above the $12,500 of ...

If we want to predict the Bitcoin price of 11840.01 USD we are going to use the data of all cryptocurrencies an hour $(t-1)$ and two hours ago $(t-2)$. Train / Test Sample Split. The same function can be used to split the input data into train and test samples. All we have to do is to decide what fraction of the time-series (counting from their beginning) we are going to use for training of the ...

Bitcoin Price Analysis: BTC decouples from the S&P 500, bulls double it down. Posted by: Investor. Bitcoin's correlation to the US stock market has reduced to zero for the first time since May. The positive momentum may be at risk due to a high level of uncertainty. A sustainable move above $13,250 will take BTC to the moon. Bitcoin (BTC) recovered above $13,000 and hit the intraday high at ...
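The last excerpt gestures at a standard workflow for this kind of prediction: build lagged features from the hourly series (the price one hour ago and two hours ago) and split the series chronologically into train and test samples. Below is a minimal pandas sketch of that idea; the prices, column names and the 75% split fraction are made-up illustrations, not values from the excerpt's source.

```python
# Minimal sketch of lagged features and a chronological train/test split for an
# hourly price series; numbers and column names are illustrative only.
import pandas as pd

prices = pd.DataFrame({"btc_usd": [11801.2, 11815.7, 11828.3, 11840.01, 11852.6, 11839.9]})
prices["btc_usd_lag1"] = prices["btc_usd"].shift(1)   # price one hour ago,  t-1
prices["btc_usd_lag2"] = prices["btc_usd"].shift(2)   # price two hours ago, t-2
prices = prices.dropna()                               # drop rows missing lags

train_fraction = 0.75                                  # earliest 75% used for training
split_at = int(len(prices) * train_fraction)
train, test = prices.iloc[:split_at], prices.iloc[split_at:]
print(train.shape, test.shape)
```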



Welcome to Team Underground, I (Thomas) do weekly BTC price analysis on YouTube. I've been full time trading bitcoin for over a year now and I've decided to ...

🎥: Bitcoin & Chainlink Have Both Seen Big Resistance!!! - Have They Topped out? 🔔: Like, Subscribe & Turn on Notifications 🚩: Join My Trading Group: 👉 https:...

Bitcoin targets, price analysis, news today. Bitcoin and chainlink prediction. Bullish or bearish? 👏 THUMBS UP & SUBSCRIBE NOW 🤑! 🔔Want to j...

I think this is the beginning of the second crash so smaller pumps in bitcoin price can be expected before larger dumps. Watch rings for pivots, resistance a...
