
How should Bitcoin mining be structured? Pavel Moravec of SlushPool (Braiins) and Ryan Ellis of Laurentia Pool join me to chat. We discuss:
- Why pools exist
- How to structure mining pools
- Trade offs around decentralisation of pools
- Coinbase payouts vs other ways
- Lightning payouts
- Scalability for mining pools
- Stratum v2
Links:
- Pavel Moravec: @mor_pav
- Braiins Twitter: @braiins_systems
- Braiins site: Braiins.com
- Laurentia Pool Twitter: @laurentiapool
- Site: LaurentiaPool.org
- Laurentia pool post: Laurentia Pool: Pooled “Decentralization”
- Braiins twitter thread on decentralisation: https://twitter.com/braiins_systems/status/1423291322128011264
Sponsors:
- Swan Bitcoin
- Hodl Hodl Lend
- Compass Mining
- Unchained Capital (code LIVERA)
- CypherSafe (code LIVERA)
- CoinKite.com (code LIVERA)
Stephan Livera links:
- Show notes and website
- Follow me on Twitter @stephanlivera
- Subscribe to the podcast
- Patreon @stephanlivera
Podcast Transcript:
Stephan Livera:
So Pavel and Ryan, welcome to the show.
Ryan Ellis:
Thanks for having me.
Pavel Moravec:
Yeah. Thanks.
Stephan Livera:
So Pavel. I think people know you pretty well. You’ve been on the show a couple of times. Ryan, did you want to just introduce yourself to the listeners?
Ryan Ellis:
Yeah. My name is Ryan Ellis. I’m [the] owner of minefarmbuy.com and laurentiapool.org, and I guess co-operator of Laurentia.
Stephan Livera:
Great. And any details there in terms of your background or at least what you’re comfortable to share?
Ryan Ellis:
Just a lot of logistics background, sales of course. And…
Stephan Livera:
Yeah. And so Pavel, maybe just for listeners who aren’t familiar with you, can you just give us a bit of a background on yourself as well?
Pavel Moravec:
Well, yeah. Mostly computer science background, but lately I spend most of my time trying to run the company, which is Braiins obviously. It’s a completely different position for me, learning a lot of stuff, but I try to stay as close to the tech and programming as I can. We run Slush Pool. We have a different branch in the company building mining firmware, Braiins OS+, and we try to do other R&D stuff in the mining space. So yeah, busy in this area.
Stephan Livera:
Gotcha. Yeah. So the reason we wanted to have this conversation, and is it a debate? Is it a discussion? I think we’re a bit more on the discussion side, but there’s been some discussion about whether mining pools, and mining in general, should be further decentralized, and what ways are feasible to decentralize. So perhaps we could start with how we got here. Maybe Pavel, you could help us explain why mining pools even started in the first place, and why is that even a thing?
Pavel Moravec:
Yeah, it’s a pretty simple idea, right? If you’re trying to mine Bitcoin, your chance of doing so successfully is pretty low if you don’t have a large enough share of the whole hash rate market, let’s say. So the pool solves the problem of you not getting rewards frequently enough, basically spreading the variance and helping you get paid often, even though you don’t have to be the lucky one to find a new block. It’s a group of people joining forces together to mine blocks.
Stephan Livera:
Yeah. And so just for listeners, if you’re totally new: if you were to buy one ASIC machine and plug it in, you might be waiting a very long time if you were just trying to solo mine. That’s part of the idea with pools, as Pavel was just explaining: you can think of it as smoothing out the rewards for you, so that you get more consistent payouts. Now, Ryan, I want to bring you in and get your thoughts on this. Do you agree or disagree with Pavel there, or how would you want to discuss this idea of mining pools?
Ryan Ellis:
Well, as far as pooled mining, it’s really, in my opinion, essential. So I don’t see any disagreement there at all. With regards to a single ASIC, yeah, your reward cadence is very sparse, if that makes sense. I think a new ASIC, even the latest gen, would probably net you a block find within about 15,000 days, maybe less, 13,000 days. So pooled mining is sort of the atmosphere we’re in, not necessarily by choice, but by necessity.
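Ryan’s solo-mining estimate can be sanity-checked with a one-liner. The numbers below are illustrative assumptions (one ~100 TH/s machine against a ~110 EH/s network, roughly the mid-2021 ballpark), not figures quoted in the episode:

```python
# Rough expected time for a single ASIC to find a block while solo mining.
ASIC_HASHRATE = 100e12      # hashes/second for one machine (assumed)
NETWORK_HASHRATE = 110e18   # hashes/second for the whole network (assumed)
BLOCK_INTERVAL_S = 600      # target seconds per block

# Expected blocks until this miner wins one is the inverse of its
# hashrate share; multiply by the block interval to get wall-clock time.
share = ASIC_HASHRATE / NETWORK_HASHRATE
expected_days = (1 / share) * BLOCK_INTERVAL_S / 86400
print(f"~{expected_days:,.0f} days on average")
```

With these assumptions it comes out in the thousands of days, the same order of magnitude as Ryan’s figure; the exact number shifts with network hashrate and machine generation.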
Stephan Livera:
Gotcha. And so then I think the question then is more about how should those pools work, how should they be formed and how should the payouts be done? Because I think that’s where maybe some of the disagreement might lie. So Ryan, do you want to spell out where your disagreement is?
Ryan Ellis:
Well, essentially any pool operator can operate their pool the way they’d like, so I don’t see much argument in that regard. I’m not going to tell Pavel and Slush how to do things, so to speak. When we get further into decentralization and topics like that, we might butt heads a little bit, or a lot, I guess we’ll find out. But as far as Laurentia Pool, the model is really self-custody for miners. It’s definitely not a pool designed for your small home mining community. As much as I would like to be more inclusive, there are still risks involved when you’re mining on score systems like ours, SPLNS, and then I think Slush is on PPLNS, and you can correct me if I’m wrong there, but those are really models and scores driven around actual network mining, versus your flat-rate PPS score systems. And operation-wise, with our coinbase-derived reward we’re limited by firmware to a small user group, and we utilize that to hopefully captivate industrial and enterprise miners.
Stephan Livera:
I see. So Ryan, could you just spell out for listeners, how is Laurentia Pool different from most of the other mining pools out there? What’s the differentiating factor there?
Ryan Ellis:
Well, I think there may be another pool that operates with a coinbase-derived payout, but that is the biggest catalyst: receiving your mining reward directly from the network and reducing any intermediaries. Focusing on self-custody and ownership was just very important to me, to collaborate with Con and deliver something to market that we thought would be of high interest.
Stephan Livera:
I see. And Con is your partner in the mining operation, or in the operation of Laurentia Pool?
Ryan Ellis:
Con has been around a while. I’m not really going to speak for him, so to speak. He co-authored CGMiner originally, and he’s had his own pools, CKPool and CK Solo, all based around more of a private and direct payout, so non-custodial environments.
Stephan Livera:
I see. And so perhaps it’s time to go back to Pavel and get your explanation: how does Slush Pool do mining, and then the mining payouts, if you could just outline a little bit of that?
Pavel Moravec:
Yeah. What a pool typically does, as we discussed, is it groups other people’s hash rate. And then when a block is found, it is typically found to an address controlled by the pool operator. During the mining process, a lot of data is collected about the performance of various miners, based on the hash rate provided to the group. So the pool rewards the miners a certain portion of the found block. There are different scoring systems for how you calculate what you deserve based on your hash rate over time, and various other parameters, but mostly every scoring system tries to somehow be the correct one. Once the amount you deserve is known, the pool creates a transaction and sends the money to you. There are various mechanisms to prevent transaction dust and to not send you too many transactions, because you would have to, for example, pay a lot of fees for spending a lot of outputs.
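The core idea Pavel describes, splitting a found block among miners in proportion to the hash rate (shares) they contributed, can be sketched in a few lines. This is a minimal proportional split, not the exact scoring used by Slush Pool or Laurentia (real systems like PPLNS weight shares over a window); the function name and fee value are illustrative assumptions:

```python
def split_reward(shares_by_miner, block_reward_sats, pool_fee_percent=2):
    """Pay each miner in proportion to shares submitted for this block.

    Integer sat arithmetic throughout, to avoid float rounding on money.
    """
    total = sum(shares_by_miner.values())
    payable = block_reward_sats * (100 - pool_fee_percent) // 100
    return {m: payable * s // total for m, s in shares_by_miner.items()}

# Hypothetical miners splitting one 6.25 BTC block reward (in sats):
payouts = split_reward({"alice": 700, "bob": 200, "carol": 100},
                       625_000_000)
print(payouts)
```

A PPLNS-style scheme differs mainly in that `shares_by_miner` counts only the last N shares before the block, rather than all shares since the previous one.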
Pavel Moravec:
So typically a miner can set up rules for what time, what frequency, and what amounts the [inaudible] should be done. In Slush Pool, for example, you can say, hey, I want payouts when I’ve mined 0.1 Bitcoin, and then send it to me, or do it daily, or some other options, which give you the Bitcoin in more concise transaction outputs. There is obviously a drawback in it, which is that the pool operator holds the coins you deserve for some time. For bigger miners it can be within an hour or so: whenever a block is found, your reward is big enough and a transaction can go out, because it’s substantial. In the case of small miners, and that is still the vast majority of our miners, it can take weeks, even months, before there is a reasonable amount to be paid out. Then obviously there is some risk associated with it, but it is a way to prevent some technical issues with very small transactions.
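The threshold batching Pavel describes is simple bookkeeping: balances accrue per miner, and an output is only queued once that miner’s configured threshold is reached. A minimal sketch, with field names and the 0.1 BTC default as illustrative assumptions rather than any pool’s actual API:

```python
def queue_payouts(balances_sats, thresholds_sats):
    """Return (outputs to pay now, remaining balances still accruing)."""
    outputs, remaining = [], {}
    for miner, bal in balances_sats.items():
        if bal >= thresholds_sats.get(miner, 10_000_000):  # default 0.1 BTC
            outputs.append((miner, bal))   # big enough: pay it out
            remaining[miner] = 0
        else:
            remaining[miner] = bal         # keep accruing, avoids dust outputs
    return outputs, remaining

outs, rem = queue_payouts({"a": 12_000_000, "b": 500_000},
                          {"a": 10_000_000, "b": 10_000_000})
print(outs, rem)
```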
Stephan Livera:
Excellent. So to put it in other words, I guess the crucial difference here is that Slush Pool is custodial for a small period of time, or in the case of a small miner it might be for a longer period of time, until they’ve accrued enough to reach that threshold and actually get the payout. So I guess that’s the key difference as I’m understanding it, and you guys correct me if I’m wrong there. And I guess another interesting question people might be thinking of is… I’m sorry, go on, Ryan.
Ryan Ellis:
Oh yeah, I was just going to interject. So our coinbase-derived payout is scored and weighted on share quality and quantity, and then at every block find it is dispersed to each miner individually at the coinbase level.
Stephan Livera:
I see. Yeah. And so that’s part of that trade-off, which we’ll get to. And perhaps you could just explain the part around preventing dust. Why is preventing dust a good idea?
Pavel Moravec:
Every transaction has some resources associated with it. If you want to send a Bitcoin transaction, it needs to be [inaudible] to the blockchain. And let’s not discuss layer 2 payout schemes, which are completely outside of this; we investigated this before, but it’s a completely different story. So if you want to put the payout directly on the blockchain, it has some cost, it takes block space. You don’t want to be allocating that space for very small amounts, because otherwise nobody would be able to send a transaction, the fees would go through the roof, and the transaction fees associated with the payout could easily be of the size of the [inaudible] itself. So you want to prevent this. And as a pool operator, you need to consider the cost associated with sending the coins, because it is either direct, as every custodial pool operator knows, or indirect in terms of reserving space in [inaudible], because the bytes in blocks are just bytes in blocks, and they are limited. So sending very small transactions has a price or cost associated with it.
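A quick back-of-envelope check makes Pavel’s point concrete: the fee to later spend a tiny output can rival the output’s value. The sizes below are typical estimates (a P2WPKH input is around 68 vbytes), and the feerate is an assumed busy-mempool number, not data from the episode:

```python
def spend_cost_sats(feerate_sat_vb, input_vbytes=68):
    """Approximate fee to spend one P2WPKH input at a given feerate."""
    return feerate_sat_vb * input_vbytes

payout = 2_000           # a hypothetical 2,000-sat pool payout
feerate = 30             # sats per vbyte, assumed congested-mempool rate

cost = spend_cost_sats(feerate)
print(f"spending fee ~{cost} sats vs payout {payout} sats")
# At this feerate the fee alone exceeds the entire payout, so paying
# out this amount on-chain destroys value for the recipient.
assert cost > payout
```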
Stephan Livera:
Yeah. And essentially it would be bad as well from the perspective of the person receiving lots of small amounts of coins, UTXOs, because later when they go to spend them, it’s going to be very costly for them. So that can be quite a costly and maybe not very scalable approach. But I’m wondering as well, Ryan, what’s your philosophy and thinking on that? Are you essentially getting around that by saying it should definitely be for larger miners who are doing the Laurentia approach, or do you have a different answer on that?
Ryan Ellis:
Well, as far as dust, I mean, there’s really no argument there. Essentially, using block space efficiently is, I think, critical. Sending a dust transaction can really be impossible depending on your sats per byte, so to speak. With Laurentia Pool, since we’re limited, including the pool fee, to 20 coinbase-derived payouts, or outputs, those will be issued by the blockchain and spendable within a hundred confirmations, as standard for the network. And with the limit on our pool, a user miner is not likely to receive a dust payout, based on the breakdown. But again, maybe…
Pavel Moravec:
I have a question for you, because I’m not sure I understand how exactly your pool is operating. So you somehow limit the number of outputs you’re putting into the coinbase, to prevent dust, for example?
Ryan Ellis:
No, essentially firmware is the limit. So ASIC firmware, depending on the manufacturer, will vary in its ability to handle multiple coinbase outputs, and that’s the position we’re in. A few years ago Con had his own group pool, not a solo pool, and you would literally see postponed payments, which would be your dust payments, on his pool there. So with our pool there’s not enough user count to create dust, unless there was maybe a very small miner contributing to the pool against a larger set of, I guess, more robust industrial or enterprise miners.
Pavel Moravec:
Why do you think it’s the case that the firmware is preventing the scheme? Because we know something about firmware, and we definitely don’t know all the details of how stock firmware works, for example, but typically it doesn’t have any limit on the number of outputs in the coinbase, because that information isn’t available to the miners, right?
Ryan Ellis:
Yeah. So as far as the developers changing the code, I guess the argument would be to make it more efficient, but in that regard it limits pools such as Laurentia Pool, or Con’s prior group pool, from allowing these ASICs to participate on their rules with multiple coinbase outputs.
Stephan Livera:
Are you able to comment, are there any in particular that you could name, or is it a general comment?
Ryan Ellis:
The biggest bottleneck right now happens to be Antminer, being one of the more popular manufacturers, and they’re the catalyst for the limit on our coinbase. So with Antminer we’re only allowed 20 outputs. Two are reserved for the pool operators, myself and Con, and then we have 18 open for users and miners.
Stephan Livera:
As in there are 18 potential pool users, after accounting for yourself and Con?
Ryan Ellis:
Correct. Yeah. So with that, as far as block space, we’re not really hauling away a lot of space for such a small group. And as far as filling our blocks, we, like most pools, I imagine, take the highest fees possible, whether that’s a small transaction with a high fee or a large transaction with a low fee but obviously larger in size because of the quantity of coins being moved. So we’re still able to compete for a complete network block subsidy with other groups.
Pavel Moravec:
It’s pretty common that miners are [inaudible] to include as many transactions as possible, where the optimization function is to get as many transaction fees into the block as possible so that the payouts will be the largest. It’s a pretty common strategy, and I can’t imagine an economically rational miner not doing this, which probably most of the miners are, right? Maybe one comment on the firmware limitation. I don’t think it is intentional myself. We studied a lot of the available code bases for firmwares before, because obviously you try to get as much information as possible, and it is just stupid firmware. It is just stupid code with buffers predetermined beforehand. And the coinbase transaction is just a bunch of random, opaque bytes; there is no parsing needed in the firmware. It’s just: give me a bunch of bytes.
Pavel Moravec:
Then some space you can play with to mine more efficiently, and then another bunch of bytes. This is sent to the firmware every single time a new job is offered by the pool, which is roughly every minute or every 30 seconds. All these bytes are just smashed together, with some random [inaudible] put in, and mined. The firmware just needs to do some byte manipulation, and the limitation, in my opinion, is just based on an assumption that the coinbase transaction has some maximum size. That assumption in the firmware is probably not correct, because for a [inaudible] transaction in Bitcoin there are some limitations which are hard, but the buffers in firmware are not big enough for it. And we can speculate, obviously, about whether it’s intentional or just crappy software.
Pavel Moravec:
And my strong opinion would be it’s just crappy software, because all the rest of the code base is basically: put the monkey behind the computer, and once it’s doing something, let’s release it. And it works, so it gets copied to the next generation, and the next generation, and the next. Unfortunately, pools who try to do these things are hit by this limitation, even though the transaction can in principle be larger, and unfortunately it then gets forced on miners.
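Pavel’s description of the firmware treating the coinbase as opaque bytes matches the standard Stratum v1 flow: the pool sends the coinbase in two halves (`coinb1`, `coinb2` in the `mining.notify` message) and the miner only splices its extranonce between them before hashing. It never parses the outputs, so any output-count cap is a buffer-size assumption, not something the protocol requires. A minimal sketch of that splice-and-hash step (the hex values are made-up placeholders, not real pool data):

```python
import hashlib

def coinbase_hash(coinb1_hex, extranonce1_hex, extranonce2_hex, coinb2_hex):
    """Assemble a Stratum v1 coinbase and double-SHA256 it.

    The result feeds into the merkle-root computation for the block header.
    Note: no transaction parsing happens here, just byte concatenation.
    """
    raw = bytes.fromhex(coinb1_hex + extranonce1_hex +
                        extranonce2_hex + coinb2_hex)
    return hashlib.sha256(hashlib.sha256(raw).digest()).digest()

# Placeholder hex fragments standing in for real mining.notify fields:
h = coinbase_hash("01000000" + "00" * 20, "deadbeef", "00000001", "ff" * 30)
print(h.hex())
```

Because the miner only sees these opaque halves, a pool can in principle put as many payout outputs inside `coinb2` as consensus allows; a firmware that rejects large jobs is simply running out of a fixed-size buffer.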
Ryan Ellis:
Well, I think intentional is probably the correct term. I don’t know. I mean, you could speculate whether it is disingenuous or, you know, like an attack on Bitcoin. I don’t know if I would go that far, but most of the ASIC manufacturers operate their own pools, and the consensus in the pooled environment is typically a single, maybe a couple of, coinbase payouts to the pool. So anything larger than that is probably just deemed not essential. That’s mostly my light interpretation of why these changes have taken place.
Pavel Moravec:
I don’t have a solution for you, I just thought about the limitation right now, but you could do a trick like this. You can put one output in the coinbase, keep it very small, to some address you control, and then in the same block, immediately after the coinbase, you can put any number of payout transactions as you wish. So you would decompose the first transaction immediately in the same block, and then you can use all the transaction space in the block and never hit the problem of too large a coinbase transaction. The problem then is that the miners will not see the payouts directly in the coinbase, but all of it will be visible in the block after that. So you could sidestep the problem of the coinbase size limitation and still not do any custodial thing.
Pavel Moravec:
If that’s not clear: the one transaction moves to address A, and immediately after, that A moves to one payout, and to a second payout. It’s slightly more complicated on the server side, but still doable. Maybe you could then support more than 18 users this way, because in principle I kind of like the idea of not holding the coins for a long period of time. There are very strong technical limitations associated with it, and not all miners really need it or want it, but there is some space, some people who have this preference, and it’s great if there is some offering for them. So, you know, maybe you could try this thing. I don’t see why it shouldn’t work, but maybe I’m missing something.
Ryan Ellis:
Yeah. I’m not… Con would be the more technical person to discuss a lot of those characteristics, being so familiar, obviously, with the code he’s developed. So I think for us, having the coinbase transaction derived from the block reward is probably just the easiest solution to deliver the product we’re looking to bring to market, as far as self-custody and direct payments without any intermediaries or custodial solutions. Which also helps us as far as managing our own organization: we don’t have to manage payouts after the fact, everything is done by the score, and then [inaudible]
Pavel Moravec:
Yeah, but you kind of have to, because somebody has to define how the coinbase transaction should look. So you are actually generating the [inaudible] transaction through the block, so you’re doing the [inaudible]. The obvious pro for the miner is that the coins are never controlled by you once they are mined, but yeah, the outputs are still generated by your system, so you are doing the payouts in some way… the scoring. Yeah.
Stephan Livera:
And if I could ask then, with Laurentia Pool, is the idea that you’re just openly saying, we’ll have a smaller pool in terms of number of users, and in doing so we’ll be non-custodial? That’s one of the trade-offs, you get what I’m saying? It’s in your custody for less time because you get that direct payout, but the trade-off is you just can’t have as many users on the pool as, let’s say, Slush Pool, which could have many, many more users. Right?
Ryan Ellis:
Correct. Yeah. So again, in the past Con was able to have thousands of users on his group pool, before the firmware changes by that specific manufacturer we mentioned. It’s disappointing, but for me, on the operational side, it actually makes my job a lot easier to manage a smaller group. So I’m not going to complain too much about it, but yeah.
Pavel Moravec:
Lucky you.
Ryan Ellis:
Yeah, absolutely. Besides having to field the constant inquiries from people bringing in their hundred-terahash ASIC and asking when they can expect a payout. So us having the password and having our server locked alleviates that issue at the gate. You know, it kind of sucks to be a gatekeeper in that aspect, because it’d be nice to have an open pool that people could join and leave freely, but that poses more problems as far as keeping rewards consistent for the group as well. So for me, operationally, it’s streamlined, and we’re able to secure our server. And when we do finally get hash rate on there, [inaudible] that all the users or peers are willing to accept them.
Stephan Livera:
So Ryan, can I ask then, are you proposing that more and more people should be using this kind of model? Or do you just see it as a niche model? I guess the question I’m asking is: how do you see this idea scaling to more and more people, if each time you run this kind of pool it’s a maximum of 20 people?
Ryan Ellis:
Well, it kind of goes to our philosophy of pool dispersion. I’m of the mindset that decentralization in a pool environment is sort of unachievable, because you’re looking at centralized servers, maybe outside of P2Pool. So for us, we can spin up a server in Australia, we can put one up in the EU, and have segregated groups if there’s demand for it. So if we find this successful and people want, I guess, quote, this product, unquote, we’re happy to deliver that and continue to.
Pavel Moravec:
Yeah. But then there is an inherent problem with pooled mining and decentralization, because the whole key point of a pool is working as a group to limit the variance. We can try to cut the problem into several pieces, but there are basically two sides to the centralization problem. One is money handling: who controls the coins and how, can we double-check it, and can we prevent issues related to money handling. To a certain extent this is solved by payouts in the coinbase, though you can argue that the centralized server still makes the decision about who gets the coins. You can maybe detect possible fraud slightly faster, but normally there are not so many problems with getting money out of a pool. So this is one part which is centralized: who pays and who controls the coins, in any way.
Pavel Moravec:
And the second part is who decides what block is being mined, right? Who chooses the transactions in the block, who constructs the block as a new candidate for extending the blockchain. And this is arguably the more important part of the whole centralization versus decentralization topic, because how miners are paid for their work is a transactional thing, a momentary thing: once they are paid, it’s over. But the blockchain, as a set of blocks and transactions, is kept for a very long time, and preventing somebody from using the blockchain infrastructure, for example by transaction censoring or things like that, has a direct impact on the whole ecosystem. It’s not like a mining problem where me as a miner and you as a pool have an agreement that you’re paying me properly, and we can sort it out by whatever agreement we make, and I can enforce it by law or whatever.
Pavel Moravec:
That is just part of the mining industry and doesn’t expand to the whole Bitcoin space so much, but block construction or transaction selection influences the whole Bitcoin space, or at least much more than the first part. And it is an even bigger and more complicated problem to solve, because it goes directly against the idea that we are behaving as one miner together in the pool. Once we want to address this second problem of choosing the transactions being mined, it can be done, but it’s technically even more complicated than handling the money part. And this is something we investigated in the Stratum V2 protocol, which doesn’t address the money part, because we don’t think it’s the crucial thing, and it has so many technical problems that it’s not worth working on, because it just works even today, we think. But the block construction or transaction selection part is not addressed at all by any usable solution we’re aware of. And yeah, I’m not claiming what we propose in V2 is a perfect, ideal solution, but it’s still the best effort we’re aware of.
Ryan Ellis:
Yeah. For Stratum V2, I can agree that it helps with the decentralization of the transaction selection. But still, I think at the point where you are collaborating with a pool, the miners with the highest hash rate are going to have the probability in their favor to have their templates propagated to the chain. So that in itself is still a little centralized, but it’s definitely, in my opinion, a good move away from pools who shadow mine and do, I think, devious things like that in the pool environment that we see.
Pavel Moravec:
Yeah, yeah, sure. But in an ideal world, when all miners are using Stratum V2, which I would like to see some time, though I don’t think it will happen so broadly, in that ideal world, when everybody uses transaction selection in V2, the block distribution would be the same as if everybody were solo mining. I cannot imagine it being better. If somebody has a huge data center full of ASICs and he solo mines, then obviously he has a larger influence on what blocks will be mined. It’s pretty direct. There is no way around that, and nobody wants to change it, I think, because it’s just your right to choose whatever you want to mine: you have the hardware, you made the investment, you make your decision. So by using V2, we could get as close as possible to this ideal, while still being able to solve the money distribution and variance smoothing problems. Because on the money side you pay with some level of centralization, since somebody is providing the service of smoothing the variance. But it’s a trade-off, obviously.
Ryan Ellis:
That’s where scores like PPS sort of attract miners as well. Getting a flat fee for your hash rate looks, in my opinion, good on paper. But again, over time all block rewards should really even out, and with accelerated fees on pools that run PPS, you know, maybe there are some backdoor deals or whatever, but essentially miners are losing revenue over the course of a long period of time for that short-term certainty.
Stephan Livera:
So can we just dive into that a little bit? It sounds to me like you have a fundamental disagreement with the way pay-per-share works. What’s your issue with that, and what’s the alternative you’re proposing?
Ryan Ellis:
Well, I think the alternative is really what we’re doing with Laurentia Pool. And I think I made the point that a miner has the ability to choose. That to me is more important than anything else: having the ability to look at all the pools in the ecosystem and decide what they’re after specifically. So as far as my interpretation of pay-per-share being less ideal over time, that’s really just my interpretation of scoring and who you’re mining with. So for me…
Stephan Livera:
Can we just dive into that a little further? You’re saying it’s your interpretation of the scoring. So what is it about the scoring of PPS that you don’t like, just so we can try to understand where the actual disagreement is?
Ryan Ellis:
Well, there’s a lot of things a pool can do with your hash rate. So if they’re just offering you pay-per-share, you might not necessarily know what chain you’re mining at a given time. Granted, I think some pools are more transparent than others. And some people do prefer to profit-switch, which I think is really a net loss for them as far as working proof-of-work algorithms, because moving from one chain to another, you’re essentially disrupting your probability of a [inaudible] find, at least that’s my interpretation of it, and in doing so you’re less likely to capture a consistent reward. So to me, the pool takes on extra risk with the PPS score, and for that they typically charge a higher fee, which again is up to the pool operator and the willingness of the user to accept that.
Stephan Livera:
So Pavel. Do you want to jump in here with a comment or a question?
Pavel Moravec:
My comment would be, there are a lot of rewarding schemes out there, and obviously people can choose whatever pool they want. But the fee associated with the provided service, especially for PPS pools, is paid because the randomness of rewards causes operational troubles. It also has economic value to get the money regularly: you can plan much better than if your income this month is, I don’t know, 10% of next month’s, which can easily happen. So PPS schemes especially are adding value for the miner. Obviously somebody can say, this is not important for me, because my situation is such that I don’t need it. But for most of the miners, especially the ones who make large investments or borrow money, the regular payouts are super critical. Running a mining operation without knowing that by running your miners you will be paid, and in what amount, is just crazy for more economically sensitive people. If it’s your own money and your paid-off miner, then yeah, obviously in the long term you can solo mine, there is no fee associated, and it should work out. But yeah, it’s basically playing dice.
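The PPS idea Pavel describes can be written down directly: each share has a fixed expected value regardless of pool luck, so the pool absorbs the variance and charges a fee for doing so. This is the standard PPS expectation formula as a sketch; the fee, reward, and difficulty numbers are illustrative assumptions:

```python
def pps_value_sats(share_difficulty, network_difficulty,
                   block_reward_sats=625_000_000, fee=0.04):
    """Expected sats a share of given difficulty earns under pay-per-share.

    A share of difficulty d represents d / D of the expected work behind a
    block at network difficulty D, so it earns that fraction of the reward,
    minus the pool fee that pays for absorbing the luck variance.
    """
    return block_reward_sats * (1 - fee) * share_difficulty / network_difficulty

# A difficulty-100,000 share against an assumed network difficulty of ~15T:
v = pps_value_sats(100_000, 15e12)
print(f"~{v:.4f} sats per share")
```

Notice the block-finding luck of the pool appears nowhere in the formula; that is exactly the risk transfer the PPS fee is pricing.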
Stephan Livera:
Yeah. So I think the way to think of it is it’s similar to the variability smoothing aspect, right? It’s like if you were a worker working for your boss, and that boss is taking on the risk that he might not make a profitable product, but he’s paying you regardless. And I guess it’s similar here, because if I’m a small miner with only a few ASICs, then it really makes sense for me to use that model, because if not, I’m taking on the full risk myself. So I guess, Ryan, maybe what you were saying is more like we should have this more decentralized idea of smaller pools, but each of those users is bigger, such that they can maybe handle the variability and deal with it themselves. But that is part of the trade-off, because as a mining operator you’ve got electricity to pay, you might be paying for rack space, staff, various other admin and operating expenses, or buying new equipment. So would you agree with that summary, Ryan, or do you disagree with the way I tried to summarize it?
Ryan Ellis:
I have no disagreement with that at all. As far as, you know, score in relation to that, it’s really up to the user to figure out the importance. Looking at industrial or enterprise type miners, some have boards of directors, others are just a group of guys that are in a, you know, company or maybe a collective or something like that. And they work hard to do what you guys described: secure their equipment, secure their power rates. You know, you’ve got to pay electricity and internet and any other bills associated with most businesses as well. So having a regular payout system, I think, is captivating, I won’t disagree with that. But I think the direction we’re pursuing with Laurentia is really getting back to actual network mining, even if that is solo, PPLNS, and SPLNS.
Ryan Ellis:
I think those score rates are very, I guess I’m lacking some of the verbiage I’m looking for, but those score rates are intuitive to how the Bitcoin network operates, you know, propagating block finds and the probability of them. So when people, I think I had somebody talk about a large gap in a block find for a pool, and I was just saying, well, how many blocks did they find previously? And so we went through their history and looked at their hash rate, and basically saw that they were, quote-unquote, ahead of schedule. So the probability that they’re going to have a 12 hour gap between block finds, it’s just math. You can expect to have your reward within X amount of time.
Stephan Livera:
Okay. So Ryan, can you give us an idea, obviously things move around and so on, but as we speak today, August 2021, roughly how big does a miner need to be for it to make sense for them to join Laurentia Pool, as opposed to, you know, Slush Pool or some other pool?
Ryan Ellis:
Well, based on our organization, it really depends on the other peers in our pool. So looking at the other users, let’s say, I think right now our commitment total is about 40 petahash. So if we were to get a large miner come in, obviously the average we would need to hit a specific cadence would change, for the benefit of the pool. And then the reverse would be true: a smaller commitment would increase the average needed for the remaining open spaces. As far as organizing the group, what I look to achieve is entry at a specific cadence that works for people. So looking at like a five day, ten day cadence, specifically around the difficulty adjustments, is what I’m looking for to bring a group in. I don’t want to bring people in and not be able to set that expectation for them, and then, you know, still provide the caveat that there could be a gap. Let’s say we had 700 petahash online, we could approximate maybe a block find per day. So if that cadence sort of extended, we could speculate and calculate that we’ll be due for a block soon, and then likely our next block would come quicker because of the gap and the probability.
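The cadence arithmetic here can be checked in a couple of lines. The network hashrate below is an assumed mid-2021 ballpark, not a figure quoted in the episode:

```python
NETWORK_EH = 110.0     # assumed total network hashrate, EH/s (mid-2021 ballpark)
POOL_PH = 700.0        # pool hashrate from the example, PH/s
BLOCKS_PER_DAY = 144   # one block per ~10 minutes on average

share = POOL_PH / (NETWORK_EH * 1000)   # pool's fraction of network hashrate
blocks_per_day = share * BLOCKS_PER_DAY
hours_per_block = 24 / blocks_per_day

print(f"pool share of network:    {share:.4%}")
print(f"expected blocks per day:  {blocks_per_day:.2f}")
print(f"expected hours per block: {hours_per_block:.1f}")
```

At these assumed numbers, 700 PH/s works out to a little under one expected block per day, which matches the cadence Ryan sketches.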
Stephan Livera:
Okay. And so in your vision then, is it that there’d be maybe more people who are, let’s say, ideologically motivated to, quote-unquote, maintain the decentralization of mining, such that there’d be lots of Laurentia-style pools out there, I guess more, smaller pools? I’m trying to understand your view: is that essentially the idea you’re going for?
Ryan Ellis:
You’re close there. So again, pool dispersion, I think, will help decentralization, but not if it’s us using Con’s code over and over again. I don’t see that as really, you know, decentralizing when we look at transaction selection and things like that, because we’re going to pick the highest fees possible. Likely, if we put our code on another server, the block templates are going to be similar or very similar. So as far as…
Pavel Moravec:
But is it a problem? I mean, let’s imagine that there are some fees in the mempool, right? Because people are just trying to send money, and if transaction propagation in the Bitcoin network is efficient, then everybody in the world should basically see all the same transactions. Because once you send your transaction to the network, the nodes are just relaying the transaction to everybody. So all the miners, modulo private transactions, are seeing the same space. So it should be very natural that if you are trying to optimize your output, the value of the block, because it’s economically making sense, then all miners should make basically the same decisions. Everybody should build the same block. So a difference between one block and another block for different pools doesn’t make a lot of economic sense; the incentive for doing this is not there. The key concept, I think, here is: you don’t want one person to control and force everybody to use this block, but you can collectively decide, this is the best block for me, because if I am the person who is finding it, then I’m getting the money and I’m maximizing my output, and you can make the same decision. And if you win, it’s your money. If I win, it’s my money. But we can mine the same block. Everybody can do it, and it’s perfectly fine.
Ryan Ellis:
Right, perfectly fine. But yeah, my point was towards decentralization of transaction selection. So I guess the assumption with Stratum V2 is the miners themselves get to choose, versus the pool. So us sort of dispersing pools all over, I don’t think that’s really a decentralization directive for us. So we don’t really market ourselves as a decentralized pool; mostly moving hash rate from these larger pools that have 19 [inaudible] is sort of our goal there.
Stephan Livera:
Yeah, I mean, it sounds a bit interesting. To me it’s not quite clear, because it sounds like you’re saying you’re not necessarily pro decentralization of pools, but that you want essentially fewer large pools and more small ones?
Ryan Ellis:
I would prefer it. It’s really, again, that’s my preference, to see better distribution of hash rate globally. But again, for us to distribute pools alone would be, again, not very decentralized. So looking at all the other factors…
Pavel Moravec:
Yeah, because you could be the next big centralized entity, just running 200 servers and deciding, by having a drink together, to say, hey, tomorrow we are going to censor this company’s transactions. And nobody would be happy with that. Obviously…
Ryan Ellis:
Well, we wouldn’t do that, but yeah…
Pavel Moravec:
But the point is you want to get into a state where this even theoretically cannot happen, or we are as close as possible to not be theoretically possible. And the problem is…
Ryan Ellis:
As far as decentralizing hash rate, to me, it’s going back to mining nodes, so putting that into Core itself again. So whether you’re just using your low power CPU, there’s thousands of nodes on our planet that are active and participating. So to have each one of them be a mining node, in my opinion, would be the best for decentralization. Is it going to be profitable? No. Are you going to find a block eventually? Maybe, but the probability is very minute.
Stephan Livera:
Right. But I guess, to the point Pavel was making earlier, it’s theoretically possible that, say, Laurentia Pool has 20 miners on there, and those 20 miners could collectively account for 40% of the hash rate. Theoretically it’s possible, right? Or 60% of the hash rate, you know what I mean? Like it’s theoretically…
Ryan Ellis:
And that’s sort of, I mean, we’re competition driven internally. So 18 peers fighting, essentially, in a ring. Think of it as like a WWE, yeah, mayhem or whatever they call ’em, those matches, for the portion of block reward.
Stephan Livera:
Yeah, I see. And so let’s talk a little bit about the scalability aspects, because I know the Braiins and Slush Pool team put out some tweets and a post talking about some of the scalability aspects of this, saying, well, what happens if the coinbase transaction gets too large? It might not work for the smallest miners, or perhaps, if that coinbase transaction was really large, it’s the first one in every block, so that means there’s less space for non-miner transactions. So I’m wondering, maybe Pavel, do you want to just touch on some of those aspects around scalability and the contrast in the approaches?
Pavel Moravec:
Yeah, sure. But we kind of…
Stephan Livera:
Addressed some of it, yeah.
Pavel Moravec:
We went through some of the topic before. It all depends on the size of the transaction, like the amount you’re sending. If you are forced to do the payouts in a very frequent manner, meaning maybe every single block you’re mining, and there are a lot of users, which is not the case for Laurentia Pool, but yeah, theoretically, if you would like to extend it to a larger number of people, then you would be forced to use, I don’t know, a thousand outputs in every single block. And it would just be wasting space, because you could otherwise use the same bytes for other people’s transactions and be paid for doing so, because this is your main duty as a miner: you’re helping other people’s transactions get into the blockchain, and you’re rewarded for doing so. So yeah, putting in a lot of outputs this way, if it’s bigger than reasonable, is just a pure loss, I think.
Pavel Moravec:
And then some strategy of joining the transactions together, maybe by just sending the payouts less frequently for one single miner, is a way to reduce the allocated space in the block. And then obviously there is one aspect which is pretty technical, and it is: you as a pool are sending the coinbase to every single miner whenever you’re updating the job being mined on. So you have to do it every time a new block is found, but for optimizing your block value, pools typically send new jobs to miners once a minute or every 30 seconds. And if the coinbase transaction is large, then you have to physically send all the outputs, the whole coinbase transaction, to every single ASIC participating in the mining. And it can be a lot of data. It takes a lot of time.
Pavel Moravec:
If you look at the timings when you try to send this job to, I don’t know, 10,000 connections on your server, you can send them very fast on the software level, because the buffers in the operating system are just doing their work, but physically on the lines it takes a non-negligible time. And it decreases the efficiency of mining, because the receiver of the information, the miner, gets the data later and therefore reacts later. And the delay is pure loss. It’s a disadvantage. It’s not mining. So when a new block is found, the distribution of the information is super critical. That’s why even zero-transaction block mining, or empty block mining, is happening: because of this speed-up in distributing the work. The work needs to fit in one network packet.
Pavel Moravec:
Then you can do it pretty fast. Once the information is larger, it just takes much longer, and you would have to scale up your operation, use more physical servers so that the connections are more spread around, and then you could push the same information faster. But at the same time, you’re on the farm side and you have a lot of connections to the pool, and if every single miner needs to get all the data again and again, it can be problematic on your downstream connection from your ISP or whatever. And so you would probably use a proxy. But it is a technical discussion; having this coinbase large is just a loss of revenue. It would be great to implement the hack we discussed before: to put the payouts as a first transaction in the block and use just one output internally in the coinbase, so that all these payout outputs would be hidden in a normal transaction. And because miners are getting only the coinbase transaction in the job, this would be like zero cost. Obviously you would use some space in the block, so the associated price will be there, but not on the mining level. Mining efficiency would be completely perfect. Yeah. So there are technical problems with the approach even on the very mining side.
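As a rough back-of-the-envelope on the bandwidth cost Pavel is describing — all figures are assumptions for illustration: ~34 bytes per P2PKH output, 10,000 miner connections on one server, and a 1 Gbit/s uplink, ignoring per-packet overhead and fan-out trees:

```python
OUTPUT_BYTES = 34        # approx size of one P2PKH output (assumed)
CONNECTIONS = 10_000     # miner connections on one pool server (assumed)
LINK_MBPS = 1_000        # server uplink, megabits/s (assumed)

def broadcast_seconds(n_outputs: int) -> float:
    """Bandwidth-bound time to push the extra coinbase outputs
    of one job update to every connection."""
    extra_bytes = n_outputs * OUTPUT_BYTES * CONNECTIONS
    return extra_bytes * 8 / (LINK_MBPS * 1_000_000)

for n in (20, 1_000, 10_000):
    print(f"{n:>6} payout outputs -> {broadcast_seconds(n):.2f} s per job update")
```

With 20 outputs the cost is a few hundredths of a second, but at thousands of outputs it stretches into seconds per job update — and, as Pavel notes later, every second of delay at the start of a block is pure loss.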
Ryan Ellis:
Well, with Con’s code, it’s interesting, of course he could elaborate more, it maxes out at a hundred coinbase outputs in a block. Obviously we’re not able to scale to that size due to firmware, but propagating that information really is dependent on, again, the size. So there’s, again, no argument there. Our size is still very small, with 20 coinbase outputs.
Pavel Moravec:
Yeah. You’re definitely not hitting the hard part of the problem if the number is 20.
Ryan Ellis:
And then I imagine all pools are, you know, using high-grade servers and connection speeds. So really, I think latency, with us being just a single master node, could be an issue depending on location or even your ISP. You know, satellite has higher latency than fiber cable, et cetera. We’ve seen pings to our server that are sub 300 milliseconds, even from Australia to the USA, because Con is in Australia. So these are things we evaluated and we’ve looked at.
Pavel Moravec:
It doesn’t matter in this setup. If you try to send, let’s make some ridiculous example, which is completely out of this world, but let’s say you have to send a one megabyte coinbase transaction to every single miner, the ping just doesn’t matter, because you would have to go through sending one megabyte on every connection. And what counts is when the last miner gets all the data. Then we can argue about averaging the thing, so let’s take the median, the connection in the middle: when does it get the data? This is what should be measured, because that’s the delay before you start to mine on the new block. And the ping is just an offset; you cannot be faster than your network is, and all the inefficiencies are adding on top of that. So everything else being the same, it is just smarter to send less information, or as small a number of bytes as possible, because otherwise you’re shifting yourself towards worse efficiency. And obviously you can get better servers, but again, if you have better servers, then sending more data is still worse than sending less data. So…
Ryan Ellis:
Well, I guess the best way for large miners with, you know, thousands of these ASICs, would be to proxy. And that’s something that helps alleviate it, because we can help manage everything on our own server, our own end, but really talking with clients and making sure that efficiencies on their side are just as optimal, especially when working with the payout.
Pavel Moravec:
Yeah, but it doesn’t matter, because the proxy has the same problem. You have a server which, for example, has 10,000 downstream connections to the miners and one upstream to a pool. So the time to deliver the data from the pool to the proxy is, let’s say, the ping time, no efficiency loss. But then the server in the middle needs to distribute the larger amount of data to every single connection as well. So you’re paying the price on the local network, and okay, maybe it’s a hundred-megabit line then, but there is a switch somewhere, there is one server needing to go through all the connections. Maybe you can use two proxies, but whatever: once you have 10,000 machines, it just takes some time to distribute the information, even on the CPU level.
Pavel Moravec:
But yeah, these are maybe differences which people are not counting and not evaluating properly. The point I’m trying to make is, it is always better to not use the bandwidth if you don’t have to, because it’s faster, and you can measure even the efficiencies there. One good example: people are sometimes arguing about pool fee differences which are smaller than the inefficiency caused by a wrong network setup and a wrong proxy. But people just don’t look at these things, because it’s super easy to point to a pool fee or some other fee. It’s a visible number, very simple to understand, but it’s kind of a frustration.
Stephan Livera:
Yeah. But it’s an interesting point, Pavel. Ryan, do you have a response or do you have any question there?
Ryan Ellis:
Yeah. Well, I guess he got interrupted a little bit there, but yeah, as far as connection, you know, I can’t argue that speed is going to win. But again, the probability of consecutive block finds within just a matter of seconds for a pool is extremely low. Within minutes, obviously, we’ve seen it happen before. But as far as transferring data, that’s something I think every pool works hard to make efficient on their end.
Pavel Moravec:
Yeah. Maybe one comment for people to understand: mining doesn’t have memory, meaning the first second when you start to mine is exactly the same as the second second. You’re not having a bigger chance of finding a block the longer you’re mining; your probabilities are still the same. So for example, if you start now, the chance of you finding a block within the next 10 minutes is exactly the same as if you mine for 10 minutes, don’t find a block, and then try for another 10 minutes. It’s crazy because it’s completely non-intuitive, but it works this way. So the first second has the same importance as the second second or the last second. So even if, let’s imagine, we have a one second delay at the beginning, it is one six-hundredth of a typical block time.
Pavel Moravec:
So you are losing one in 600 seconds in the whole process, and the price you’re paying is exactly in this proportion, no matter if it’s the first or last second. So yeah, there are not a lot of blocks found within the first second, but the statistics just work. It’s the same as saying you’re not seeing a lot of blocks being found exactly in 2,000 seconds or 30 seconds. It’s a similar thing, because there are a lot of seconds at the beginning, and there is some distribution over them, and the first second is more likely than the 300th second. But yeah, on the timing, you cannot go around the maths behind mining, basically.
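Pavel’s “mining has no memory” claim is the memorylessness of the exponential distribution, and it can be checked directly. This is a sketch with the textbook 10-minute average block interval:

```python
import math

BLOCK_MINUTES = 10.0        # average block interval
rate = 1 / BLOCK_MINUTES    # block-finding rate, blocks per minute

def p_block_within(minutes: float) -> float:
    """P(at least one block in the next `minutes`), exponential waiting time."""
    return 1 - math.exp(-rate * minutes)

# Fresh start vs. conditioned on 10 fruitless minutes of mining already:
fresh = p_block_within(10)
# P(block in (10, 20] | no block in [0, 10]), by the definition of
# conditional probability — memorylessness says it equals `fresh`.
waited = (p_block_within(20) - p_block_within(10)) / (1 - p_block_within(10))

print(f"P(block within 10 min), fresh start:     {fresh:.4f}")
print(f"P(block within 10 more min), after 10:   {waited:.4f}")
assert abs(fresh - waited) < 1e-12   # identical: mining has no memory
```

Both probabilities come out to the same value, which is exactly Pavel’s point: having already mined for 10 minutes without a block does not make the next 10 minutes any more (or less) likely to produce one.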
Stephan Livera:
Yeah, I think that was a good point. Also, are there any other things that start to break down as, say, the price or the sats per byte rises? So as an example, as we speak today, the price is what, 45,000 or 46,000, something in that range. And if I look at mempool.space, the price to get into the next block is eight sats per byte. So an average transaction fee today for next block confirmation is about 50 cents. But I could imagine if, let’s say, the price was 10x, the price goes to 450,000, and instead of paying eight sats per byte, you’re now paying 80 sats per byte. Well, at that point, if you’re hitting a 10x and a 10x, we’re now talking about, well, it could be a $50 transaction, or who knows, a $500 transaction someday, and then things can break down.
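The arithmetic here checks out for a typical SegWit spend — the ~140 vbyte transaction size is an assumption for illustration:

```python
TX_VBYTES = 140            # assumed typical segwit spend size, vbytes

def fee_usd(sats_per_vbyte: float, btc_usd: float) -> float:
    """Transaction fee in USD for the assumed transaction size."""
    sats = sats_per_vbyte * TX_VBYTES
    return sats / 100_000_000 * btc_usd

print(f"today-ish (8 sat/vB, $45k):  {fee_usd(8, 45_000):.2f} USD")    # 0.50
print(f"10x both (80 sat/vB, $450k): {fee_usd(80, 450_000):.2f} USD")  # 50.40
```

A 10x in price and a 10x in fee rate multiply, so the dollar fee rises 100x, which is how a 50-cent transaction becomes the $50 one Stephan describes.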
Ryan Ellis:
Let’s have these problems!
Pavel Moravec:
Yeah, I would love to work on the problem of it being 10x and having to solve it, possibly with a second layer. A lot of super smart guys have been working on the problem for a long time, and I’m keeping all my fingers crossed for any progress in this area, but it would be a great problem to have, right?
Ryan Ellis:
And you’re looking at the larger transactions are going to weigh more, so to speak. So likely you’re going to have more competitiveness with, I guess, your general users competing for that block space through their transaction fees. And it’s really, like we were just laughing about it, it’s a great problem for miners. We’re happy to help solve that. But yeah, Taproot, SegWit, things like that that reduce the size of transactions, I think we’ll see more innovations in that regard on layer one. Looking at layer two, I think as much as it’s matured, it’s still not quite there when looking at paying a full block subsidy out to a Lightning Network wallet. There are creative people out there that, you know, solve these problems and have the know-how and the intuition as well to solve these things. So…
Stephan Livera:
Yeah. And Pavel, I’m curious while we’re here, you mentioned earlier that you had explored the idea of Lightning payouts for miners. Could you offer a comment or some analysis there on the feasibility of this? Because I know, for example, I think NiceHash does already do this, but could you offer your thoughts?
Pavel Moravec:
Yeah. One of the troubles associated with this is that payouts are one way; the pool never gets money back, right? And the most efficient way to use second layer solutions is if you allow people to send in both directions, so that you can use the channel for, like, infinite time. So the idea: let’s commit some coins to a miner, channel by channel, and let the miners spend the coins through us as a hub, so that they can pay with the money in the channel, and we can then send payouts back to the other side of the channel. And this could work for a long time. So it’s feasible, but it’s feasible for small miners using the money for, like, paying. Once there is no backflow, it doesn’t work, because it would be better just to send the money directly. It’s costlier to make the channel, and it needs to be closed by another transaction, and so on and so on. So once there is a miner who would like to use the coins for paying for a coffee or whatever, then this could work, because we would always keep it going for you if you mine and use it. And it’s a great idea, but it’s kind of niche.
Stephan Livera:
I could also imagine, let’s say, maybe there’s a synergy there for a mining pool operator to also be a Lightning hub, right? To also be an operator of, like, a routing node. And maybe longer term that would kind of make sense, because then they’re doing the payouts to their miners who are operating on that pool, and then, again, this is a long-term thing, maybe in the long term those miners might be in turn paying their expenses with Lightning. And so then, because it’s coming through, let’s say, Slush Pool, if Slush Pool was also a Lightning routing node as well as a mining pool operator, then maybe that model works. But again, it’s a time thing. And maybe we need to sort of close the loop there, for people to be able to earn sats from mining and then spend those sats directly through the Lightning Network.
Pavel Moravec:
Yeah. We could even just open the channel with a payout, instead of doing a direct payout on chain, so that there is an availability of using the money within the second layer. But it can be limited. Maybe we would just keep, I don’t know, 0.01 Bitcoin in such a channel, do it once, and once a user…
Stephan Livera:
Is going back and forward. Yeah. Yeah. So there
Pavel Moravec:
Are options and I guess somebody will do it. We definitely looked into the idea, but never prioritized the idea into being worked on. Yeah.
Stephan Livera:
And it is still early days, so it might not be feasible right now. Keeping fingers…
Pavel Moravec:
Crossed for somebody doing it. I think it would be interesting PR for the second layer, especially if some well-known pool does it and some prominent, visible users would use it. So it would be nice, but it’s difficult to put money on marketing projects only.
Stephan Livera:
Yeah, of course. Ryan, any comments from your side there?
Ryan Ellis:
Yeah. With Lightning, I think it sort of opens that gate for the smaller miner to receive maybe that daily payout sort of thing. So to me it’s really intriguing. I think for adoption on the pool side there’s obviously a lot of work to be done. My other point with Lightning: you’re also removing transactions from the main network, besides the channel opening and closing. So as far as helping keep blocks full in the future, I hope that’s a problem we never have to really discuss, but maybe the game theory is, there’s a lot of L2, and how does that look, you know, when people are viewing the mempool, so to speak? Yeah.
Stephan Livera:
I guess my perspective on that is we are still so extremely early that we just haven’t hit the big numbers yet. I mean, you might’ve seen those projections of a billion Bitcoin holders in five years’ time or something like that. If we were really to hit those kinds of numbers, even just the channel opens and closes for all of those people would be more than enough block space demand to provide that, call it reserve demand or baseline, if you will. And add in all the people who are doing coinjoins, because they want to do multiple rounds of coinjoins, that’s another base level of demand for block space. So personally, I’m not worried, I think. And then also, it’s not just channel open and close; there will be periodic rebalances or swap-ins and swap-outs. There’ll be all sorts of things like that. So I’m personally not too worried there. I think it’s additive to the Bitcoin network. Even if it superficially looks like it’s taking transactions off the chain, it just dramatically improves the overall value of using Bitcoin, because now you can use it fast and cheap and all of these fancy things once you’re in the Lightning world.
Ryan Ellis:
I’ve been a, I guess, a Lightning user for quite a while, so I’ve operated a few nodes myself, and I really hope, I think we all hope, it develops more. And it would be interesting to see, pool side, how it could be utilized in the future, for sure.
Stephan Livera:
All right. Well, I think those are probably the key questions I had. I think we’ve sort of discussed where we agree and disagree. Do you guys have any other comments or any areas or topics you wanted to bring up?
Pavel Moravec:
Nothing particular. I think we went through a lot of different points.
Stephan Livera:
Okay. I think we’ve spoken out most of those points.
Ryan Ellis:
Well, yeah, I think most of the points we were looking to cover, we did that. Just to sort of relay, as far as scaling, there’s only so much we can do with the software, maybe not necessarily from Braiins, but from others that are producing it. And my biggest concern is, anytime you’re limiting anything, you know, how does that affect the ecosystem? And then smaller pools with the variable payout, it’s sort of a big topic, because there’s not a whole lot a small pool can do until they get that nominal hash rate to really drive their block find cadence.
Stephan Livera:
Right. Yeah. And I presume then maybe when a pool is getting started, they might have to just try and offer a discount or try to have their own miners to sort of bootstrap the initial hash rate to get it to a certain size. But yeah.
Pavel Moravec:
Yeah, it’s a tough job. You have to have a lot of great connections and a lot of money if you would like to start from scratch, even providing some benefits. The capital needed for it is super large, and I would think there is a better way to spend the money in mining: just being a miner and actually getting the coins out of mining beats the investment if you would like to start, I don’t know, a PPS pool today. If you run the numbers, the probabilities of going bankrupt, it’s terrible. It’s just better to use the coins for mining, and I would do it myself.
Stephan Livera:
But I mean, you’ve already got a pool going, so it’s different for you.
Pavel Moravec:
Yeah. But it’s, yeah, it’s a heritage. It’s luck. I’m super, super grateful for being in the position, because otherwise it would be almost impossible to get to it without having a full amount of money.
Stephan Livera:
Yeah. Okay. So I guess maybe let’s have a closing thought from each of you guys on where you think Bitcoin mining should be going in terms of the ethos of Bitcoin and potentially that idea around decentralization or what kind of ideals you have around Bitcoin mining and what people should be prioritizing when they’re thinking about mining. So Ryan, do you want to start?
Ryan Ellis:
Yeah, I’d love to. So I think decentralization, you know, I’ll probably always use that in quotes, because I don’t see a pooled environment as decentralized. Even our pool, even with a more decentralized payout, it’s still a server that someone operates, in that aspect. So really driving mining back to the node level, I think, will help in sparking decentralization again. It’d be more for ethos than profitability, of course. Because essentially there’s no censoring a node operator, or a node operator, if being censored by their government or ISP or whoever, can hopefully find a way to still contribute to the network. And then, I mean, thousands of nodes mining, even separately, is going to decentralize the network, and I think it’ll move back to what it was. But pool mining, as far as profitability, I don’t really see that going away for miners, whether they’re small scale ASIC users or large industrial enterprise customers.
Stephan Livera:
Okay. And Pavel, any comments from you in terms of the direction you want to see Bitcoin mining go?
Pavel Moravec:
Yeah, it’s something between what I would like to see and what I predict. I don’t think we will see smaller miners in larger numbers; it’s just inevitable that miners are getting bigger, and they want to use some sort of service, typically a pool. So I don’t see a lot of change in this direction. But what I hope for is that these miners will understand the role of being a miner, being a kind of responsible miner, understanding that controlling the block space and choosing what transactions will be mined is a big topic, and they should think about it and participate whenever there is a possibility. And I hope we will get Stratum V2, or any equivalent of Stratum V2, into production, and the decision makers on the miner side will understand the importance of this.
Pavel Moravec:
And we will get to the ideal of one node per farm, one miner and not a pool, as close as possible, by using this kind of protocol or any other alternative, but there is none right now. So I would really hope people will understand and use these features to help decentralize as much as possible, not on the mining level, but on the block space level, which is achievable, in my opinion. So whoever is ready to help with pushing V2, or any better alternative if there is some, I don’t know, then please do it. Everybody will be rewarded by the whole Bitcoin space being more secure, more decentralized. And we will remove the argument of big miners being in a somewhat powerful position once we deploy a solution like this, where everybody would benefit.
Stephan Livera:
Yeah. Personally, I’d love to see more and more people get into Stratum V2, so Pavel people who want to find out more about you and Braiins and slush pool, and obviously Stratum V2. How can they find you guys online, I guess,
Pavel Moravec:
Slushpool.com, or Twitter. All right.
Stephan Livera:
I’ll include the links in the show notes.
Pavel Moravec:
Braiins.com.
Stephan Livera:
Yeah. And that’s with two I’s. And Ryan, where can people find you and Laurentia Pool?
Ryan Ellis:
Laurentiapool.org, and contact me through, I guess, Twitter and Telegram, which I use quite often. Minefarmbuy.com obviously is my main project right now, sourcing all this equipment for all these, you know, ASIC users from one to a thousand. But yeah, pretty easy to find me. You can probably knock on my door if you wanted to, but…
Pavel Moravec:
Do have a good coffee.
Ryan Ellis:
As a young man, I worked many years brewing coffee, so I…
Pavel Moravec:
It’s very much appreciated to have an invitation.
Stephan Livera:
Well, I hope to. I’ve enjoyed chatting with you guys and I hope to meet up sometime at a mining conference or something someday. So thank you for joining me guys.
Pavel Moravec:
Yeah.
Ryan Ellis:
Thanks for having me great to speak with you both.