00:00 - 05:41 General Update + Admin.
So, first question: when are sharding tests expected to take place?
05:41 Okay, so sharding is kind of the enabler for scale, and lots of components already depend on it for tests. A number of you might remember that throughout the year we ran a bunch of tests, and a lot of those had sharding enabled to allow us to hit those numbers — runs above 20 thousand transactions per second, a couple above 30 thousand — and we were pushing up from there. So a lot of the underlying theory and scaffolding to do with sharding has already been tested. We need to do some more once we have finalized the atom model and have network candidates that we can start to test upon; that's probably going to become live in January, and from that point we can start to push the really high-scale tests using full sharding.

There are a few components of sharding that we need to finish off. The main ones are around how a node reorganizes which part of the shard space it's serving (incomprehensible), and some pieces to do with inter-shard communication that the new atom model relies on. One of the things the new atom model does: when you make a payment to somebody, you would expect that payment to go to two shards — from me as the sender to Piers as the receiver, for example. But because of the power Radix enables, we can have some really nice permissions and tools around the tokens themselves, and user permissions and all this kind of stuff. So in actual fact, when you make that transaction it hits three shards: my shard as the sender, the token's shard, and Piers' shard. The reason for that is that you can set restrictions on the token — you can say this token can only be transferred to people that have a particular property in their account, or Piers can even refuse to receive Dan tokens. We can't really test those kinds of complicated use cases until after the atom model is finished and we've tested the atom model itself in an unsharded environment. So most of the work for sharding is already done, but there are a few things around the reorganisation of shards for nodes so they can scale back as they need to, and these more complicated sharding processes to do with tokens and permissions as well. But I'm very eager to get into that and start to push these high numbers.
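Dan's three-shard example can be sketched as a toy in Python. To be clear, the hash-based mapping, the `SHARD_COUNT` constant, and the function names here are my own assumptions for illustration, not the actual Radix addressing scheme:

```python
import hashlib

# ~18.4 quintillion shards, i.e. 2**64, as mentioned in the AMA
SHARD_COUNT = 2**64

def shard_for(identifier: str) -> int:
    """Deterministically map an identifier (a public key or a token
    definition) to a shard. The SHA-256 scheme is an assumption."""
    digest = hashlib.sha256(identifier.encode()).digest()
    return int.from_bytes(digest[:8], "big") % SHARD_COUNT

def shards_touched(sender: str, token: str, receiver: str) -> set:
    """A token transfer under the new atom model hits up to three
    shards: the sender's, the token definition's, and the receiver's."""
    return {shard_for(sender), shard_for(token), shard_for(receiver)}
```

Token-level restrictions (such as "Piers refuses Dan tokens") would then be evaluated against state on the token's shard, which is why the transfer touches it at all.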
09:00 Yeah, I mean, actually just to touch on sharding: the beauty of the Radix ledger system in terms of scalability partially comes from a node's ability to dynamically bring down the number of shards it's servicing, or bring the number up, based on the total throughput of the network and the resources that node has. At the moment you essentially have to hard-code how much of the shard space a node is operating on, and it's that step — going from hard-coded to dynamic, where a node looks at its own resources and works out how much of the shard space it can address — that is the real complexity there. So the sharding itself, as Dan says, is already done; the next stage is how you make it so that the network can efficiently manage its own resources on a per-node basis.
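As a rough illustration of the hard-coded-to-dynamic step Piers describes, a node could size its serviced slice of the shard space from its own capacity versus observed network load. This heuristic is entirely hypothetical — the real resource-management algorithm was not public at the time:

```python
def serviced_fraction(node_capacity_tps: float, network_load_tps: float) -> float:
    """Fraction of the shard space a node volunteers to serve.
    A node that can handle all current load serves everything (1.0);
    otherwise it scales its slice down in proportion to its capacity."""
    if network_load_tps <= 0:
        return 1.0  # idle network: serve the whole shard space
    return min(1.0, node_capacity_tps / network_load_tps)
```

Under this sketch, a Raspberry Pi handling 100 tps on a network pushing 25,000 tps would serve 0.4% of the shard space, while a large server could serve all of it — the dynamic version of what today has to be hard-coded.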
10:13 The main reason for pushing it out is that if we're implementing something large like the atom model, we don't want to have sharding dynamically moving the nodes around, because if there are any problems in the atom model that you're testing for, it just makes debugging harder.
10:30 I mean, there are similar reasons to that as the reason we're not implementing privacy, which isn't on our near-term roadmap, right? Privacy has similar problems: it becomes a lot more difficult to debug, and especially when you add in a sharded environment you suddenly have all of these different factors moving at the same time. So it's much easier to build the core code, test, then build the sharding, and then later — when everything's working in a production environment and people are able to rely on it in a global network state — we can start to look at how to implement more complicated things on top of that. Next question: how will the node software be upgraded? Automatically or manually?
11:15 We are having a lot of discussions about this at the moment. It's to do with the whole question of hard forks and soft forks — what you can do on-ledger versus what you have to do off-ledger — and we're trying to make that as user-friendly, as democratic, and as efficient as possible. Those conversations are still ongoing, which leads a little bit into the next question I can see there about governance (What's your take on governance? On-platform or off-platform? Like what others call on-chain/off-chain governance), so I'll just conflate the two questions into one.

One of the things we obviously don't want to have is hard forks — or at least we want to minimize them as much as possible, because they're probably the hardest upgrades to roll out. There will definitely be hard forks, and you want to reduce them as much as possible. Having hard forks for complicated feature upgrades isn't necessarily bad, right? Ethereum had their great milestone roadmap of hard forks and that all went very well: people knew the forks were coming and everybody agreed with the features that were arriving, so those were fairly painless. But if you have a DAO incident, for example, then that's a critical hard fork where the requirement comes out of nowhere really quickly, and that's not so painless, as we saw. We want to try and mitigate those painful hard forks as much as possible, which is one of the reasons we spend so much time and effort taking our time, going through all the components, testing them and making sure we're correct — we don't want any of those if we can help it.

So we're trying to mitigate all the issues that could cause critical hard forks; if we need to do a hard fork we can, but we're also trying to figure out the governance of the platform — whether it's fees, economics or node-runner incentives. If we've calculated them wrong, or the level of commodity hardware used in the network is different to what we expected and what we've seen in current networks — and it's kind of unknown here, because you can take something like a Raspberry Pi and support the network with it; you don't need an eight-core, sixteen-gig machine with a massive CPU — we can't be a hundred percent confident about what hardware is going to be used. So for all those things that need to be dynamic, we're trying to figure out ways of bringing in a more democratic governance process over the long term. Over the short term that will probably not be possible for the MVP, because on-ledger governance, done properly, is a very difficult problem. We do have a number of solutions we're looking at, and I'll definitely start talking about them in public once we've formed our ideas and solved the technical challenges; then we can maybe get a bit more into the political side — how does the network agree to change the fees, or how does the network agree to modify the economics.
14:37 Yeah, so governance is a big topic that we have already undertaken a large amount of research into, and — since we want the network to be a truly decentralized network — we want to push as much of that on-ledger as possible. There are certain aspects that you just can't do on-ledger, and part of this is also that on one side you say "you can do everything on-ledger", but then you've got to solve these game-theoretical problems, and those problems lead to more complexity in how you engage with the system, which can often lead to a drop-off in engagement from your underlying stakeholders — which is a really difficult circle to square. So our position is: as much on-ledger as possible, but our path to that will be incremental, because you can end up with single-stakeholder capture happening very easily if you try and push too much too early and don't really think about the systems design. By the way, that's another $2 in the piggy bank, because I said chain — yes, god damn it. He's going to be starting a crowdfund soon, because I'm implementing a company-wide policy that whenever anybody says "chain" instead of "ledger", or "block" instead of "atom", they have to put a certain amount of funds into a company piggy bank depending on their position in the company. Piers is currently up to about four grand.
16:00 How is the transaction fee going to be regulated? I hear it's fixed at one cent, but I also read that over time it will drop?
So this is where — I mean, it's difficult to give the correct answer to that without the economics paper being out there, so I'll try and do what job I can with the information that's already public. The idea is not that the fee is fixed at a cent, but that each transaction should cost the equivalent of about one US cent at the beginning. Now obviously, as time goes on, the Radix token — initially priced at a dollar — will move away from that peg. After a year, if Radix is steadily increasing in price, and say the value of the dollar is dropping, then you might actually find that the same fee has increased to two cents' or three cents' worth. But the idea is that the fee cost to do a transaction within the system stays as flat as possible. The better way to look at it: because a Radix token will initially be priced at $1, from day one your fee will be one rad cent. As time goes on and the value increases, we want to try and keep flat, or even reduce, how much the fee costs as the network grows — so you'd begin to pay fractions of a rad cent. We can do that quite easily because of the economic information we have from the DEX and the model that drives it. So, broad strokes: it's not fixed at 1 cent USD forever. It will start at one rad cent, which will be equivalent to one US cent; as Radix trends away from the dollar, the system can compensate for that increase in value by reducing the cost of a fee to below one rad cent. Yeah, I mean, a lot of this is related to the underlying low volatility of the Radix token, and that's another big reason why we think the local utility economics is really important.
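Broad strokes, the fee logic described above amounts to keeping the USD-equivalent fee flat by re-denominating it in rad cents as the token price floats. This one-liner is a sketch of that idea only — the function and parameter names are my own, not anything from the economics paper:

```python
def fee_in_rad(rad_price_usd: float, target_fee_usd: float = 0.01) -> float:
    """Fee denominated in RAD such that its USD cost stays roughly flat.
    As the token appreciates, the RAD-denominated fee shrinks."""
    return target_fee_usd / rad_price_usd

# Day one: RAD = $1.00 -> fee is 0.01 RAD (one rad cent)
# Later:   RAD = $4.00 -> fee is 0.0025 RAD (a fraction of a rad cent)
```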
18:44 Our next question: since you're based in the UK, how are you handling the current regulatory framework regarding coin 'offerings'? Did you find a way of dodging them?
So, we're very interested in working within the regulatory framework that exists in the UK. I can't say that we're definitely going to choose the UK as the jurisdiction from which the Radix foundation and the starting of the network is going to be done, but ultimately, if you want to create a global currency and a global platform, you want to be working within the regulatory framework, because that's where most of the value lies and that's where most people are operating. And ultimately, as I often say, this regulation was put in place to protect the consumer, to protect the end-user, to make sure that things aren't done that end up cheating people. So it's really important that we make as much as possible of what we're doing compliant. Now, that's not completely possible, because some of what we're doing is pushing very much into new territory. So part of it is: we think this regulation applies, and this is how we're going to comply with it; and then there's a part where we're saying, okay, there isn't really regulation here, or this is still grey — how can we make sure that we're not putting people at risk when we're doing this element of the system?

So yeah, it's a very good question, and it's something we're spending a lot of time looking into. We haven't really been able to take full opinions on it until the economics paper was finished, because the economics paper is a big dictator of what applies and what doesn't. Now that that's coming to an end, the next big piece of work we're going to be undertaking is really looking at how regulation applies to that economics. When the economics is released at the end of January it will very much be a draft of how we think we can achieve those aims, and in part that's going to be a conversation with the community — and the world in general — asking: does this work, are people happy with this, is any part of this going to break? And the other side is asking: how does this sit with regulation, and how can we make sure that we, long term, can be a core piece of infrastructure for everyone.
21:00 I was wondering what kind of business models you have envisioned so far for builders on Radix? How can a builder monetize?
Yeah, so one of the main features in our go-live is the ability to easily create tokens — and to create tokens not as a smart contract like an ERC-20 or ERC-721, but just as API-driven functions. A key expansion of that is what we call a continuous-mint fungible token. Unlike an ERC-20, where you have to say "right, there are ten million of these and there are no more", you can say, "okay, here's the token I want to create" — let's say it's a dollar token — and any time the issuer wants to mint more, they're able to mint more. Now, within a purely crypto-economic system that often doesn't make a huge amount of sense, because people will say it's centralized, it's a point of failure. But when you're trying to represent real assets within a system, that's a really important function, because I can't tokenize all of the dollars simultaneously — I can only tokenize the dollars that are under the control of the custodian that's holding those assets. So when more dollars come in, you want to be able to continually mint more dollar tokens; the same if you're doing equity, or bonds, or any of these things. So we see a key part of our business-enabling model for the MVP go-live as basically bringing as much of the real-world economy, real-world value, into it.
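A minimal sketch of the continuous-mint idea, assuming a single issuer permission — the class and method names are illustrative, not the Radix API:

```python
class ContinuousMintToken:
    """Fungible token whose issuer may mint more supply as backing
    assets (dollars, bonds, equity) arrive at the custodian — unlike
    a fixed-supply ERC-20-style token."""

    def __init__(self, symbol, issuer):
        self.symbol = symbol
        self.issuer = issuer
        self.balances = {}

    def mint(self, caller, to, amount):
        # Only the issuer may expand supply; this is the permission
        # hook a fixed-supply token lacks.
        if caller != self.issuer:
            raise PermissionError("only the issuer may mint")
        self.balances[to] = self.balances.get(to, 0) + amount

    def transfer(self, sender, receiver, amount):
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount
```

Under this sketch, when the custodian takes in another $1,000, the issuer calls `mint` for another 1,000 dollar tokens; supply tracks the real-world assets instead of being fixed at creation.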
Okay, awesome — it's fundamentally scalable enough to rival something like Visa or MasterCard, or indeed something like an exchange, but in an ultimately decentralized fashion, which takes advantage both of the ease with which you can integrate the real world into Radix using these fundamental tools — very easy to do using the API functionality — plus the massive scalability of the underlying system.
23:08 How many years after launch do you predict we can make Radix transactions fully private/anonymous?
Thanks Vlad for that one. So the problem with privacy and anonymity and scale is that they're not very good friends: it's very difficult to make something scale and for it to be truly private without centralization — without centralization, rather, that's the key thing. Well, you've all heard me talk about the five things on the must-have list that I put together a number of years ago, and there was a sixth one kind of floating around, which was privacy. This is back in 2012 or whenever it was, and I spent a long time trying to marry number six, privacy, with number one, scale, and everything I tried just didn't want to work together. It just didn't want to play ball — I won't go into the technicals of why, or what I tried — and after a while I basically came to the conclusion that there's no point having privacy if it can't scale. So scale is the number one requirement.

Now, obviously, since then, privacy still floats around somewhere in the back of my mind, and I do have a few ideas that we can maybe leverage to implement privacy. The point where my thinking has stopped for now — because there's a bunch of other stuff that needs my attention — is that we can maybe do something like stealth addresses and Pedersen commitments. You can create a stealth address for your funds, and then you can mask the value of the transaction that you're sending. So if you really do a lot of digging, you might figure out that I sent something to Piers, but you wouldn't be able to discover how much it was — which, for now, is kind of the point. To try and do anything more complex and technical than that, you need things like zk-SNARKs and all this kind of stuff, but they're very big, very heavy, and very computationally expensive. I know there's a lot of work going on in the industry to try and reduce the size of said zk-SNARKs and to increase the efficiency of the calculation and the verification and all this kind of stuff, but I think even the best at the moment still means a lot of data going around the network, and that data load caps your scalability because you saturate bandwidth. So at the moment I don't really have an answer as to when; it is something that I do think will become important as Radix is adopted, for both technical and business reasons, but right now we've got bigger fish that we need to finish frying, and then we can move on to privacy. It definitely is something that I'm going to move back to looking into once we're past MVP and a bit further down the roadmap, so I would hedge that within three years there is some form of privacy within Radix that is suitable and good enough for most use cases.
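The Pedersen commitment Dan mentions can be shown with a toy sketch: commit to an amount so observers can't read it, while sums of commitments still check out. Real systems use elliptic-curve groups and generators with an unknown discrete-log relation; the modulus and generators here are purely illustrative and offer no real security:

```python
import secrets

P = 2**127 - 1   # a Mersenne prime; toy modulus, NOT cryptographically suitable
G, H = 3, 7      # toy generators; in practice log_G(H) must be unknown to everyone

def commit(value, blinding):
    """Pedersen commitment C = G^v * H^r mod P: hides `value` behind
    the random blinding factor r, yet is binding to that value."""
    return (pow(G, value, P) * pow(H, blinding, P)) % P

# Homomorphic property: commitments to v1 and v2 multiply into a
# commitment to v1 + v2, so a verifier can check that inputs equal
# outputs without ever seeing the amounts.
r1, r2 = secrets.randbelow(2**64), secrets.randbelow(2**64)
assert commit(5, r1) * commit(7, r2) % P == commit(12, r1 + r2)
```

This is exactly the property that lets a masked transaction value still be audited for balance — the observer learns that the sums agree, not what they are.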
26:52 In a blockchain you can trace coins back to a genesis transaction to prove they are legit. How can you prove the legitimacy of coins in Tempo? And with pruning?
Okay, so it's no different, really. The only difference is that in a sharded network, proving that audit trail is a little bit more work on behalf of the person that wants to prove it. I won't get into all of the heavy theory about how nodes can be sure that the tokens attempting to be spent have not been double spent — that's a completely different conversation and it would take the entirety of the AMA to go into it in any detail. That kind of stuff will be detailed in the future security papers, and the papers about Tempo and the improvements that have happened within Tempo over the past six to eight months. But say you hold a token: you basically have to follow that token across the shards it has touched, and the information about which shard this token came from, and where it's going to, is contained in each of the atoms that deliver that transaction. So you can trace it back; it's just that in a sharded network you ask one node, and then you ask another node, and the atom tells you which nodes you have to go and ask — because the atom tells you which shard it came from, and that field tells you which nodes to ask, because nodes advertise which shards they serve. So it's a little bit more time-consuming, a bit more friction — because all Tempo really cares about is where this token came from last; it doesn't care about where it originally came from. The security model underneath, and how the consensus operates, deals with making sure that that's consistent across all transactions and shards.

Now, an interesting thing where the economics comes into play — and again, a lot of you are obviously missing huge chunks of the economics, but everybody, I imagine, knows by now that Radix takes a kind of supply-and-demand approach to the token supply. If you think about what that means: you have an elastic supply that grows to meet demand or shrinks to meet demand. When it shrinks, that means that somewhere tokens have got to be burnt — tokens have got to be destroyed — which is explained in the paper you'll all get in January. But that means that if a token is burnt, the audit trail of that token stops; and if demand increases again, then new tokens are minted to increase supply. So if you have a Radix token that's tainted somehow — it's been through a dodgy exchange, or it's been stolen, or whatever it happens to be — and it's one of the tokens that is burnt in a supply reduction, then when new supply is created, none of the new tokens are tainted by that particular token's transaction history. It kind of destroys the old and creates the new, so the new tokens retain fungibility. We see this in Bitcoin, where newly minted Bitcoin are worth a bit more than old Bitcoin because a lot of coins are tainted — I read somewhere that something like 70% of inputs currently are tainted in some way — so you get this nice recycling effect as well. But, broad strokes, to answer the question: you can still trace every token back to its minting point — it won't necessarily be the genesis, because the economics will mint and burn — but you will be able to trace tokens back to their mint and inspect the audit trail.
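The shard-hopping trace described above can be sketched like this: each atom records the shard the token came from, so an auditor walks backwards hop by hop until reaching a mint atom. The data layout here is a guess for illustration, not Tempo's real atom structure:

```python
def trace_to_mint(atoms_by_shard, token_id, start_shard):
    """Follow a token's audit trail backwards across shards.
    `atoms_by_shard[shard][token_id]` is the latest atom for that token
    on that shard; an atom whose "from_shard" is None is the mint point
    (the original mint, or a fresh mint after a supply-reduction burn)."""
    trail, shard = [], start_shard
    while shard is not None:
        atom = atoms_by_shard[shard][token_id]
        trail.append(shard)
        shard = atom["from_shard"]  # which shard the token arrived from
    return trail
```

Each hop means asking a node that serves that shard, which is the extra friction Dan mentions; the walk terminates at the mint rather than a global genesis.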
31:06 Why do experienced "blockchain people" find it difficult to understand Radix? What is the main thing that finally makes them go "ah, now I understand"?
I have an answer for this — oh, you have an answer for this as well? Yeah, I was first, then — sure. So the answer that I have is that experienced blockchain people are often thinking within the constraints of the data architecture that is blockchain. When you're talking about huge scalability, they'll always come back to the problems of the blockchain architecture, and when you're talking about sharded environments, they always come back to the paradigm that is the blockchain architecture. So normally what I find really helpful, when I'm explaining Radix to people who know a lot about blockchain, is how the data architecture fundamentally helps the consensus to work.

I start by just looking at the sharding environment. It's a statically sharded environment with a shard space so big that you'll never have to change it: 18.4 quintillion shards. And the way everything is addressed — the way data is written to that shard space — has a deterministic element. If I've got a public key, my public key determines which shard my fund information lives on. So I may live on shard five and Dan may live on shard ten, and I can't spend those tokens from anywhere else but the shard my public key dictates; my ability to sign a transaction and prove that I control those tokens exists nowhere but shard five, and Dan can't do the same with his — he has to spend from his shard. You can spend *to* any of the 18.4 quintillion shards, but you can only spend *from* the shard you live on.

Now, this has two really important and very interesting consequences — these are the reasons why the Radix system can scale. The first is that you group together related transactions, and the second is that you ungroup unrelated transactions. What I mean by that: if I spend a token from my shard, shard five, and I try to double spend it, you don't have to look across 18.4 quintillion shards to see where that double spend has occurred — you only have to look at the shard my public key determines I live on, and detect it there; that's easily solved with the Radix consensus system. The ungrouping of unrelated transactions is equally powerful: if Dan sends some money to his mother, and I send some money to my mother, and I'm sending from shard five and he's sending from shard ten, those two transactions aren't related to each other in any way — they relate to two different addresses on two different shards. So you don't need global consensus; you don't need consensus comparing my spend versus Dan's spend. We just need to know that Dan's spend is valid and my spend is valid, and so you get this massive scalability effect where
you're not constantly having to correlate all the transactions and reach consensus on them. The last thing that's really important about this — and again, this is often where you get the "aha" moment for people who understand blockchain — is that the consensus is a passive consensus rather than an active consensus. You're not reaching consensus on every single transaction; you only need a consensus comparison event when a double spend occurs, when a conflict actually arises, because this is fault-detecting, not fault-tolerant. Meaning: if a double spend is detected, then the nodes that operate the shards of relevance have to come in and do a secondary process, but otherwise they don't. And this means the throughput is incredibly efficient, because the only thing that requires you to reach consensus in a non-conflicting environment is the submission of transactions to the network.

That was very good — you've listened very well when I've been at that whiteboard. However, I must spoil your party: that didn't really answer the question. You explained what the tech does, but that was not my understanding of the question, which I think is much more philosophical. When Bitcoin came along and Satoshi presented the white paper, he did it on the cryptography mailing list — undoubtedly a lot of very smart people there, a lot of people very interested in decentralized and distributed systems, many of whom had probably built them — but he still had trouble convincing them of how it worked. I think the reason for that is two things: a knowledge gap, and that people are very sticky to what they know — especially if what they know gives any indication that there can be nothing better. Rational people want to understand, but people are sticky to what they know, and the double spend problem had been a problem for 30 years, right? So there's also that skepticism: this has been a problem for 30 years, and then this random anonymous guy pops up on the internet with a strange name and claims to have solved it — and even though the paper seems to solve it, I'm not convinced, because it can't have been solved. It's like when fusion power finally arrives: it's been a problem for so long that people will pick apart every little detail of the paper. So it's just knowledge, time and patience, I think — that's all it really is. Because aside from all the crazy maximalists — they don't want to understand, they just like to make a lot of noise shouting "blockchain" or whatever — the people who are actually interested will, with enough knowledge, time and effort, eventually get it. And that's usually what I find: even when somebody is really struggling to comprehend it — "hold on a minute, this is a big thing and you've solved it?" — you need to feed them that knowledge, and probably feed it to them multiple times, before it can overcome that stickiness, because there's such an amount of force pushing on it. And then at that moment, you get there.
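The passive, fault-detecting consensus Piers outlines boils down to one predicate: two spends only need a consensus comparison event when they could be the same funds — same source shard, same consumable. This is a deliberate simplification of the real conflict rules, with invented field names:

```python
def needs_consensus_event(spend_a, spend_b):
    """Passive consensus: unrelated spends (different source shards, or
    different consumables on the same shard) never trigger a comparison;
    only a potential double spend of the same consumable does."""
    return (spend_a["from_shard"] == spend_b["from_shard"]
            and spend_a["consumable"] == spend_b["consumable"])
```

So Piers paying his mother from shard five and Dan paying his from shard ten never coordinate at all, while two spends of the same consumable on shard five trigger the secondary resolution process — which is why throughput scales with the shard count.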
37:41 Today's card payments are often super fast — you just tap the card and have it approved without a PIN. How can Radix compete with that?
Okay, so — the "without a PIN" part, that's a security thing, right, so let's not get hung up on the PIN; the question is about the speed of the payment. So first off, let's just say that Radix can achieve finality of payments in the same amount of time, which is four to five seconds, right? That's the specification: it has to be five seconds or less. When I tap my card to pay for a coffee — I just checked — it's about one, two, three, four, five seconds depending on the connection. So that is obviously the ultimate goal. In our alpha network we've had a few people say that's not always the case — it's taken 20 seconds for some people, whereas for others it seems to happen in five seconds or even less; it seems to depend on who's where. And remember that the alpha net ran very early prototypes of consensus, gossip, all this kind of stuff, and it's all changed since June by a significant amount — a lot more efficient, a lot more performant, a lot more secure as well. So I'm hoping the release of the node — there's a test build we're putting out today for internal testing; I've worked very hard on this — is much more responsive with payments, and I'm hoping that will translate to the larger network when we start to scale it up, that it performs and we can deal with any latency. There's no technical reason why this isn't possible, so it purely comes down to implementation details: making those things more efficient, squashing the bugs that can get in the way, things like that.
39:57 Will there be a public release of the node client before the end of the year?
Yes, there will, and it's going to be very close to the end — it is going to be very close. I will be very glad to see the back of 2018, because it's been extremely hard work and very stressful, trying to build a team and push tech at the same time. It's hard because you bring in this team that hasn't been on-boarded — they've got no real knowledge about the platform, and you have to bring them up to speed so they can do stuff — and still push the tech to keep all of you guys happy. Yeah, it's tough. But everybody's up to speed now; everyone knows what they're doing, so it's getting a bit easier and I'm able to spend more time focusing on the things that I need to focus on.
40:50 Piers, how was your China visit? Was it Radix related in any way, or just a general look around to see the growth of mobile payments?
It was Radix related — I was there to see some potential partners in China, and I also spent some time with some partners that we're going to be building technology on Radix with as well. I can't disclose any of those at this moment because it's still not finalized, but it was very good. China as a country fascinates and intrigues me — I studied Chinese and business in the past — but it equally really scares me as a place to be a foreign entity, to be a business: it can be incredibly difficult to do business there if you're not a native Chinese company. So it's one of those ones where it's very much on our roadmap — it's one of the most rapidly urbanizing, technologically driven countries in the world, and if it's not on our roadmap, if it's not in our vision, then we're ignoring a very, very important part of the world — but it's still somewhere where we're taking very cautious baby steps.
42:04 Was there any conversation between Vitalik and Robert about Radix tech at Devcon4? Or did he just come up to express his willingness to join in the future?
There probably was a conversation with Robert, because Robert likes to talk a lot — whether Vitalik was listening in, I can't comment on that. I love him dearly, but that's his job: he loves to talk to people. So yes, there probably was a conversation.
To be honest, I wouldn't actually want Vitalik to join the Radix family at this point, and the reason is that Vitalik has a lot of reputation and is an extremely smart guy, but we've taken six years to do this R&D and to solve these difficult problems, and we're about to take it out the door. If Vitalik were to join us in any way right now, that would absolutely, 100%, detract from all of our achievements, both individually and as a team, because it potentially sets the sentiment of: you've been working on this for six years, and as soon as Vitalik joins you've launched and solved all the problems — so the catalyst must have been him, and not all the hard work, time and effort the team has put in.
43:40 If the fee is fixed there will be no place for micro/nano payments. Is this what you want?
I think I hinted a little towards this in an earlier question. The network is so efficient at processing transactions that the actual cost to a node operator of validating any transaction or atom is extremely low — you can run a node on very modest hardware. What that opens the door to is that over time, as load increases, the fee charged per transaction, atom or activity on the network can actually reduce. If the node incentive is too high — say the fee was fixed at one Rad cent and we were getting a million transactions a second — that's $10,000 a second coming in for the node runners, but those node runners are probably spending a tiny fraction of that on electricity per second. What you'd end up with is an extremely large network of very low-powered nodes, each trying to collect as much reward as possible for as little work as possible, and that is detrimental to the security and overall efficiency of the network, with lots of nodes joining and leaving — it would just make things a nightmare. So to combat that kind of gaming to maximize profit, as the load increases you can actually think about reducing the fee. That has the effect of, one, keeping the number of nodes in the network at an acceptable amount — you don't want too few, but you don't want too many either. It also means that the more load you get, the cheaper it is to do something on the network — a good mentality for developers to come and build stuff, because the cost of doing business on Radix gets cheaper and cheaper as the economy grows. And if the economy is growing, the node runners are still receiving an extremely good ROI for their time and effort. There's also the whole bootstrapping problem: there needs to be a reason for people to put resources into the network, and there has to be an incentive for the node runners, and a usage fee is where we want to push the system towards long term. Micro and nano payments, by their very nature, are not pushing a lot of value through the network, but they are pushing utility. So you need to pick your battles: you start with the higher-value transaction fees first, and then you iterate — as you're able to improve the technology and see how the economics is playing out, you can start to adjust things like that as well.
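The fee dynamic described above can be sketched as a simple rule: hold aggregate node revenue near a fixed target, so the per-transaction fee falls as throughput rises. This is purely illustrative — the function, target value and bounds are hypothetical, not the actual Radix fee algorithm.

```python
# Illustrative sketch (NOT the real Radix fee mechanism): per-transaction
# fee shrinks as network load grows, keeping total fee revenue near a
# target and removing the incentive to flood the network with cheap,
# low-powered nodes. All numbers are hypothetical.

def adjusted_fee(tx_per_second: float,
                 target_revenue_per_second: float = 100.0,
                 min_fee: float = 0.0001,
                 max_fee: float = 0.01) -> float:
    """Return a per-transaction fee that keeps aggregate fee revenue
    close to a fixed target as load increases, within [min, max]."""
    if tx_per_second <= 0:
        return max_fee
    fee = target_revenue_per_second / tx_per_second
    return max(min_fee, min(max_fee, fee))

# At low load the fee sits at its cap; at a million tx/s it has fallen
# to the floor, so usage gets cheaper as the economy grows.
for load in (1_000, 100_000, 1_000_000):
    print(load, adjusted_fee(load))
```

The key design property is monotonicity: higher load never raises the fee, so node runners cannot profit from degrading the network with minimal hardware.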
46:47 How does the Radix gossip protocol determine neighbour peers (fanout) and their number? Is its overlay network structured or unstructured?
It's semi-structured, which is in between both of your suggestions — unfortunately, because it always is, which further complicates the answer. So there's a semi-structured gossip, and we opted for something very simple and quite elegant that mitigates a lot of the issues you find in other ways of doing gossip — for example DHTs and the like, where you can poison them, you can do all kinds of things, and you have to maintain the key-value set across the network as nodes churn and go online and offline. What I realised is that when an atom travels across the network it carries a temporal proof with it, and in that temporal proof the nodes stamp their acceptance or rejection of the atom by creating a signature that represents their node ID directly. So if you receive an atom, you look at the temporal proof and you've got all these node IDs in there, and if you're eligible to append to that temporal proof, you can deduce from it which nodes are around and which ones aren't. That's one part of the gossip: the temporal proofs are actually a delivery mechanism telling everybody in the network who's around and who's not. It's fairly accurate, but it doesn't need to be super accurate, because gossip is fail-fast anyway. The second component is a very thin protocol whereby nodes gossip to each other which nodes they've seen and on which IP addresses they've seen them. So you can correlate all the node IDs you get from the temporal proofs against actual node IPs that you can send UDP packets to when you gossip. I hope this is making sense — this is about as high-level as I can make it. So you have the node IDs from the temporal proofs, and you have the node IPs from the peer tables that get communicated around between all nodes in the network, with a heartbeat running against that as well. Then, when I want to gossip an event, I take all the node IDs I have some confidence are online, take all the IPs I know correlate to those node IDs, and structure that information into a gossip routing table. You get this fanning-out gossip table with an origin point at the bottom — the node that received the atom for submission — chopped into layers, where each layer is a distance away from the origin. What you end up with is a gossip routing table structure where each layer has approximately twice as many nodes as the previous one, and the temporal proof represents a binary tree across that gossip routing table. So it's a semi-structured overlay network, because once you know which layer you need to go to — say there are five layers and I'm in layer three — you know who to talk to. I hope that partly answers the question; the full details will be covered in the security paper we'll be putting out next year. There's a lot of extra complexity to the question you asked — it's a good question, an insightful one about the problems you face in a network — and we have thought about these things. We want to put out a specification for it; it's just a matter of prioritizing when those things go out. We did originally try a completely unstructured gossip protocol — just pick random nodes you know about and send to them — but it meant there were a lot of closed-loop cycles in the gossip, which slowed it down and made it quite inefficient.
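The layered routing table described above — an origin node at the bottom, each subsequent layer roughly doubling in size — can be sketched as follows. This is a toy illustration of the structure, not Radix's actual implementation; function and variable names are invented.

```python
# Toy sketch of the layered gossip routing table: known-online node IDs
# (deduced from temporal proofs, matched to IPs from peer tables) are
# arranged in layers radiating out from the origin, each layer roughly
# twice the size of the previous one — a binary-tree-like fan-out.

def build_gossip_layers(origin, peers):
    """Arrange peers into layers of doubling size: 1, 2, 4, 8, ..."""
    layers = [[origin]]           # layer 0: the node that received the atom
    width, i = 1, 0
    while i < len(peers):
        width *= 2                # each layer doubles in width
        layers.append(peers[i:i + width])
        i += width
    return layers

peers = [f"node{n}" for n in range(14)]
for depth, layer in enumerate(build_gossip_layers("origin", peers)):
    print(depth, layer)
```

With 14 peers this yields layers of size 1, 2, 4 and 8 — each node gossips outward to roughly two nodes in the next layer, which is why the temporal proof can trace a binary tree across the table.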
51:50: Can we hear some details about the Radix team Portugal visit?
So we have quarterly planning sessions. The Radix team is mostly remote, so once a quarter we all get together in the same place, go through the strategic plan for the next three months, make sure everyone's aligned, and have the ability to properly debate and talk things through. That's what the Portugal visit was about, and out of it came a lot of the plans we have for next year, and discussions about what the priorities for the team should be given the number of people and resources.
52:30 What are all the predefined new atom functions?
Okay, so we're going to start to release some information about that, and the question will become clearer as we do. The atoms as such aren't predefined functions; the structure of the model is that you have an atom, which is really just a container — like a delivery box. That box carries information about where this thing is going and which shards it's touching, and it's also the carrier of the temporal proof as it bounces from node to node across the network. Within the box you've got these things called particles — any physicists among you will probably get the analogy quite quickly — and these particles are essentially instructions that we segment within the atom. You can group particles together, so you can do things like: send this to him, then do this, then sign this thing and send that. For example, I could have a transfer particle grouped with a data particle carrying a message in plain English. You can pack a lot of different groupings into a single atom and carry it all in one go, which is more efficient. These particles are composed from — you guessed it — quarks. The quarks aren't called bosons or anything like that; we stopped there, as it gets a bit silly at that point. These quarks define the properties: a property of a particle could be that it carries some data; another that it holds a timestamp; another that it has a third-party signature, or that it's a transfer of a token, or that you're minting a token — these kinds of things. You have a fairly small set of these properties, and you can compose them in various ways subject to some constraints — you can't put a fungible and a non-fungible token quark together, for example — but you can compose them fairly comprehensively, so you can create more exotic particles that you then wrap into an atom. You send that atom out, and the particles all get executed according to the individual constraints that govern each quark. That's basically the super-high level: an atom contains particle instructions, and the properties of those instructions are defined by the quarks, with each individual quark defining a particular property, behaviour and constraint.
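The atom → particle group → particle → quark hierarchy described above can be sketched with a few small data classes. All class and field names here are illustrative stand-ins, not the real Radix API.

```python
# Minimal sketch of the described hierarchy: an atom is a container of
# particle groups; particles are instructions; quarks are the properties
# (data, timestamp, signature, transfer, mint, ...) that compose them.
# Names are invented for illustration — not the actual Radix types.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Quark:
    """A single property/behaviour with its constraint."""
    kind: str  # e.g. "data", "timestamp", "signature", "transfer", "mint"

@dataclass
class Particle:
    """An instruction composed from one or more quarks."""
    quarks: List[Quark] = field(default_factory=list)

@dataclass
class Atom:
    """The delivery container: carries particle groups plus routing
    info (which shards it touches) and, on the wire, the temporal proof."""
    shards: List[int] = field(default_factory=list)
    particle_groups: List[List[Particle]] = field(default_factory=list)

# A token transfer grouped with a human-readable note, in one atom:
transfer = Particle([Quark("transfer"), Quark("signature")])
note = Particle([Quark("data")])
atom = Atom(shards=[1, 2, 3], particle_groups=[[transfer, note]])
print(len(atom.particle_groups[0]))  # 2
```

Note the three shards on the atom, mirroring the earlier payment example: sender shard, token shard and receiver shard.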
55:37 We’ve got five minutes left, so I'm going to go through the questions that are in the messaging channel, and then the remaining ones here will get answered and linked in the description of the video when it goes out.
Are any dapps committed to building on the Radix network?
Yes, we've got a couple, and they're on the testnet now.
Why is the smallest division of an XRD 0.0001?
That was just how the token was configured in the testnet; we can actually go to much higher precision than that, and we will in mainnet.
56:29 Will Radix support asset tokenization — for example of tangible or financial assets such as property, shares, bonds etc. — and if so, programmable compliance and security tokens similar to the ERC-1400 standard?
The short answer is yes; the long answer is: in several stages. Simple asset tokenization of things like fiat can be done with the simple fungible tokens we're going to be releasing at technical go-live. The more complex stuff — the composable system of constraints Dan was talking about, where you can specify whatever constraints you want on the token — comes later, but it's built into the fundamentals of how the atom model works. The framework, the structure, the scaffolding is already there at technical go-live, and then we'll be adding in the ability to do those things, so that essentially anyone can compose any system of constraints they need for their token. We think that's a much better way of doing things than having a single specific framework in place, because the regulatory environment differs so much between countries that composability is actually critical for creating real-world use cases for real-world problems, where every single problem has its own subset of things that have to be solved.
58:10 Is the maximum number of particles in an atom related to the maximum kilobyte limit per atom?
So the maximum number of particles you can put into an atom is actually governed by the UDP packet size limit. There's nothing stopping you from making an atom larger than 64 KB, but its propagation across the network becomes extremely inefficient if you do, so the cost of doing so will be quite high — you'd want a really good reason for making one that big. Most particles tend to run at 150 to 200 bytes, depending on what you're doing and how well populated their fields are, so you can fit quite a lot of particles into a single atom. If you need more, you can go bigger, but then you're not going to get finality in five seconds — it's going to take maybe a minute, because it has to go over TCP, which is a much slower process.
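A quick back-of-the-envelope check of the sizes mentioned above: with a ~64 KB UDP-friendly ceiling per atom and particles of roughly 150–200 bytes, a single atom can carry a few hundred particles. The overhead figure below is a hypothetical placeholder, not a documented number.

```python
# Rough capacity estimate from the figures in the answer above.
# ATOM_OVERHEAD is an assumed allowance for headers/temporal proof —
# not an official number.

UDP_LIMIT_BYTES = 64 * 1024   # ~64 KB practical per-atom ceiling
ATOM_OVERHEAD = 1024          # hypothetical container overhead

def max_particles(particle_size: int) -> int:
    """How many particles of a given size fit under the ceiling."""
    return (UDP_LIMIT_BYTES - ATOM_OVERHEAD) // particle_size

print(max_particles(150))  # 430
print(max_particles(200))  # 322
```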
58:00 If I control more than 50 percent of the nodes in a shard, what prevents me from double spending?
If you had 50 percent of the nodes in a shard, you still couldn't control the shard. This is where mass comes in — which we've talked about quite a lot, but which you won't find in any paper yet, because it's been heavily worked on over the past six months to make it extremely efficient and as secure as possible. If you want to overcome a shard, you have to control 50 percent or more of all the mass of the nodes covering that shard. So if Piers is a big node covering a lot of shard space and his mass is a million, while everybody else in that shard is small, Piers's massive million actually contributes to the security of that shard. If everybody else only had one mass each, and there were 99 of them, and then Piers waded in with a million, then to control that shard you'd need over 500 thousand mass — just because of Piers's presence. So it's not about the number of nodes; it's about the collective mass backing that shard across the nodes' accumulated shard coverage.
I suppose the follow-on question would be: what happens if I'm malicious and just accumulate mass? The consensus uses a quadratic mass function, so as mass accumulates the curve flattens off quite quickly; take the midpoint of that curve and everyone has around about the same amount of mass — it's a nice steady incline. The main attack vector in the network is the mass, and that will become completely clear — along with the security protocols and parameters — when we release the security white paper next year, but we have a high degree of confidence that it gives a much better degree of security than anything else out there.
It's probably worthwhile adding that there are two interesting components to how mass works. The first is that the later you enter the network as a malicious actor, the more work you've got to do to catch up with everybody else. To achieve that fifty percent three years in, there will already be a lot of mass in the network, and you need 50% of it — so it's going to take you a long time, and the quicker you want to get there, the larger the portion of all the network's transactions you've got to capture and process to earn the mass. It becomes a very difficult thing, because you can't just go and buy mass — there's a time component to it. The second thing is that once the network gets past a sufficient size, you actually need more mass than is available in the network at any moment. Everybody has collected mass over a long time, but at any moment the cumulative mass of all the available funds to spend — all the unspent funds, a bit like Bitcoin's unspent outputs — is an order of magnitude less than all the mass that's been acquired in the network so far. So even as an attacker, past a certain point you simply can't buy enough mass.
Okay — for the questions we didn't get answered live, we will take those and put the answers into the transcript. We're hopefully releasing this before the end of the week, depending on time, so maybe next week. Thank you very much everyone for your time — it's been great having your questions, really insightful, thoughtful questions. Thank you so much, and see you guys next year — well, you may see some of us sooner, with a node! Exactly. Thank you, cheers guys.
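The shard-capture arithmetic from the mass discussion above can be sketched in a few lines: capturing a shard means exceeding half of the cumulative mass backing it, so one large honest node raises the bar for everyone. Illustrative only — the real consensus weighting (the quadratic mass function) is more involved.

```python
# Sketch of the "mass, not node count" argument: an attacker must
# exceed 50% of the cumulative mass backing a shard. Numbers mirror
# the example given in the answer above.

def mass_needed_to_capture(node_masses):
    """Mass an attacker must exceed: half the shard's total mass."""
    return sum(node_masses) / 2

# 99 small nodes with 1 mass each, plus one big node with 1,000,000:
shard = [1] * 99 + [1_000_000]
print(mass_needed_to_capture(shard))  # 500049.5
# Despite being 1 node out of 100, the big honest node pushes the
# capture threshold past 500 thousand mass.
```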
How many companies are working on creating on/off ramps for the DEX?
Currently undisclosed, but we are in the early stages of working with a few regulated entities already.
What are the risks in Radix in terms of smart contracts compared to ETH? E.g. some bad coding in an ETH smart contract can make nasty things happen.
The main way you build on the Radix ledger system is not via smart contracts, but instead using our Atom Model. The Atom Model is a flexible system of constraints, a little bit like lego building blocks; you compose together the blocks to create the functionality you want on ledger.
This starts very simple, but will expand to a highly composable system that cannot be composed in incorrect ways (e.g. you cannot put a fungible and non-fungible token quark into the same particle).
All of this serves to make bad code on ledger much less likely, and significantly increases the security for the user.
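The "cannot be composed in incorrect ways" guarantee can be sketched as a validation step that rejects invalid quark combinations, such as the fungible/non-fungible example given above. The function and rule table are invented for illustration; this is not the real Atom Model API.

```python
# Sketch of a composition constraint like the one described: the model
# refuses invalid quark combinations at composition time, so bad
# combinations never reach the ledger. Names are hypothetical.

INCOMPATIBLE = {frozenset({"fungible", "non_fungible"})}

def compose_particle(quark_kinds):
    """Return the particle's set of quark kinds, or raise if the
    combination violates a composition constraint."""
    kinds = set(quark_kinds)
    for bad in INCOMPATIBLE:
        if bad <= kinds:  # all members of a forbidden combo present
            raise ValueError(f"cannot combine {sorted(bad)} in one particle")
    return kinds

print(compose_particle(["fungible", "transfer"]))  # fine
try:
    compose_particle(["fungible", "non_fungible"])
except ValueError as e:
    print(e)
```

Catching the error at composition time, rather than at execution, is what shifts bug-hunting from auditing arbitrary contract code to checking a finite rule set.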
Is there a minimum number of features the initial (main net) Radix client release should have? Is it possible DEX won't be included?
We will be releasing a full roadmap update at the start of next year, which will include the go-live feature set. The DEX will go live after launch, rather than day 1, to reduce the number of variables being tested at go-live.
Any step by step tutorials on how to build this: https://raddit-icdeskaitv.now.sh/?
We are working on it.
When would be the next time we can meet the team in person? Would love to chat over a beer.
When this bad boy in launched! We are also planning to start doing London Meetups, so if you come to those, that works too :-D
What is bandwidth on test net with 10 000tx/s?
Good question, we did not measure the bandwidth on the network for the last batch of throughput testing, we will do so in the next batch and post the results.
What's a typical working day for Dan? How's he coping balancing work and private life?
It’s definitely been tough, and continues to be tough. When building something of this complexity and magnitude, it often requires an extraordinary effort of concentration and focused determination, which means work/life balance suffers. Dan’s work week currently consists of:
What is the new order for whitepapers we can expect to be released? Eco/Consensus V2/Security/Dex.
Economics on the 31st of January, then the Security White Paper (consensus is one and the same). DEX won’t be a paper any more, just a functional specification, that will be closer to release of the DEX itself.
Does Radix agree with voices (Gupta, Burke) uttering crypto should compromise more with the current institutions? What place does Radix see for crypto?
Not sure if the questioner meant compromise. We do believe that decentralisation is not the right answer for all of the business logic of all businesses. Many of the largest businesses in the world rely on a transition from digital to physical (e.g. an online order becomes shipment of a physical good); it is at these interfaces that we expect many of the crypto use cases will exist, but without being 100% distributed or 100% decentralised. This means following best practice from the existing corporate world in the places where game theory, on-ledger logic and crypto-economics would be unnecessarily complex.
However, we do not see this as compromise; almost all real business use cases I ever see are messy, complex and nuanced, and very seldom can be broken down to a simple subset of rules.
People argue that cryptos are centralized - centralized mining, centralized leadership, etc. In what way will people say Radix is centralized 2 years from now?
Governance. We have a mandate to make the long term governance of the platform decentralised, but we think that governance is probably one of the biggest unsolved problems for crypto platforms. Either you design it well (give stakeholders a voice) or you don’t design it and governance happens through conflict (hard forks/miner control/vested interests).
To this end, the governance structure will start off being pretty centralised on the foundation, with the intention to make it as decentralised as possible, as quickly as possible — but we will not have got too far down that road in 2 years.
I remember one company started to use radix, can't remember the name but when I google it, looks like no longer exists, any idea about what happened to it?
Surematics? If it was Surematics, that was Piers’ company. It was acquired by Radix and that is how Piers came to be CEO.
I have 2 Raspberry Pis that I'll put to work to gain mass and won't turn off, and I also have a laptop that will run a node whenever it's on. They're connected to the internet through a router. The question is: can I combine the laptop + Pis and run them over one IP and one wallet, as a combined node?
In theory it should be possible to create a "hidden" network on a LAN with a special "gateway" node that is connected to both the Radix mainnet and the hidden LAN nodes. All gossip and sync traffic will have to go through the gateway node in that case. Note that this is only theoretical; we have not yet tested this topology. When the node-running client is out, try it and let us know how you do!
What does the process of a node reducing its shard coverage look like?
If the throughput of the shards your node is covering gets too much, then the node reduces the number of shards that it is servicing until it gets to a level it can deal with. As the position of any given node in the shard space is deterministically random (your nodes private key + a POW hashed together give your starting point), the nodes will be fairly evenly distributed across the shard space.
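The two ideas in this answer — a deterministically random anchor point derived from the node's key plus a proof-of-work, and shrinking coverage until the load fits — can be sketched as below. The derivation (SHA-256 over key + nonce) and the halving policy are illustrative assumptions, not the actual Radix scheme.

```python
# Sketch of (1) a deterministic, effectively random shard anchor from
# key material + a PoW nonce, and (2) shrinking shard coverage under
# load. Hash choice, shard-space size and halving rule are all
# illustrative assumptions.

import hashlib

SHARD_SPACE = 2**16  # hypothetical shard-space size

def shard_anchor(public_key: bytes, pow_nonce: bytes) -> int:
    """Deterministic but effectively random starting shard."""
    digest = hashlib.sha256(public_key + pow_nonce).digest()
    return int.from_bytes(digest[:8], "big") % SHARD_SPACE

def reduce_coverage(coverage: int, throughput: float, capacity: float) -> int:
    """Halve the number of shards served until the load fits capacity
    (assuming load is spread roughly evenly across covered shards)."""
    while throughput > capacity and coverage > 1:
        coverage //= 2
        throughput /= 2
    return coverage

anchor = shard_anchor(b"node-pubkey", b"nonce")
print(0 <= anchor < SHARD_SPACE)                                # True
print(reduce_coverage(1024, throughput=8000.0, capacity=1500.0))  # 128
```

Because the anchor is a function of the key and the PoW, a node cannot freely choose its position, which is what keeps nodes evenly spread across the shard space.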
How will Radix raise awareness of the project, will it be through marketing, bounties etc?
Good question - we have a few things up our sleeve for next year, but we wouldn’t want to ruin the surprise :-D
Any plans to integrate radix with merchants for ease of use?
For sure, both digital and physical merchants. This will be a post launch activity however.
Will Radix be available on other Exchanges or DEX’s in the Future or only on Radix DEX?
As Radix is a public, permissionless network, anyone can list the Rad, or any token created on the Radix platform. We do not have any exchanges in the works at the moment, but we are sure there will be some that will list the token.
How many people in the world do you think fully understands Radix at the moment? (I've been following Radix for 2 years and still struggle)
Not a huge number. Maybe a handful actually have the entire picture of the technology and its different components. Our knowledge base is our start in trying to make the technology more accessible to outsiders; if you think there is anything else we should be doing, we are definitely all ears.
If there is no ICO, the Radix coin is stable, what are the incentives to buy it? What is the use of Radix coin in the network?
To stay at approximately the same price level, when the price of Rads spikes, the system prints more Rads and re-distributes these. The balance holders are the main beneficiary of this activity. The Rad coin has the same function as that of Bitcoin, both as a store of value (the low volatility makes it a reliable long term store); and to pay for the usage of the network.
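The stabilisation mechanism described — mint new Rads when the price spikes and distribute them to balance holders — can be sketched with a simple elastic-supply rule. The inverse price-supply assumption and parameters below are mine for illustration; the real mechanism is defined in the Economics paper.

```python
# Sketch of the low-volatility mechanism described above: when price
# rises above the target, new supply is minted (and distributed to
# balance holders), pushing price back toward the peg. Assumes price
# moves inversely with supply at constant demand — an illustrative
# simplification, not the published Radix economics.

def mint_on_spike(price: float, target: float, supply: float) -> float:
    """Return the post-adjustment supply after an upward price spike."""
    if price <= target:
        return supply                      # this simple sketch only mints
    return supply * (price / target)       # mint enough to restore the peg

print(mint_on_spike(price=1.5, target=1.0, supply=1_000_000))  # 1500000.0
```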
Why was the first atomic model shitty and what changed?
The new model is strictly hierarchical: an atom contains particles, and those particles contain quarks. This gives a clear hierarchy for the composition of atoms and for execution, and it makes the elements extensible. The previous Atom Model was less strictly hierarchical and led to potential issues when composing elements together.
How is the Economics paper going? Will we have PoW for data transfer and a fee for value transfer?
Well! We are pretty happy with it, and looking forwards to publishing it at the end of Jan. Also, yes; that is currently the plan: PoW for data transfer and fee for value transfer.
What will market expansion look like after the launch?
Difficult to say - there are a few very interesting products and types of products that can be built on Radix, even at the very start, so we are not sure which of these categories will be considered the “killer application” for Radix.
TPB have unlimited fiat to pump and dump. How confident are you that you can withstand attacks from TPB?
Not sure what TPB is, but pump and dump is helpful to the Economics. Will be more clear when the Economics is released.
Is it backed by anything (e.g., electricity/mining?)? Can you explain more on the economics and incentives in the Radix ecosystem?
Good question. This will be covered in detail in the Economics paper that will be out at the end of Jan.
Are you guys still launching closed source, and can anybody run a node? Or is there a limit?
All testnet node clients will be closed source. Eventually, we will open source the core during the main net launch. Yes, anybody can run the node, and there is no limit to participate.