Charles: Then a slightly different topic but interconnected is the overall roadmap and where will the technology of Ethereum classic go? What are we planning on doing?
I really believe in this philosophy called Shuhari, which comes from the martial arts and roughly means learning, mastering and then innovating. Before anybody earns the right to say: “Hey, let’s throw away this existing system, which has a tremendous amount of critical thinking and trial and error built into it, and build something new”, they have to prove they know what they’re talking about.
Being a founder of it doesn’t make me credible, because I left the project pretty early on relative to its history. From IOHK’s side, before making any proposals about where we’d like to take the platform beyond conversation pieces like “these are the things we could do”, we have to demonstrate a mastery over the protocol that is equaled only by people on the Ethereum Foundation side.
This is kind of the first point of the Grothendieck project. In other words, the proof is in the pudding. We’ve delivered the client. We understand Ethereum at a very elemental and deep level. We have shown our transparency through weekly stand-ups, a public roadmap and open code. Besides, when we go through the security audit and so forth, all of that will also be public, so that people understand there’s nothing up our sleeve. They see the intent.
After that’s done, there will be some proposals coming from our side around late 2017 and 2018. The question is what we can do with Ethereum Classic to differentiate it from Ethereum and make it a platform in its own right, one that perhaps has different use cases or at the very least a different view of computation.
We’ve already started the research and engagement. We have already begun collecting a lot of information just from going out and talking to great people.
I was recently in Shanghai at the Shanghai winter school, where I had a chance to talk to a lot of different cryptographers about their research and Ethereum, from state channels to new consensus concepts. We’ve had a lot of discussions about what we can do to speed up validation or shard. State-space blockchain partitioning has been a really hot topic that we’ve been going over. There are a lot of novel concepts.
What’s really cool is that while the Ethereum Foundation has been focusing on proof of stake (in fact, we have 10 people working on that every day, since it is a cool and intoxicating field), it turns out that there’s a lot of innovation in the proof of work world.
For example, there are these proofs of sublinear complexity that Aggelos has come up with, which you really need. It seems a lot easier to implement a secure zero-knowledge checkpointing system with a proof of work system than it would be with a proof of stake system.
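To make the sublinear idea concrete, here is a toy Python sketch of the “superblock” intuition behind such proofs (not the actual construction, and the parameters here are invented for illustration): some lucky blocks beat the mining target by many extra halvings, those blocks are exponentially rare, so keeping only the luckiest ones yields a commitment to the whole chain whose size grows logarithmically rather than linearly.

```python
import random

def block_level(block_hash: int, target: int) -> int:
    """Level mu of a block: how many extra halvings of the target its hash beats.
    A hash below target / 2^mu occurs with probability ~2^-mu, so high levels are rare."""
    level = 0
    while level < 240 and block_hash <= target >> (level + 1):
        level += 1
    return level

random.seed(0)
TARGET = 2 ** 240

# Simulate mining a 10,000-block chain: each valid block hash is uniform below TARGET.
chain = [random.randrange(TARGET) for _ in range(10_000)]

# A sublinear "proof" keeps only blocks at the smallest level mu where at most
# k superblocks remain; the expected count roughly halves at each level.
k = 16
mu = 0
while sum(1 for h in chain if block_level(h, TARGET) >= mu) > k:
    mu += 1
proof = [h for h in chain if block_level(h, TARGET) >= mu]
print(f"kept {len(proof)} of {len(chain)} blocks at level {mu}")
```

For a 10,000-block chain this keeps on the order of a dozen blocks, which is why a light client checking a checkpoint can get away with far less data than the full header chain.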
We have a better notion of how to do sidechains with proof of work than with proof of stake, although I think we can eventually reach parity there. I think there needs to be better PR for proof of work, and there certainly needs to be some acknowledgment of its downsides.
McCauley would be the first to tell people that 5 mining pools control the hashing power for Bitcoin. That’s not so good when 5 people control the state. Therefore, we do need to have some conversations about how we would decentralize mining or make it more efficient, but it does seem to be a viable paradigm, and I do think it is something that is compatible with smart contracts in general.
However, I think there needs to be a broader conversation. For example, there’s been some great work out of Cornell on trusted hardware, in particular using Intel SGX to do all kinds of things, like smart contracts that are private. There’s a concept called sealed-glass proofs, where you can prove you know something and sell it to a person. It’s a really novel notion. They also have, basically, something like Augur enabled by trusted hardware; it’s called Town Crier.
We’re just at the beginning of using these types of frameworks, but they’re already widely proliferated, because every Intel chip has them now. ARM has an equivalent notion with its TrustZone secure hardware.
When you start talking about every cell phone and every computer having these special modules, you have a very wide distribution of hardware. This can act as a second layer, removing some trust or speeding up computation.
I do think there needs to be a more meaningful conversation about what trusted hardware’s role in our technology is and what we can do with it. However, I am pretty excited about the notion of sticking with mining and perhaps doing something like a two-stage protocol.
I am also really excited about the notion of innovating on the funding side. I think we would be very stupid as a community if we passed on the opportunity of trying for a treasury. That’s like the ultimate repudiation of the whole notion of centralized funding through a foundation.
There is one model where you have an ICO and raise all this money. It’s a good model. On the other hand, the problem with that model, and no one has fully addressed this, is the centralization of that money. You have only one decider. If that person is good, honest, fair and wise, maybe they make great decisions. If they turn out to be like Donald Trump or something like that, your whole ecosystem may go a little crazy.
Therefore, it’s important that the capital is decentralized, that there’s some way for more than one actor to have access to it at some point. The tech is there for this; it just needs to be properly packaged, validated and assembled. As a company, one of our priorities is to do very rigorous research on this, and we’re going to be presenting that. Our hope is that we can get the Ethereum Classic community to get behind it.
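As a rough illustration of what “more than one decider” could mean mechanically, here is a minimal Python sketch of one possible treasury round: stakeholders cast stake-weighted approval or disapproval votes on proposals, and net-approved proposals are funded best-score-first until the budget runs out. This is an invented toy design for illustration, not the model IOHK is researching.

```python
from collections import defaultdict

def treasury_round(budget, proposals, ballots):
    """proposals: {name: requested_amount};
    ballots: list of (stake, {name: +1 or -1}) stake-weighted approval votes."""
    score = defaultdict(int)
    for stake, votes in ballots:
        for name, vote in votes.items():
            score[name] += stake * vote
    funded = {}
    # Greedily fund net-approved proposals, highest score first, while budget lasts.
    for name in sorted(score, key=score.get, reverse=True):
        if score[name] > 0 and proposals[name] <= budget:
            funded[name] = proposals[name]
            budget -= proposals[name]
    return funded

# Hypothetical proposals and three stakeholders' ballots.
props = {"client-dev": 500, "marketing": 300, "audit": 400}
ballots = [(60, {"client-dev": 1, "marketing": -1}),
           (40, {"audit": 1, "marketing": 1}),
           (25, {"marketing": -1, "audit": 1})]
print(treasury_round(1000, props, ballots))
# "audit" (score 65) and "client-dev" (score 60) are funded;
# "marketing" has a negative net score and gets nothing.
```

The point of even this toy version is that no single key holder decides where the money goes: disbursement is a pure function of the ballots and the budget.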
The last point is about formal verification. This is one of the most boutique and exotic topics in development, and it’s not commonly done by most developers. If you go to a full-stack web dev and ask where the proof for this is, they’ll probably look at you weirdly, because it’s just not a common notion. However, it is current in academia, especially in PLT when you’re talking about functional programming.
It turns out that there are some 20 years of theory that have really advanced things, made the tools a lot better and sped things up a lot, but that theory hasn’t worked its way into industry. One of the things IOHK is extremely focused on is how we are going to build a pipeline where we can not only write good functional code but also make that work well.
It’s amazing that we found a way to build a cryptocurrency in Haskell. That was not easy. We’re almost there, but that’s not good enough, because Haskell is just Haskell. We want to do formal verification as well, because it’s like the Spinal Tap of cryptocurrencies: we have to take it up to 11.
Therefore, the formal verification component is a really interesting discussion, but you have to have an ontology of what’s meaningful: what you should verify, what you should talk about and what guarantees you should give.
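To make that ontology concrete, here is a toy Python sketch (names and the ledger model are invented for illustration) of the kind of properties one would single out for verification: a transfer on a ledger should preserve total supply and never drive a balance negative. In a real pipeline these would be machine-checked proofs over all inputs in a tool like Coq or Isabelle, not runnable spot checks.

```python
def transfer(balances, src, dst, amount):
    """Apply a transfer on a toy ledger, rejecting invalid ones."""
    if amount < 0 or balances.get(src, 0) < amount:
        raise ValueError("invalid transfer")
    new = dict(balances)
    new[src] = new.get(src, 0) - amount
    new[dst] = new.get(dst, 0) + amount
    return new

def preserves_supply(balances, src, dst, amount):
    # Property 1: a valid transfer never creates or destroys coins.
    return sum(balances.values()) == sum(transfer(balances, src, dst, amount).values())

def no_negative_balances(balances, src, dst, amount):
    # Property 2: a valid transfer never drives any account below zero.
    return all(v >= 0 for v in transfer(balances, src, dst, amount).values())

ledger = {"alice": 10, "bob": 5}
print(transfer(ledger, "alice", "bob", 4))  # {'alice': 6, 'bob': 9}
```

Deciding that these two properties (rather than, say, gas accounting or timing) are what matters for a given component is exactly the ontology question: the proof effort is expensive, so you verify the invariants whose violation would be catastrophic.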
Obviously, there’s the crypto, and a lot of discussion about the network stack and consensus. These are three low-hanging fruits. There’s already been some great work, like Verdi in Coq and so forth.
That is why we’re retaining a really, really good, super-specialized firm to come in and help IOHK learn how to do this well.
Our hope is that we can take some of those techniques and methods and bring them into the Grothendieck team in June. First, we’ll start putting those methods and techniques into our development practices, so that the client we produce will be the first formally verified Ethereum Classic client, or Ethereum client in general. Second, we can take those techniques and have them work their way into the smart contract stack, which I think is much more meaningful.
The good news in that regard is that we’re not alone. There have actually already been some great papers. For example, there was an interesting paper out of Chalmers University on how to use dependent typing in smart contracts. It was very well written. Besides, Yoichi Hirai has been working with the Ethereum Foundation on verifying the EVM byte code.
I think there’s a lot of great progress being made, and some good thought is starting to be put into it. It’s just kind of scattered. Our hope is that over the next 6 months we can unify all of it, figure out how to make it work well, get it into our development pipeline and then eventually build some tools so that smart contract developers can actually do this.
I don’t think all smart contracts require formal verification. Honestly, it’s probably going to be library-level work: the stuff that Zeppelin is doing, or if you’re going to do a DAO or something that will be the standard in the community. But your everyday smart contracts will probably continue to use Solidity, which is a good language with good tooling.
Basically, that’s what’s coming: analysis of proof of work and how we can make it more efficient, transparent and fair; the idea of overlaying trusted hardware onto the network in some capacity, then trying to understand what that does to the network and whether it increases democracy in the system or not. Plus, we will have a really deep analysis of decentralized funding, and then finally we’ll bring formal verification both into the client itself and into the smart contracts that the client runs.
The research is ongoing, so we expect to have the outputs in the second half of this year. Our hope is to take that research from white paper to production sometime in Q1 or Q2 of 2018.