On the marketing side of things, Karolina has joined the team as our copywriter.
Very productive month this month. Lucca has developed an interesting tool that automatically generates logical queries, which we use for testing query containment in TML. He also embarked on a small project comparing three theorem provers (namely Z3, Vampire and CVC4), finding that Z3 outclassed the other two and also compares favourably with TML. We continue to work on performance improvements so that TML becomes more comparable with the other logical solvers out there. Murisi has worked further on documenting the safe subset of Datalog that TML supports, implementing additional safety checking and fixing some unsafe code that was generated automatically by the interpreter. He continues his improvements on the binary transformation and on supporting safe negation. He extended the CQC algorithm with support for arithmetic terms and made it smarter, so we can factor out additional terms of rules. He also fixed some performance bugs in TML by improving the hashing of BDDs, which in turn improved the efficiency of the caches, especially for large BDDs. Juan refactored the arithmetic features and continues to work on the second-order logic problem. On the blockchain side, he implemented micropayments over the Binance Smart Chain, which would allow Agoras Live to process payments with reduced transaction fees compared to the Ethereum blockchain. Andrei continues with Agoras Live, redesigning the front-page layout and functionality. He improved the mechanism for public video-conferencing sessions, and he also integrated crypto micropayments and a crypto wallet. Umar continued work on TML parsing, fixing many issues in the TML.g regression test, and added some type information to the grammar. He also improved the type checking and bit-size calculations, improved macros, and fixed some bugs with string handling. Prof. Lukas Saul has familiarised himself with the project's internals and has started work on the Agoras Mainnet.
I'm very happy to be a part of this amazing team. My degree is in translation and logistics, and for the past 8 years I have been working in advertising, doing everything from copywriting to user-experience writing. My job now will be to communicate the amazing possibilities that Tau brings to the world in the layman's terms our community has been asking for.
This month I created a tool which automatically generates queries, so that we have a good supply of queries available for checking the query containment problem. The tool can generate all possible queries if we give it a small universe size and a few head variables, but exhaustive generation takes very long because there are so many different queries. As the universe size gets bigger, or if you want more head variables, random generation becomes necessary, so I also implemented a random generation procedure. The generated queries are translatable into two formats: our TML syntax and a common input format for SMT solvers. I compared Z3, Vampire and CVC4 and found that all three are very capable theorem provers. For our particular problem domain it turned out that Z3 was the best-performing tool, and it comes with an MIT license, so we will be able to use it freely in the Tau project. My next task is to integrate it into the query containment check we already have.
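The generate-then-translate pipeline can be sketched roughly as follows. This is a hypothetical Python sketch, not Lucca's actual tool: the predicate names, binary arity, Datalog-flavoured rendering (real TML syntax differs in detail) and the SMT-LIB shape are all illustrative assumptions.

```python
import random

def random_query(num_atoms=3, num_vars=4, preds=("p", "q"), seed=None):
    """Generate a random conjunctive query: a head (tuple of variables)
    and a body (list of binary atoms)."""
    rng = random.Random(seed)
    vs = [f"x{i}" for i in range(num_vars)]
    body = [(rng.choice(preds), (rng.choice(vs), rng.choice(vs)))
            for _ in range(num_atoms)]
    used = sorted({v for _, args in body for v in args})
    return tuple(used[:2]), body   # keep up to two head variables

def to_datalog(head, body):
    """Render in a Datalog-flavoured syntax (illustrative only)."""
    atoms = ", ".join(f"{p}(?{a} ?{b})" for p, (a, b) in body)
    return f"ans({' '.join('?' + v for v in head)}) :- {atoms}."

def to_smtlib(head, body):
    """Render as a standalone SMT-LIB fragment over an uninterpreted sort U,
    with non-head variables existentially quantified."""
    preds = sorted({p for p, _ in body})
    allv = sorted({v for _, args in body for v in args})
    bound = [v for v in allv if v not in head]
    out = ["(declare-sort U 0)"]
    out += [f"(declare-fun {p} (U U) Bool)" for p in preds]
    out += [f"(declare-const {v} U)" for v in head]
    conj = "(and " + " ".join(f"({p} {a} {b})" for p, (a, b) in body) + ")"
    if bound:
        conj = f"(exists ({' '.join(f'({v} U)' for v in bound)}) {conj})"
    out.append(f"(assert {conj})")
    return "\n".join(out)

head, body = random_query(seed=42)
print(to_datalog(head, body))
print(to_smtlib(head, body))
```

Containment of two such queries can then be checked by asserting one and the negation of the other and asking the solver for unsatisfiability.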
This month I worked on safety checking, making some changes to the algorithm and improving how it functions in some corner cases. I documented the safe subset supported by TML so users will know from the outset what kind of code will be accepted and what will be rejected; I gave a simple inductive definition in the documentation of the formulas that will and won't be supported by TML. To achieve safe TML, validating user code is not the only thing necessary: you also have to fix the TML interpreter, because behind the scenes, when the user types in a certain piece of TML code, the interpreter generates additional code in order to support certain user features. I stopped TML from generating unsafe code, and I added regression tests for safety checking to ensure that future modifications to TML don't restrict the range of safe programs it currently supports. I also focused on bugs in the interpreter: by improving the interpreter's hash function I was able to get a large speed improvement of 3-4x. For the solver, I found that keeping BDDs small was important for achieving faster performance; one way I achieved this was to quantify formulas earlier on. I also improved the performance of the binary transformation by reducing the number of variables in the rules generated during the transformation. Finally, I worked on extending the conjunctive query containment algorithm to support arithmetic terms. If you have an arithmetic term like x + y = z, you can encode it as if it were a simple relation, and then the CQC engine is able to recognise containment of queries that contain arithmetic terms. This change is important because we can now optimise rules generated for parsers that contain a lot of arithmetic.
This month I have improved the bit-size universe transformations so they are done in a simpler and more modular way. I've looked into TML.g parsing issues and fixed a few problems related to TML parsing itself. I also added type information to the grammar, so programs that carry type information can now be parsed as well. Previously, the type checker used a linear scan over a vector to look up a type; I replaced that with a map data structure, so looking up a given type is now fast and easy. With these changes I made improvements to the type checking and also the bit-size calculation. Some bugs were reported in the macro feature, which I fixed, and I added a test for the bit-size-to-universe transformation. Basically all the regression tests in the intro folder are now runnable, and I added more tests as well. Another task prioritised by Ohad was to figure out a better scheme for type tagging: currently each term carries two low-order bits that indicate whether it is a NUM, SYM or CHR, and the suggestion was to use the information from the newly proposed type system instead. With those changes we are now able to run the bit-size-to-universe program without the low-order type bits included within the BDD, saving those two bits per term. The other task is, once we have our facts in the bit-size universe, how to recover them and decode them back into the original raw program. This task is now complete; I still need to go back and refactor the design, but in terms of algorithms things are accomplished. One of the challenges is how we handle strings and symbols in the bit-size universe. I've built a technique which treats strings and symbols uniformly as symbols: instead of encoding the entire string content, we just encode the value obtained from the dictionary data structure.
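The dictionary trick for strings and symbols can be sketched like this. It is a hypothetical Python sketch with invented names; TML's actual dictionary lives in the C++ interpreter.

```python
class SymbolDict:
    """Intern strings and symbols to small integer ids, so the BDD
    encoding stores only the id rather than the string's contents;
    the same table maps ids back when decoding facts to the raw
    program."""

    def __init__(self):
        self._ids = {}    # symbol -> id: O(1) map lookup, like the
                          # map that replaced the linear vector scan
        self._syms = []   # id -> symbol, used when decoding

    def get(self, sym):
        """Return the id for sym, assigning a fresh one if unseen."""
        if sym not in self._ids:
            self._ids[sym] = len(self._syms)
            self._syms.append(sym)
        return self._ids[sym]

    def sym(self, i):
        """Recover the original symbol from its id."""
        return self._syms[i]

d = SymbolDict()
print([d.get(s) for s in ("alpha", "beta", "alpha")])  # [0, 1, 0]
print(d.sym(1))                                        # beta
```

Since ids are dense small integers, the number of bits needed per symbol in the encoding grows with the dictionary size rather than with the longest string.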
I've been interested in two questions. One is how we deal with the challenge of expressiveness on the side of legal reasoning: explaining and formulating law that can refer to itself. We've had long debates among the team on the Tau project. It's interesting, we have a mixture of professors and researchers on board, and as soon as they bring in their different viewpoints, mixed with those of the team members, we have fruitful discussions. We've made good progress in structuring the challenge nevertheless: there is evidence that we have moved significantly further in structuring the problem we are addressing, so we can get closer to a solution, but it's still an ongoing debate. The second question is the implementation of a prover for monadic second-order logic. This would give us a useful tool to automate reasoning in the Tau project. We've had fruitful discussions on the pros and cons of solutions towards second-order logic and existing theorem-proving technology. On that side too we have gained additional insights and are on the way to preparing further solutions in those directions.
I have been working on three things. First, the payment system for Agoras Live through Binance Smart Chain, which I have been testing; it is ready to go into the Agoras Live platform. I am also working on the arithmetic feature, where arithmetic terms or entire expressions can be shared between multiple rules without re-generating or keeping multiple instances of the same arithmetic expression. In addition, I am working on supporting signed integers plus decimal numbers. As a first step these will be fixed-point decimal numbers, but we don't rule out adding floating-point support to our arithmetic/numeric system. I'm also looking forward to implementing support for bigger arithmetic expressions, for example summations, so more complex problems can be solved with TML. I continue to work with the professors and researchers on implementing a number of solutions for second-order logic, and I also fixed several bugs identified by the team in the first-order logic solver.
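The fixed-point representation mentioned above can be sketched as scaled integers. This is an illustrative Python sketch assuming four fractional decimal digits; TML's actual encoding and precision may differ.

```python
SCALE = 10_000  # fixed point with 4 fractional decimal digits (assumed)

def to_fixed(s: str) -> int:
    """Parse a signed decimal literal into a scaled integer."""
    sign = -1 if s.startswith("-") else 1
    whole, _, frac = s.lstrip("+-").partition(".")
    frac = (frac + "0000")[:4]          # pad/truncate to 4 digits
    return sign * (int(whole or "0") * SCALE + int(frac))

def fixed_add(a: int, b: int) -> int:
    return a + b                        # same scale: plain integer add

def fixed_mul(a: int, b: int) -> int:
    return (a * b) // SCALE             # rescale after multiplying

def show(x: int) -> str:
    return f"{x / SCALE:.4f}"

a, b = to_fixed("1.25"), to_fixed("-0.5")
print(show(fixed_add(a, b)))  # 0.7500
print(show(fixed_mul(a, b)))  # -0.6250
```

Because every value is just an integer, the existing integer arithmetic machinery (and its relational encoding) can carry decimals with no new solver machinery, which is the appeal of fixed point over floating point here.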
This month we have been testing the Agoras Live platform. We've added some improvements to the UI and made other changes. Here is a video overview: https://youtu.be/efdSB6xKXlE?t=1742
This month I finished an initial version of the reworked builtins, then continued to work on loading dynamic symbols and mapping TML types to C++ types so that shared dynamic libraries can be used from TML. I've also started to do more programming in TML, trying to create more examples for others to learn from; by doing so I've identified some bugs that are now fixed. I'm revising the count builtin for the case where there is more than one variable in the counted term.
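The general technique of loading a shared dynamic library and declaring type mappings for its symbols looks roughly like this in Python's `ctypes`. This is an analogy only; TML's builtin mechanism is internal to its C++ interpreter, and the fallback soname here is an assumption about a typical Linux system.

```python
import ctypes
import ctypes.util

# Locate and load the C math library; find_library can fail on some
# systems, so fall back to a common Linux soname.
libm = ctypes.CDLL(ctypes.util.find_library("m") or "libm.so.6")

# Declare the C signature so the caller's types are mapped correctly --
# without this, sqrt's double argument and result would be mangled.
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

print(libm.sqrt(2.0))  # 1.4142135623730951
```

The two steps, resolving a symbol by name and attaching a type signature to it, are exactly what a host language has to do before a logic program can safely call out into native code.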
This month I've been working on the pitch deck, which is pretty much in its final stages, so pretty soon we'll be able to record it as a YouTube video for everybody to see and get a better overview. I've been working on a tokenomics sheet for the Agoras token. We are looking to present our project to the Fraunhofer Institute [https://www.fraunhofer.de/en.html] in Germany and to T-Systems, to form partnerships and obtain funding. I have been working on the executive summary. We have been working on finding a marketing partner in the Asian region. I've been doing some outreach and follow-up with some YouTubers, not just crypto YouTubers but also AI-focused ones, and I've been looking at some crypto-focused TikTokers.
One of the promising YouTube candidates is Blockchain Brad [https://www.youtube.com/channel/UCbkjUYiPN8P48r0lurEBP8w/videos].
We've been including community team members in the calls with professors. If you'd like to listen in, contact me and we can see if we can make that possible. We've been looking at plans for a possible future live stream where Ohad and other members can sit on a panel and discuss interesting and related project topics while taking questions from the community. Tau community members holycapitan and ggmesh have been very active within the Tau supporter group!
Tau Supporter Program Info:
The community member of the month for their outstanding contributions is ggmesh!!
Working with the team, including myself, Kilian, Karolina and Ohad, to find a single voice that describes what Tau is to the greatest number of people. We've been working with Ethan to find a marketing partner in China, and I've reached out to some people who work with VeChain and the LBank exchange. Once we have the single voice and description of what Tau is, we can really push ahead with the design and the marketing side of things, allowing Karolina to push out content. We've also been working on how to visualise Tau and have been looking for a UX designer to make some concepts for Tau. We've been working with our marketing partners, and they have provided some ideas for updates to the Tau logo. We have some really interesting designs for the website and logo, but we are looking to finalise our description and "voice" before giving feedback on them. We've been looking at bigger exchanges, and I have found a front-end developer; once we are ready with our design of Agoras we will get him on board. We have added an announcement channel on Telegram. Let us know if you'd like the update video split into three videos: development; marketing, community and business; and Q&A.
I've been involved in most aspects of what's been discussed, but also in the proof of execution which Lukas is working on. I've also been working with the professors on how to make consensus detection decidable.
Q: Why should I buy Agoras tokens?
Kilian: When you purchase tokens now, you are doing so on the secondary market, from other community members and market makers. Agoras tokens will be the first coin that is truly controlled by its users, and it can evolve over time and adopt many different features. It will have utility in different ways over the Tau networks. We have the knowledge economy, where you can trade Agoras for formalised and unformalised knowledge; we'll have the computational resources market, where you can trade Agoras for computational resources; and we'll have a derivatives market, where you can trade Agoras and receive risk-free interest without inflation. Ultimately we are not financial advisors, so it's always up to you, but these are the key features of Agoras.
Q: Will the project survive (or not set back by decades) if the founder isn’t available to work on it anymore? If not, which development milestone will remove that dependency?
Karim: For now Ohad is still the main visionary and the driver of the project. However, we do have a roadmap that's becoming very clear in terms of goals and implementation. Our timeline is that by 2024 we should have the Agoras knowledge marketplace up and running. Once we get there, Ohad could consider early retirement, but until then we need him very much.
Q: What are the tokenomics for Agoras? Can Agoras be the collateral you need to hold in order to make use of Tauchain’s offerings on a tier based structure?
Ohad: Tau will not require any kind of coins; whether it requires any coins will be up to the users. After all, it is a self-defining system that is defined by its users, and I estimate the users will decide to require them. If everyone can submit an opinion that they want taken into account by the whole network, in decisions that affect the whole network such as changing Tau itself, then it might simply take too many computational resources to compute the consensus. So I would guess people will need to pay some transaction fee, both so the network doesn't lose by investing so many computational resources in calculating consensus, and to prevent people from submitting opinions which don't really matter to them.
Q: In the coming months, what will be the best way to demonstrate TML capabilities to friends: Agoras Live, the online IDE, or others?
Karim: The best way to demonstrate TML’s capabilities is the IDE [https://tml.idni.org/]. It has many built-in samples you can explore yourself in increasing order of complexity and you can write your own code of course. Soon we will have some demos to show the capabilities of TML.
Q: Can we confirm that Tau will be able to use Deontic logic for cybersecurity purposes? Using this logic we can specify and reason about security policies, such as permissions, authorisations, prohibitions and obligations.
Prof. Benzmüller: I have a background in Deontic logics and in getting them to run in the framework of higher-order logic, and I have experimented with that with good results. In the Tau project we are still discussing whether we want to stay purely classical or whether we want to go the Deontic road, also including other non-classical logics that might be relevant in addition to Deontic logics. The options are there. This requires a somewhat more expressive general language framework to start with, but the discussion is ongoing, and we can go in the direction where we consider Deontic logic as a solution within Tau. Alternatively, we may try to avoid it by staying in a proper, purely classical setup, so there is a pro-and-con debate among us and we are not completely decided yet. It is doable in my experience, but it is also a question of decidability, and there are many Deontic logics, each with pros and cons, so this is still to be discussed further.
Q: Explain the vision, explain the possibilities. Condense the interesting and infinite into something that can be understood by a layman.
Kilian: So our vision can be seen as advancing humanity by scaling collaboration using logical AI. A particular use case is large-scale discussions: currently it's impossible for a larger group of people to have an efficient discussion, but using Tau you can imagine the possibility of putting 1000 cancer doctors in a room. Without Tau it would be extremely difficult for all of them to organise their knowledge and have a fruitful discussion; with Tau they could have an efficient discussion, and their knowledge could be understood by the network, so one could arrive at the solution to a problem sooner. In terms of tokenomics, an example could be the knowledge economy: imagine a lawyer who has commercial-law knowledge and a company that needs advice on the same. The company could automatically query the needed knowledge from the lawyer, and the lawyer would receive Agoras tokens for providing his knowledge.
Q: Is there already a v2 planned for Agoras Live with features you couldn’t implement in time? If yes, what are those features?
Karim: Yes, we already have plans for Agoras Live 2.0, with enhanced video and payment features. Central to Agoras Live 2.0 is the identity feature, where teachers and students would be able to have a verified identity and verified qualifications to speak on specific topics. Further to this, we plan a request board for V2, where users can request what they wish. And of course a better UI.
Q: What features are missing to launch an MVP of TML and when can we expect it?
Karim: As you can see from the TML IDE, TML is very feature-complete. What we are working on right now is mostly performance improvements. Second-order logic is a task we are still working on, but that's very challenging and we're not sure when we will be able to finish it. However, TML 1.0 is expected towards the August time frame.
Q: Wouldn't a search engine have been a good first candidate to demonstrate the power of Tau? You ask questions and it suggests many answers from different sources. Can something like this be implemented over Agoras Live in future? What requirement do we lack to implement a search engine using TML over Agoras Live?
Ohad: The search engine featured in the whitepaper requires fully featured Tau and Agoras, because it requires many components. I understand that here you are talking about a different kind of search engine; this will be in the first versions of the discussion platform. Reputation is a very delicate point, and I dare say everyone has tried to solve the reputation problem. It can easily be done wrong, as you can always automatically give yourself a lot of positive feedback or give competitors negative feedback, so it's not very reliable. However, we plan to address this with an ultimate kind of solution via the identity framework which Karim spoke about.
Q: With regard to the correctness-proof function of the Tau blockchain, have you looked at the design of Truebit as a means of adding an additional game-theoretic incentive to layer security onto the process?
Lukas: Yes, this is an interesting protocol, but it is not quite the direction the Tau blockchain will take. The Truebit protocol is a way of adding an economic incentive to enable a sort of cloud-computing functionality. With Tau, what we are trying to do is use a cryptographic technique to enable verification of computation: we are not relying on an economic incentive to guarantee correctness of computation, but rather trying to verify it directly.
Q: On the blockchain scalability front, have you weighed the pros and cons of the different solutions, such as layer 1 (sharding), layer 2 (rollups and ZK-SNARKs), etc.? What are some performance targets you aim to achieve for transactions per second, or finality?
Lukas: Good question. Unfortunately we are not really at the point in our architecture decisions to design layer-2 solutions. Certainly scalability is a concern, but mostly we want to have a functioning layer 1 first before we try to build layer 2 on top of it. There are a couple of these layer-2 techniques for which we already have some advantage in terms of being prepared, but it's really too early to say how we will attack the scalability issues on Tauchain. I don't think we need a certain number of transactions per second to be functional, and I think the number of blocks per minute will also not be an important consideration, because a lot of the work can continue at the zero-conf level. These are things to keep in mind, but it's too early to answer that well. Thanks for those questions.