July 29, 2024
Data sovereignty is the next great fight. If we are going to store value and transmit value via ownership in Real World Assets (RWA), then we either need the world's largest referenceable database or we need a decentralized data solution. This cannot be static: it must support short-interval (daily) zk audits, as well as data sharing with context, attribution, and trust.
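To make the daily-audit idea concrete, here is a minimal sketch of the commitment side of that loop: hash every file in a data set each day and fold the digests into a single Merkle root that can be anchored on-chain and later proven against. The zk proof generation and the actual anchoring call are out of scope here, and the directory path and function names are illustrative assumptions, not any vendor's API.

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def file_digest(path: Path) -> str:
    """SHA-256 over the file bytes -- the commitment a later audit verifies."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def merkle_root(digests: list[str]) -> str:
    """Pairwise-hash sorted leaf digests up to one root (duplicating the last
    leaf on odd levels), so a single on-chain value commits to the whole set."""
    level = [bytes.fromhex(d) for d in sorted(digests)]
    if not level:
        return hashlib.sha256(b"").hexdigest()
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                 for i in range(0, len(level), 2)]
    return level[0].hex()

def daily_snapshot(data_dir: str) -> dict:
    digests = [file_digest(p) for p in sorted(Path(data_dir).rglob("*"))
               if p.is_file()]
    return {
        "as_of": datetime.now(timezone.utc).isoformat(),
        "file_count": len(digests),
        "merkle_root": merkle_root(digests),  # the value to anchor on-chain
    }

if __name__ == "__main__":
    # "./asset_docs" is a hypothetical local folder of asset documents.
    print(json.dumps(daily_snapshot("./asset_docs"), indent=2))
```

Run daily, the roots form a tamper-evident timeline of the data's state; any party holding the underlying files can recompute the root and check it against the anchored value.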
This is important because it is not just about what is on the ledger, but what the instrument on the ledger represents: a unit, interest, or share of a legal thing that is understood in the jurisdiction in which it is traded. If this is just a unit of a publicly traded thing, Pyth Network plus the EDGAR database is enough, or use Chainlink. But if this data is private and you want to control it in your own systems according to your data retention and data destruction policies, ensuring your GDPR compliance, you need to control your own data.
This data will not be static. One BTC is one BTC; that single variable, where the decimal point sits (floating point), is easy. But if there is entropy in the asset, you need real-time surveillance of the asset to understand its performance. In many cases this sets off a flow of calculations or workflows: determining the Gross Asset Value, then running a waterfall, then the carry calculation, then valuing the cap stack so you can trade the cap stack, informing a risk book, feeding your eFront. This cannot be done with 10-finger automation and Excel as your middleware; a sketch of the cascade follows.
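Here is a deliberately simplified sketch of that cascade in code: GAV, then a whole-of-fund waterfall, then carry, producing a dict a risk book or portfolio system could ingest. The 8% pref, 20% carry, absence of a GP catch-up, and all the input numbers are assumptions for illustration; real LPA terms vary widely.

```python
from dataclasses import dataclass

@dataclass
class FundSnapshot:
    asset_values: list[float]   # current marks on the portfolio assets
    liabilities: float          # fund-level debt and accruals
    contributed_capital: float  # LP capital paid in
    pref_rate: float = 0.08     # hurdle (assumed simple 8% pref)
    carry_rate: float = 0.20    # GP carried interest (assumed 20%)

def gross_asset_value(s: FundSnapshot) -> float:
    return sum(s.asset_values)

def distribution_waterfall(s: FundSnapshot) -> dict:
    """Simplified waterfall: return of capital, then preferred return,
    then carry on the remaining profit. No GP catch-up tier."""
    distributable = gross_asset_value(s) - s.liabilities
    roc = min(distributable, s.contributed_capital)
    remaining = distributable - roc
    pref = min(remaining, s.contributed_capital * s.pref_rate)
    remaining -= pref
    carry = remaining * s.carry_rate
    return {"gav": gross_asset_value(s),
            "return_of_capital": roc,
            "preferred_return": pref,
            "gp_carry": carry,
            "lp_residual": remaining - carry}

# Each revaluation event re-runs the full cascade; the output is what a
# risk book or a portfolio system such as eFront would consume downstream.
snapshot = FundSnapshot(asset_values=[42e6, 58e6, 31e6],
                        liabilities=18e6, contributed_capital=90e6)
print(distribution_waterfall(snapshot))
```

The point is not the specific terms but the chaining: a change in one asset's mark should flow through every downstream calculation automatically, with no spreadsheet in the middle.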
In most cases this unstructured data sits at the edge (not in one big data lake). This is where the massive advancement of AI can transform the largest data set not yet processed thoroughly by big data: private markets. In your own data systems, using Inveniam, you can anchor your data to any of 14 chains, providing proof of state of the data. Then you can use OpenAI or Anthropic to chunk the data, or vector-embed the data, and this is stored in a Weaviate database in the client's environment, not on Inveniam. This allows your data to be interrogated by AI in a fully homomorphic manner, with the full value of AI inference done at scale, without losing your ability to have total control and Proof of Process over the data, as you use the world's largest commercially available AI tools, such as Microsoft and the G42 Inception tool, for the inference. As new AI tools spring up, you just add them to the list of tools you use against specific data sets, with context, attribution, and trust. A sketch of that chunk-embed-store pattern follows.
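A hedged sketch of the pattern, assuming the OpenAI and Weaviate v4 Python clients: chunk a document, embed the chunks with a commercial model, and keep the vectors in a Weaviate instance running in the client's own environment. The collection name "AssetDoc" (assumed to be created beforehand) and the local connection are illustrative assumptions, not Inveniam's actual schema or pipeline.

```python
import weaviate
from openai import OpenAI

openai_client = OpenAI()  # reads OPENAI_API_KEY from the environment

def chunk(text: str, size: int = 1000, overlap: int = 200) -> list[str]:
    """Fixed-size character chunks with overlap -- the simplest strategy."""
    step = size - overlap
    return [text[i:i + size]
            for i in range(0, max(len(text) - overlap, 1), step)]

def embed(texts: list[str]) -> list[list[float]]:
    resp = openai_client.embeddings.create(
        model="text-embedding-3-small", input=texts)
    return [item.embedding for item in resp.data]

def ingest(doc_id: str, text: str) -> None:
    # Connects to a Weaviate node the client hosts, so the documents and
    # their vectors never leave the client's environment.
    client = weaviate.connect_to_local()
    try:
        docs = client.collections.get("AssetDoc")
        pieces = chunk(text)
        vectors = embed(pieces)
        for i, (piece, vec) in enumerate(zip(pieces, vectors)):
            docs.data.insert(
                properties={"doc_id": doc_id, "chunk_index": i, "body": piece},
                vector=vec,  # bring-your-own-vector: no model inside the DB
            )
    finally:
        client.close()
```

Because the embedding step is just a function call, swapping in a new model as tools improve means changing one line, while the stored chunks, their attribution metadata, and the anchored proofs of state stay put in the client's systems.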
Decentralized Ledger
Decentralized Compute
Decentralized Data
This powers private funds becoming interval funds, with daily AVM on assets in the portfolio and monthly third-party marks (e.g., from Cushman & Wakefield) at 5% of the current cost. This will bring daily subscriptions and weekly redemptions on interval CRE funds, PE, Infrastructure, and Private Credit. This will drive Derivatives for Alternatives, with data sovereignty.