Consuming Data

Data streams can be consumed through a growing number of methods, the first two being an API for off-chain use and an Oracle for on-chain use. In some cases, permissioned entities can be granted access to a node's database and consume data directly.

While an API gateway to the TSN for off-chain use may appear to pose challenges to verifiability, the TSN is designed to yield cryptographic proofs associated with the data queried. The API provides access to the TSN and forwards this proof alongside any data returned from the TSN. Through these signed API responses, a consumer can maintain verifiability when integrating with any decentralized system: the receiving system verifies the signature against the payload, which comprises the associated query response. While the TSN strives for compatibility across a wide range of systems, third-party decentralized computation providers may be necessary to facilitate cryptographic verification and integration between the TSN and a given system.
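
As an illustration of how a consumer might check such a signed response, the following is a minimal TypeScript sketch using ethers. The response shape (payload, signature, and signer fields) is an assumption for illustration, not the actual TSN schema.

```typescript
import { verifyMessage } from "ethers";

// Illustrative shape of a signed TSN API response; these field names
// are assumptions, not the actual TSN schema.
interface SignedResponse {
  payload: string;   // serialized query response
  signature: string; // node signature over the payload
  signer: string;    // address the signature is expected to resolve to
}

// Recover the address that produced the signature and compare it with
// the claimed signer (in practice, an address from a trusted registry
// of TSN node identities).
function isResponseAuthentic(res: SignedResponse): boolean {
  const recovered = verifyMessage(res.payload, res.signature);
  return recovered.toLowerCase() === res.signer.toLowerCase();
}
```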

To alleviate the complexities of integrating the TSN with existing prominent, monolithic blockchains, the TSN includes an Oracle that enables Smart Contracts to call on the real-world data made available by the TSN without the Smart Contract developer needing to operate an off-chain component. Although the Oracle is compatible with only a set of blockchains, it is necessary to ensure that DeFi protocols and other on-chain applications can call on verifiable data facilitated by the TSN. An Oracle is necessary due to the nature of Smart Contracts [6]: a Smart Contract can only execute computation invoked by an externally owned account (EOA) [7] or by another Smart Contract on the same blockchain. Consequently, Smart Contracts have no direct access to third-party APIs or off-chain data streams, hence the incorporation of a dedicated, fault-tolerant Oracle.
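
To make the EOA constraint concrete, the sketch below shows the off-chain half of an oracle: an EOA-controlled process submits a transaction that writes a value into an on-chain contract, after which other Smart Contracts on the same chain can read it. The contract address, ABI, and method name are hypothetical, not the TSN Oracle's actual interface.

```typescript
import { Contract, JsonRpcProvider, Wallet, encodeBytes32String } from "ethers";

// Hypothetical oracle contract interface; the real TSN Oracle may differ.
const ORACLE_ABI = [
  "function updateValue(bytes32 streamId, int256 value) external",
];
const ORACLE_ADDRESS = "0x0000000000000000000000000000000000000000"; // placeholder

async function pushValue(streamName: string, value: bigint): Promise<void> {
  const provider = new JsonRpcProvider("https://rpc.example.org");
  // The oracle operator is an EOA: only an EOA (or another contract) can
  // trigger on-chain computation, which is why the push must originate here.
  const signer = new Wallet(process.env.ORACLE_KEY!, provider);
  const oracle = new Contract(ORACLE_ADDRESS, ORACLE_ABI, signer);
  const tx = await oracle.updateValue(encodeBytes32String(streamName), value);
  await tx.wait(); // once mined, contracts on this chain can read the value
}
```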

Our typical data consumers fall into five categories: Institutions, Companies, L1 and L2 Blockchains, DeFi Products, and Traders. Each of these entities requires access to data over different periods of time, with different availability needs, and in different environments. Our resilient architecture enables the TSN to support data access by any such entity, including retrospective access to historical records.

Access Control

The TSN handles authentication and authorization in a manner appropriate to the strategy used to consume the data.

When engaging the API, authentication is established by using a Web3-native cryptographic wallet to produce a signed message that formally identifies the wallet; in response, an access token is generated and granted to the authenticated user. Through this authentication mechanism, the TSN can identify the wallet and ensure that relevant requests made to the TSN are appropriately funded.
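
A minimal sketch of this handshake in TypeScript, assuming a hypothetical /auth endpoint and illustrative field names:

```typescript
import { Wallet } from "ethers";

// Hypothetical endpoint and field names; the actual TSN API may differ.
const AUTH_URL = "https://api.example.org/auth";

async function authenticate(wallet: Wallet): Promise<string> {
  // 1. Produce a signed message that formally identifies the wallet.
  const message = `TSN auth request from ${wallet.address} at ${Date.now()}`;
  const signature = await wallet.signMessage(message);

  // 2. Exchange the signed message for an access token.
  const res = await fetch(AUTH_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ address: wallet.address, message, signature }),
  });
  const { accessToken } = await res.json();
  return accessToken; // authorizes and meters subsequent requests
}
```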

When engaging the on-chain Smart Contract powered by the Oracle, the invoking Smart Contract or EOA must be appropriately funded to facilitate the request for data within the context of the blockchain executing the invocation.

Access control in this manner ensures that engagement with the TSN is secure and compensated, and that compensation occurs at the point of value exchange.

Image 3: A depiction of the sequence of events that occur when facilitating an authenticated request to the TSN.

Analytics

The TSN, by combining blockchain-based decentralization with relational database management and composability, is capable of exposing an analytical interface to data consumers. Direct queries through the API yield responses based on SQL query logic predetermined within the Adapters that the API request specifies. Alternatively, for larger analytical workloads, operating a full node exposes the entire data storage layer so that the data can be queried directly. While the latter approach is more cost-efficient, directly querying a single Node's database does not yield the verifiable proofs produced by the network, and it may require additional authentication and permissions.
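
The sketch below illustrates the first approach: a parameterized query routed to a named Adapter, whose predetermined SQL logic runs server-side. The endpoint and parameter names are assumptions, not the actual TSN API.

```typescript
// Hypothetical endpoint and parameter names; the real TSN API may differ.
async function queryAdapter(
  accessToken: string,
  adapterId: string,
  params: Record<string, string>,
): Promise<unknown> {
  const url = new URL(`https://api.example.org/adapters/${adapterId}/query`);
  for (const [key, value] of Object.entries(params)) {
    url.searchParams.set(key, value);
  }
  // The Adapter's predetermined SQL logic runs server-side; the caller
  // supplies only parameters, never raw SQL.
  const res = await fetch(url, {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  return res.json(); // response carries the payload and its proof
}
```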

Data Management and Practices

To understand how data is managed within the TSN, it is important to conceptualize the TSN as a buffer between the Data Provider and the Consumer, with both participants engaging in a fully asynchronous manner. While Data Providers commit verified, quality-controlled data, Data Consumers can simultaneously call on Adapters via the relevant interfaces to query this data. The TSN, being an app-chain, determines the correct data to serve queries from based on the latest finalized block. Ideally, as previously highlighted, a block is finalized as soon as ⅔ of the Nodes have synchronized, which is enough to confirm that the data has not been corrupted while being delivered into the consuming stream.
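
As a worked illustration of that threshold, a block on a 9-node network would finalize once 6 nodes have synchronized; rounding up at the boundary is an assumption about how the edge case is handled, not a confirmed TSN rule.

```typescript
// A block is treated as finalized once at least 2/3 of the Nodes have
// synchronized it; the ceiling at the boundary is an assumption.
function isFinalized(syncedNodes: number, totalNodes: number): boolean {
  return syncedNodes >= Math.ceil((2 * totalNodes) / 3);
}

console.log(isFinalized(5, 9)); // false — below the 6-node threshold
console.log(isFinalized(6, 9)); // true
```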

Considering that the TSN advances through finalized blocks, each containing the newly provided data, the network's technical throughput is an active and ongoing concern, attended to so that data consumers do not receive stale data. This is addressed through limits on the compute and data processed by Adapters; to optimize within these limits, the core development foundation behind the TSN aims to guide participants on the steps they can take to avoid failed data submissions and to support these optimizations.

  1. External data source (Data Provider) - To uphold timely and accurate data provisioning, such that data consumers receive fresh data, TSN Data Providers are advised to install monitoring systems on their nodes. Nodes can: a) track request/response times to other core TSN nodes; b) track performance metrics; and c) establish mechanisms to prevent stale data and surface potential bottlenecks. A minimal monitoring sketch follows this list.

  2. Data processing - Depending on the deterministic logic an Adapter needs to execute, the duration of the operation can vary. While Adapters are designed for high throughput, Adapter logic should be computationally efficient to prevent bottlenecks in the event of a network or Adapter execution failure.

  3. Stream creation - Adapters that embed queries to enable streams should be tested with varying parameters to evaluate the time and computational efficiency of yielding responses.
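
The following is the minimal monitoring sketch referenced in point 1: a latency probe a Data Provider might run against peer nodes. The peer URLs, health path, and alert threshold are illustrative assumptions.

```typescript
// Minimal latency probe a Data Provider node might run against its
// peers; peer URLs and the health path are illustrative assumptions.
const PEERS = ["https://node-a.example.org", "https://node-b.example.org"];
const STALE_THRESHOLD_MS = 5_000; // assumed alert threshold

async function probePeers(): Promise<void> {
  for (const peer of PEERS) {
    const start = Date.now();
    try {
      await fetch(`${peer}/health`);
      const elapsed = Date.now() - start;
      if (elapsed > STALE_THRESHOLD_MS) {
        console.warn(`${peer} responded slowly (${elapsed} ms); possible bottleneck`);
      }
    } catch {
      console.error(`${peer} unreachable; data may go stale`);
    }
  }
}

// Run periodically, e.g. once a minute.
setInterval(probePeers, 60_000);
```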

These guidelines, while simply an indication of what is expected when developing Adapters or becoming a Data Provider, aim to establish a standard for engaging the network, comparable to the unit tests advised in Smart Contract development on other prominent programmable blockchains [8].
