🤖 AI Portfolio Optimiser
Fyde employs a range of filters alongside AI and machine learning to ensure that only suitable assets enter the vault and to protect the assets already inside it. As a result, users benefit from a pre-selected, pre-vetted whitelist of assets.
Fyde uses several layers of screens for the whitelist process.
First comes a liquidity and market-cap filter, ensuring that only tokens our community can swap without difficulty are added to the vault.
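Conceptually, this screen can be thought of as a set of minimum thresholds a candidate token must clear. The sketch below is a hypothetical Python example; the `TokenStats` fields and the specific thresholds are assumptions for illustration, not Fyde's actual parameters.

```python
# Minimal sketch of a liquidity / market-cap screen.
# Thresholds and TokenStats fields are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class TokenStats:
    symbol: str
    market_cap_usd: float        # circulating market capitalisation
    daily_dex_volume_usd: float  # 24h on-chain swap volume
    pool_depth_usd: float        # liquidity available in the deepest pool

MIN_MARKET_CAP = 50_000_000      # hypothetical threshold
MIN_DAILY_VOLUME = 1_000_000     # hypothetical threshold
MIN_POOL_DEPTH = 500_000         # hypothetical threshold

def passes_liquidity_screen(t: TokenStats) -> bool:
    """Return True if the token clears every liquidity / size threshold."""
    return (
        t.market_cap_usd >= MIN_MARKET_CAP
        and t.daily_dex_volume_usd >= MIN_DAILY_VOLUME
        and t.pool_depth_usd >= MIN_POOL_DEPTH
    )

candidates = [
    TokenStats("AAA", 120_000_000, 4_200_000, 2_000_000),
    TokenStats("BBB", 8_000_000, 90_000, 40_000),
]
shortlist = [t.symbol for t in candidates if passes_liquidity_screen(t)]
print(shortlist)  # ['AAA']
```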
Next come several fundamental checks, such as founder backgrounds, audits, and more. Before a token is permitted into the vault, we also run a series of technical filters (e.g. price volatility over a recent time window) to ensure there is no token price manipulation immediately before a deposit.
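One such technical filter can be sketched as a simple check on recent price behaviour. The window length, thresholds, and the `is_price_suspicious` helper below are illustrative assumptions rather than the production filter.

```python
# Illustrative sketch of one technical filter: flag a token whose recent price
# action looks abnormal (e.g. a spike right before a deposit).
# Thresholds are assumptions for illustration only.
import statistics

def is_price_suspicious(prices: list[float],
                        max_recent_move: float = 0.15,
                        max_zscore: float = 3.0) -> bool:
    """Flag tokens whose latest price move is large in absolute terms or
    relative to their own recent volatility."""
    returns = [(b - a) / a for a, b in zip(prices, prices[1:])]
    latest = returns[-1]
    if abs(latest) > max_recent_move:
        return True
    if len(returns) > 2:
        sigma = statistics.pstdev(returns[:-1])
        if sigma > 0 and abs(latest) / sigma > max_zscore:
            return True
    return False

# Example: a 25% jump in the most recent interval is flagged.
print(is_price_suspicious([1.00, 1.01, 0.99, 1.00, 1.25]))  # True
```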
Finally, we apply machine learning to network analysis to identify any additional risks associated with the token.
On-chain transaction analytics are used to create an in-depth look at the blockchain network. This serves as an additional layer of security and a real-time monitoring technique.
The example below highlights one such use case. This strategy uses on-chain analytics to characterise the state of transaction networks of a token or protocol.
The image below shows an unfiltered view of data from a real ERC-20 token, capturing over 1 million transactions, with wallets (i.e. nodes) connected by transactions (i.e. edges).
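Conceptually, this kind of transaction network can be built from an edge list of transfers, with wallets as nodes and transfers as weighted edges. The sketch below uses `networkx` and a made-up edge list; in practice the edges would come from indexed ERC-20 `Transfer` events.

```python
# Minimal sketch of a token transfer history viewed as a graph:
# wallets are nodes, transfers are edges. Sample transfers are made up.
import networkx as nx

transfers = [
    # (from_wallet, to_wallet, amount)
    ("0xAlice", "0xBob", 1_000),
    ("0xBob", "0xCarol", 400),
    ("0xCarol", "0xBob", 390),   # near round-trip back to the sender
    ("0xAlice", "0xDave", 250),
]

g = nx.DiGraph()
for src, dst, amount in transfers:
    if g.has_edge(src, dst):
        g[src][dst]["amount"] += amount
    else:
        g.add_edge(src, dst, amount=amount)

print(g.number_of_nodes(), "wallets,", g.number_of_edges(), "edges")
# Simple structural signals that feed later analysis:
print("degree centrality:", nx.degree_centrality(g))
```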
AI and ML models are then applied to filter the data used within the Fyde AI platform. Wash-trading measurements across traders and tokens identify the amount of bot-driven volume within each group; this feeds directly into Fyde's AI agent framework, where in this instance it powers the whitelist decision process.
Pattern recognition and threat detection, derived from bot-driven trade interactions intended to inflate trading volume (e.g. wash trading), are used in many different ways.
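As a simplified illustration of one such heuristic, the sketch below measures the share of volume flowing across reciprocal wallet pairs (A → B → A), a common signature of bot-driven volume inflation. The cycle-based rule and the example wallets are illustrative assumptions, not Fyde's production model.

```python
# Sketch of a wash-trading heuristic: how much volume sits on reciprocal
# edge pairs, i.e. counter-parties that send volume back to each other.
import networkx as nx

def reciprocal_volume_share(g: nx.DiGraph) -> float:
    """Fraction of total transferred volume that sits on reciprocal edge pairs."""
    total = sum(d["amount"] for _, _, d in g.edges(data=True))
    if total == 0:
        return 0.0
    reciprocal = sum(
        d["amount"]
        for u, v, d in g.edges(data=True)
        if g.has_edge(v, u)  # the counter-party also sends volume back
    )
    return reciprocal / total

g = nx.DiGraph()
g.add_edge("0xBotA", "0xBotB", amount=500)
g.add_edge("0xBotB", "0xBotA", amount=495)
g.add_edge("0xUser", "0xBotA", amount=100)

print(round(reciprocal_volume_share(g), 2))  # 0.91 -> mostly round-trip volume
```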
This on-chain screening methodology enables the Fyde team to gauge the health of a decentralised network, estimating sets of unique users, organic growth trends, degrees of centralisation, amongst other patterns. These metrics are tabulated for protocols/DAOs, serving as inputs for a proprietary risk-scoring model that determines the eligibility of tokens.
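A simplified view of such a risk-scoring model is a weighted combination of the network metrics above, compared against an eligibility cut-off. The weights, scaling, and threshold in the sketch below are assumptions for illustration only, not the proprietary model.

```python
# Illustrative risk score combining network metrics (unique users, growth,
# centralisation, wash-trade share) into an eligibility decision.
def risk_score(unique_users: int,
               monthly_user_growth: float,
               top10_holder_share: float,
               wash_trade_share: float) -> float:
    """Higher score = riskier. Each component is scaled to roughly [0, 1]."""
    user_risk = 1.0 - min(unique_users / 10_000, 1.0)
    growth_risk = 1.0 - min(max(monthly_user_growth, 0.0) / 0.10, 1.0)
    centralisation_risk = top10_holder_share   # already in [0, 1]
    wash_risk = wash_trade_share               # already in [0, 1]
    return (0.25 * user_risk + 0.15 * growth_risk
            + 0.30 * centralisation_risk + 0.30 * wash_risk)

def is_eligible(score: float, threshold: float = 0.5) -> bool:
    return score < threshold

s = risk_score(unique_users=25_000, monthly_user_growth=0.08,
               top10_holder_share=0.35, wash_trade_share=0.12)
print(round(s, 3), is_eligible(s))  # 0.171 True
```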
Once a token is in the vault, we employ a combination of quantitative indicators and AI and machine learning techniques, alongside Hypernative, to identify risks before they materialise. By leveraging these additional AI and ML tools, we can catch potential risks early and quarantine the impacted tokens. The risks analysed include governance attacks, liquidity pool risks, transaction risks from certain wallets, and more.
Once tokens are quarantined, users are unable to interact with them, thereby protecting the rest of the assets in the vault. Because we use oracles for pricing, a price shock to a single token does not affect the performance of the other assets in the vault.
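The quarantine behaviour can be pictured as a simple allow/deny check applied before any interaction with a token. The in-memory set and function names in the sketch below are illustrative, not the on-chain implementation.

```python
# Minimal sketch of the quarantine behaviour: a flagged token is blocked from
# further interactions while other vault assets remain usable.
quarantined: set[str] = set()

def quarantine(token: str) -> None:
    """Flag a token so that further interactions involving it are blocked."""
    quarantined.add(token)

def can_interact(token: str) -> bool:
    return token not in quarantined

quarantine("0xSuspectToken")
print(can_interact("0xSuspectToken"))  # False -> interactions blocked
print(can_interact("0xHealthyToken"))  # True  -> rest of the vault unaffected
```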