📊
Sample Dashboard
This page will help you understand and create a dashboard using the Top Ledger platform.
Top Ledger allows Web3 growth, analytics & data science teams to create powerful dashboards from decoded Solana program data. Our backend infrastructure pulls raw data from the Solana blockchain and labels and indexes it into different tables on a daily basis.
Let’s take the example of a project called ‘Project X’, a major DeFi aggregator on Solana. We will calculate some important metrics for Project X and add them to a dashboard. To get familiar with the underlying data, you can start with a simple query on the transactions table:
SELECT
--block_date,
--pre_token_balances --list all the columns you need
* -- means all columns
FROM "tl-solana-merged-data"."transactions" --the txns table in the database
WHERE 1=1 -- add your conditions and filters below
AND success = TRUE
AND block_date > now() - interval '10' day
ORDER BY block_date DESC --order the output by block date, most recent first
We will start by calculating DAU/MAU. The widget below shows the MAU (monthly active users) for Project X.

Project X MAU
Also, in the widgets shown below you can see the DAU trend for the project over the last 30 days. On 11th Oct there was a spike in user activity, which could be related to some internal or external event.
They also show the trend of percentage growth in users over the last 30 days.

Project X DAU
Let's go through the SQL code for the widgets you have seen so far. The query runs on the transactions table available in the TL platform.
WITH data AS (
SELECT
block_date
,COUNT(DISTINCT account_keys[1]) AS active_users
FROM "tl-solana-merged-data"."transactions"
CROSS JOIN UNNEST(instructions) AS instruction
WHERE 1=1
AND success = true
AND block_date >= date_trunc('day', now() - interval '30' day)
AND (instruction.executing_account = 'JUP4Fb2cqiRUcaTHdrPC8h2gNsA2ETXiPDD33WcGuJB' -- program address of project X
--check in inner_instructions
OR contains(transform(CAST(instruction.inner_instructions AS array<JSON>), x -> json_extract_scalar(x, '$.executing_account')), 'JUP4Fb2cqiRUcaTHdrPC8h2gNsA2ETXiPDD33WcGuJB')
)
GROUP BY block_date
)
SELECT
block_date
,active_users
,(cast(active_users AS double) / lag(active_users) OVER (ORDER BY block_date ASC) -1)*100 AS percent_growth_users
FROM data
ORDER BY block_date DESC
First, we need to unnest the instructions column in the txns table to filter all the txns where the executing account equals the program address of Project X. This is required because we need to check every executing account in an instruction as well as in its inner instructions. "CROSS JOIN UNNEST(instructions) AS instruction" explodes the instructions array into rows, where each row represents a single instruction.
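To see what UNNEST does, here is a minimal, self-contained sketch using made-up values (not actual table data):
SELECT id, instr
FROM (VALUES (1, ARRAY['transfer', 'swap'])) AS t(id, instructions)
CROSS JOIN UNNEST(instructions) AS u(instr)
-- returns two rows: (1, 'transfer') and (1, 'swap')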
Now, to get all the unique users interacting with Project X, we count the distinct signers (account_keys[1]) on a daily basis and group the data by block date. Remember that array indexes start at 1 in Presto SQL. The outer query then uses the LAG window function to compute the day-over-day percentage growth in active users.
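The query above gives daily actives. If you also want the DAU/MAU (stickiness) ratio, a rough sketch is shown below; the Project X instruction filter is omitted for brevity, and in practice you would add the same UNNEST and executing_account conditions as above.
WITH daily AS (
SELECT
block_date
,COUNT(DISTINCT account_keys[1]) AS dau
FROM "tl-solana-merged-data"."transactions"
WHERE success = true
AND block_date >= date_trunc('day', now() - interval '30' day)
GROUP BY block_date
),
monthly AS (
SELECT
date_trunc('month', block_date) AS month
,COUNT(DISTINCT account_keys[1]) AS mau
FROM "tl-solana-merged-data"."transactions"
WHERE success = true
AND block_date >= date_trunc('month', now() - interval '30' day)
GROUP BY 1
)
SELECT
d.block_date
,d.dau
,m.mau
,CAST(d.dau AS double) / m.mau AS dau_mau_ratio
FROM daily d
JOIN monthly m ON date_trunc('month', d.block_date) = m.month
ORDER BY d.block_date DESC
Keep in mind that the current month's MAU is still accumulating, so its ratio will keep shifting until the month closes.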
Now let’s calculate how many transactions Project X is processing. We will check the total number of txns per day and how many were successful vs failed. Relating this metric to real-world events lets us infer insights and measure their impact.

Project X txns
Let’s visualise this using charts.

Project X daily txns
Here we see 4 widgets: 1) successful daily txns, 2) all txns, 3) the cumulative txns chart, and 4) cumulative txns. We can infer that the number of failed txns is quite high and volatile, while the number of successful txns is quite stable. The reason could be bots trying to profit from arbitrage on the platform.
WITH data AS
(
SELECT
block_date
,count(DISTINCT id) AS total_transactions -- every txn, successful or failed
,count(DISTINCT CASE WHEN success = true THEN id END) AS successful_transactions
,count(DISTINCT CASE WHEN success = false THEN id END) AS failed_transactions
FROM "tl-solana-merged-data"."transactions"
CROSS JOIN UNNEST(instructions) AS instruction
WHERE 1=1
AND ( instruction.executing_account = 'JUP4Fb2cqiRUcaTHdrPC8h2gNsA2ETXiPDD33WcGuJB' -- program address of project X
--check in inner_instructions
OR contains(transform(CAST(instruction.inner_instructions AS array<JSON>), x -> json_extract_scalar(x, '$.executing_account')), 'JUP4Fb2cqiRUcaTHdrPC8h2gNsA2ETXiPDD33WcGuJB')
)
AND block_date >= date_trunc('day', now() - interval '30' day)
GROUP BY block_date
ORDER BY block_date DESC
)
SELECT
block_date,
total_transactions,
successful_transactions,
failed_transactions,
SUM(successful_transactions) over (order by block_date asc rows between unbounded preceding and current row) AS cumulative_successful_txns
FROM data
ORDER BY block_date DESC
Here again we had to unnest the instructions column in the txns table to filter all the txns (whether successful or failed) where the executing account equals the program address of Project X. The outer query then uses a window SUM to build the running total of successful txns shown in the cumulative chart.
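As an aside, since failed and total txns are already available in the data CTE above, you could extend the final SELECT to also chart the daily failure rate. A minimal sketch (failure_rate_pct is just an illustrative column name):
SELECT
block_date,
total_transactions,
failed_transactions,
ROUND(CAST(failed_transactions AS double) / total_transactions * 100, 2) AS failure_rate_pct
FROM data
ORDER BY block_date DESC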
Volume is a crucial metric for any project building in web3: it is the amount of money flowing through the product. This metric matters the most because it is directly linked to a project's revenue.

However, volume alone is not sufficient; projects also need to know the number of unique buyers/traders and correlate it with the daily volume.
In the 3rd widget, we can see that on 11th Oct there was a spike in volume but a significant drop in the number of unique wallets trading.
Let’s go through the SQL code for how to calculate volume for Project X.
WITH
buys AS (
SELECT
id,
block_date,
pre_balances,
post_balances,
account_keys[1] as unique_wallet
FROM "tl-solana-merged-data"."transactions"
CROSS JOIN UNNEST(instructions) AS instruction
WHERE 1=1
AND success = True
AND block_date >= date_trunc('day', now() - interval '30' day)
AND (
instruction.executing_account = 'JUP4Fb2cqiRUcaTHdrPC8h2gNsA2ETXiPDD33WcGuJB' -- program address of project X
--check in inner_instructions
OR contains(transform(CAST(instruction.inner_instructions AS array<JSON>), x -> json_extract_scalar(x, '$.executing_account')), 'JUP4Fb2cqiRUcaTHdrPC8h2gNsA2ETXiPDD33WcGuJB')
)
GROUP BY 1,2,3,4,5 -- dedupe txns that match on more than one instruction
),
buys_summed AS (
SELECT
id,
block_date,
SUM(IF(
post_balances[i] - pre_balances[i] > 0,
post_balances[i] - pre_balances[i],
0
)) * 1.0 / 1e9 AS sol_approx, -- 1 SOL = 10^9 lamports
unique_wallet
FROM buys
CROSS JOIN UNNEST(pre_balances) WITH ORDINALITY AS b (pre_balance, i)
GROUP BY id, block_date, unique_wallet
), vol_raw AS (
SELECT
date_trunc('day', block_date) AS day,
SUM(sol_approx) AS sol,
COUNT(distinct unique_wallet) as unique_wallets
FROM buys_summed
GROUP BY 1
)
SELECT day
, sol
, unique_wallets
, SUM(sol) over (order by day asc rows between unbounded preceding and current row)
AS cumulative_sol
FROM vol_raw
ORDER BY 1 DESC
Here we first filtered all the txns by unnesting the instructions column, and then iterated over the pre and post balances, keeping only the accounts whose post balance was greater than their pre balance.
We also counted the unique wallets transacting on a daily basis. Finally, we summed the daily SOL amounts to get the trading volume for the last 30 days.
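To make the balance-diff logic concrete, here is a tiny, self-contained sketch with made-up lamport values showing how WITH ORDINALITY pairs each pre balance with its post balance by index:
SELECT
i,
pre_balance,
post_balances[i] - pre_balance AS diff_lamports,
(post_balances[i] - pre_balance) / 1e9 AS diff_sol -- 1 SOL = 10^9 lamports
FROM (VALUES (ARRAY[5000000000, 1000000], ARRAY[4000000000, 2000000])) AS t(pre_balances, post_balances)
CROSS JOIN UNNEST(pre_balances) WITH ORDINALITY AS b(pre_balance, i)
-- only the rows where diff_lamports > 0 are summed into sol_approx in the query above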
Here is the link to the detailed dashboard for your reference -
https://analytics.topledger.xyz/public/dashboards/wukhYHylZjfiWRWOK3fL59sUicPuWYTa0EL1KQFp?org_slug=default