When doing detailed AWS cost analysis with Cost and Usage Reports 2.0 (CUR 2.0) data exported via AWS Data Exports, the combination you hear about most often is Amazon Athena plus Amazon QuickSight. Personally, I had been wishing I could do cost analysis more casually without using Amazon Athena for the query part, and lately DuckDB has been working out really nicely for me 👌
AWS Well-Architected Cost & Usage Report Library
When looking for example queries to analyze CUR 2.0 data, the "AWS Well-Architected Cost & Usage Report Library" is a great reference 📝 It has lots of queries for checking costs per service and for cost-optimization analysis 💰️ (Some of them don't support CUR 2.0 yet, but you can adapt the Legacy CUR versions easily 👌)
As a sample, in this post I'll run two of the queries from the AWS Well-Architected Cost & Usage Report Library with DuckDB ❗️
I had no idea a document called the "AWS Well-Architected Cost & Usage Report Library" existed 💡 I've recently been setting up and analyzing Cost and Usage Report (CUR) 2.0, and it turns out a huge collection of "frequently used analysis queries" is publicly available. So useful~ \( 'ω')/ https://t.co/DV21wKFdjr
— Kakku (@kakakakakku) January 2, 2025
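To get a feel for it first: DuckDB can query a CUR 2.0 Parquet file directly with plain SQL, with no table setup at all. A minimal sketch, assuming the same file name used later in this post (adjust it to your own export):

```sql
-- Total unblended cost per service, straight from the Parquet file
SELECT
    line_item_product_code,
    SUM(line_item_unblended_cost) AS total_cost
FROM 'kakakakakku-cur-00001.snappy.parquet'
GROUP BY line_item_product_code
ORDER BY total_cost DESC;
```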
1. Amazon S3
First, let's try the Amazon S3 cost analysis query from AWS Well-Architected Cost & Usage Report Library - Storage. It lets you analyze Amazon S3 usage in detail 👌
Below is a summary of the changes I made to the Legacy CUR query from the query library 📝 (I also added LIMIT 10 to trim the result for this blog post)

- Used STRFTIME(), which DuckDB supports, instead of DATE_FORMAT()
- Replaced the ${table_name} placeholder with the Parquet-format CUR 2.0 data file 'kakakakakku-cur-00001.snappy.parquet'
- Replaced the ${date_filter} placeholder with a STRFTIME() condition covering 2025/01/01 to 2025/01/14
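The DATE_FORMAT() to STRFTIME() swap is a direct one: Athena's (Trino's) DATE_FORMAT(col, '%Y-%m-%d') and DuckDB's STRFTIME(col, '%Y-%m-%d') produce the same 'YYYY-MM-DD' string, so only the function name changes:

```sql
-- Athena (Trino): DATE_FORMAT(line_item_usage_start_date, '%Y-%m-%d')
-- DuckDB equivalent:
SELECT STRFTIME(TIMESTAMP '2025-01-01 12:34:56', '%Y-%m-%d');  -- 2025-01-01
```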
SELECT
  bill_payer_account_id,
  line_item_usage_account_id,
  STRFTIME(line_item_usage_start_date, '%Y-%m-%d') AS day_line_item_usage_start_date,
  line_item_resource_id,
  line_item_operation,
  CASE
    -- S3 Early Delete
    WHEN line_item_usage_type LIKE '%EarlyDelete-ByteHrs' THEN 'Early Delete Glacier'
    WHEN line_item_usage_type LIKE '%EarlyDelete%' THEN 'Early Delete ' || SPLIT_PART(line_item_usage_type, 'EarlyDelete-', 2)
    -- S3 Requests
    WHEN line_item_usage_type LIKE '%Requests-INT%' THEN 'Requests INT'
    WHEN (line_item_usage_type LIKE '%Requests-Tier1' OR line_item_usage_type LIKE '%Requests-Tier2') THEN 'Requests Standard'
    WHEN (line_item_usage_type LIKE '%Requests-GLACIER%' OR line_item_usage_type LIKE '%Requests-Tier3' OR line_item_usage_type LIKE '%Requests-Tier5' OR line_item_usage_type LIKE '%Requests-Tier6') THEN 'Requests Glacier'
    WHEN line_item_usage_type LIKE '%Requests-GDA%' THEN 'Requests GDA'
    WHEN line_item_usage_type LIKE '%Requests-GIR%' THEN 'Requests GIR'
    WHEN (line_item_usage_type LIKE '%Requests-Tier4' OR line_item_usage_type LIKE '%Requests-SIA%') THEN 'Requests SIA'
    WHEN line_item_usage_type LIKE '%Requests-ZIA%' THEN 'Requests ZIA'
    -- S3 Retrieval
    WHEN (line_item_usage_type LIKE '%Retrieval-Bytes' AND line_item_operation = 'RestoreObject') THEN 'Retrieval Glacier'
    WHEN (line_item_usage_type LIKE '%Retrieval-Bytes' AND line_item_operation = 'DeepArchiveRestoreObject') THEN 'Retrieval GDA'
    WHEN line_item_usage_type LIKE '%Retrieval%' THEN 'Retrieval ' || SPLIT_PART(line_item_usage_type, 'Retrieval-', 2)
    -- S3 Storage
    WHEN (line_item_usage_type LIKE '%TimedStorage%' AND line_item_operation = 'StandardStorage') THEN 'Storage Standard'
    WHEN (line_item_usage_type LIKE '%TimedStorage%' AND line_item_operation = 'StandardIAStorage') THEN 'Storage SIA'
    WHEN (line_item_usage_type LIKE '%TimedStorage%' AND line_item_operation = 'StandardIASizeOverhead') THEN 'Storage SIA-Overhead'
    WHEN (line_item_usage_type LIKE '%TimedStorage%' AND line_item_operation = 'OneZoneIAStorage') THEN 'Storage ZIA'
    WHEN (line_item_usage_type LIKE '%TimedStorage%' AND line_item_operation = 'OneZoneIASizeOverhead') THEN 'Storage ZIA-Overhead'
    WHEN (line_item_usage_type LIKE '%TimedStorage%' AND line_item_operation = 'GlacierInstantRetrievalStorage') THEN 'Storage GIR'
    WHEN (line_item_usage_type LIKE '%TimedStorage%' AND line_item_operation = 'GlacierIRSizeOverhead') THEN 'Storage GIR-Overhead'
    WHEN (line_item_usage_type LIKE '%TimedStorage%' AND line_item_operation = 'GlacierStorage') THEN 'Storage Glacier'
    WHEN (line_item_usage_type LIKE '%TimedStorage%' AND line_item_operation = 'GlacierStagingStorage') THEN 'Storage Glacier-Staging'
    WHEN (line_item_usage_type LIKE '%TimedStorage%' AND (line_item_operation = 'GlacierObjectOverhead' OR line_item_operation = 'GlacierS3ObjectOverhead')) THEN 'Storage Glacier-Overhead'
    WHEN (line_item_usage_type LIKE '%TimedStorage%' AND line_item_operation = 'RestoreObject') THEN 'Storage Glacier-Restored'
    WHEN (line_item_usage_type LIKE '%TimedStorage%' AND line_item_operation = 'DeepArchiveStorage') THEN 'Storage GDA'
    WHEN (line_item_usage_type LIKE '%TimedStorage%' AND line_item_operation = 'DeepArchiveStagingStorage') THEN 'Storage GDA-Staging'
    WHEN (line_item_usage_type LIKE '%TimedStorage%' AND (line_item_operation = 'DeepArchiveObjectOverhead' OR line_item_operation = 'DeepArchiveS3ObjectOverhead')) THEN 'Storage GDA-Overhead'
    WHEN (line_item_usage_type LIKE '%TimedStorage%' AND line_item_operation = 'DeepArchiveRestoreObject') THEN 'Storage GDA-Restored'
    WHEN (line_item_usage_type LIKE '%TimedStorage%' AND line_item_operation = 'ReducedRedundancyStorage') THEN 'Storage RRS'
    WHEN (line_item_usage_type LIKE '%TimedStorage%' AND line_item_operation LIKE 'IntelligentTieringAA%') THEN 'Storage INT-AA'
    WHEN (line_item_usage_type LIKE '%TimedStorage%' AND (line_item_operation = 'IntAAObjectOverhead' OR line_item_operation = 'IntAAS3ObjectOverhead')) THEN 'Storage INT-AA-Overhead'
    WHEN (line_item_usage_type LIKE '%TimedStorage%' AND line_item_operation LIKE 'IntelligentTieringDAA%') THEN 'Storage INT-DAA'
    WHEN (line_item_usage_type LIKE '%TimedStorage%' AND (line_item_operation = 'IntDAAObjectOverhead' OR line_item_operation = 'IntDAAS3ObjectOverhead')) THEN 'Storage INT-DAA-Overhead'
    WHEN (line_item_usage_type LIKE '%TimedStorage%' AND line_item_operation LIKE 'IntelligentTieringFA%') THEN 'Storage INT-FA'
    WHEN (line_item_usage_type LIKE '%TimedStorage%' AND line_item_operation LIKE 'IntelligentTieringIA%') THEN 'Storage INT-IA'
    WHEN (line_item_usage_type LIKE '%TimedStorage%' AND line_item_operation LIKE 'IntelligentTieringAIA%') THEN 'Storage INT-AIA'
    -- Data Transfer
    WHEN line_item_usage_type LIKE '%AWS-In-Bytes%' THEN 'Data Transfer Region to Region (In)'
    WHEN line_item_usage_type LIKE '%AWS-In-ABytes%' THEN 'Data Transfer Accelerated Region to Region (In)'
    WHEN line_item_usage_type LIKE '%AWS-Out-Bytes%' THEN 'Data Transfer Region to Region (Out)'
    WHEN line_item_usage_type LIKE '%AWS-Out-ABytes%' THEN 'Data Transfer Accelerated Region to Region (Out)'
    WHEN line_item_usage_type LIKE '%CloudFront-In-Bytes%' THEN 'Data Transfer CloudFront (In)'
    WHEN line_item_usage_type LIKE '%CloudFront-Out-Bytes%' THEN 'Data Transfer CloudFront (Out)'
    WHEN line_item_usage_type LIKE '%DataTransfer-Regional-Bytes%' THEN 'Data Transfer Inter AZ'
    WHEN line_item_usage_type LIKE '%DataTransfer-In-Bytes%' THEN 'Data Transfer Internet (In)'
    WHEN line_item_usage_type LIKE '%DataTransfer-Out-Bytes%' THEN 'Data Transfer Internet (Out)'
    WHEN line_item_usage_type LIKE '%DataTransfer-In-ABytes%' THEN 'Data Transfer Accelerated Internet (In)'
    WHEN line_item_usage_type LIKE '%DataTransfer-Out-ABytes%' THEN 'Data Transfer Accelerated Internet (Out)'
    WHEN line_item_usage_type LIKE '%S3RTC-In-Bytes%' THEN 'Data Transfer Replication Time Control (In)'
    WHEN line_item_usage_type LIKE '%S3RTC-Out-Bytes%' THEN 'Data Transfer Replication Time Control (Out)'
    -- S3 Fees & Misc
    WHEN line_item_usage_type LIKE '%Monitoring-Automation-INT' THEN 'S3 INT Monitoring Fee'
    WHEN line_item_usage_type LIKE '%StorageAnalytics%' THEN 'S3 Storage Analytics'
    WHEN line_item_usage_type LIKE '%BatchOperations-Jobs%' THEN 'S3 Batch Operations-Jobs'
    WHEN line_item_usage_type LIKE '%BatchOperations-Objects%' THEN 'S3 Batch Operations-Objects'
    WHEN line_item_usage_type LIKE '%TagStorage%' THEN 'S3 Tag Storage'
    WHEN (line_item_usage_type LIKE '%Select-Returned%' OR line_item_usage_type LIKE '%Select-Scanned%') THEN 'S3 Select'
    WHEN line_item_usage_type LIKE '%Inventory%' THEN 'S3 Inventory'
    WHEN line_item_operation LIKE '%StorageLens%' THEN 'Storage Lens'
    ELSE 'Other ' || line_item_usage_type
  END AS case_line_item_usage_type,
  SUM(CAST(line_item_usage_amount AS DOUBLE)) AS sum_line_item_usage_amount,
  SUM(CAST(line_item_unblended_cost AS DECIMAL(16,8))) AS sum_line_item_unblended_cost
FROM 'kakakakakku-cur-00001.snappy.parquet'
WHERE STRFTIME(line_item_usage_start_date, '%Y-%m-%d') BETWEEN '2025-01-01' AND '2025-01-14'
  AND line_item_product_code = 'AmazonS3'
  AND line_item_line_item_type IN ('DiscountedUsage', 'Usage', 'SavingsPlanCoveredUsage')
  AND product_product_family != 'Data Transfer'
GROUP BY
  bill_payer_account_id,
  line_item_usage_account_id,
  STRFTIME(line_item_usage_start_date, '%Y-%m-%d'),
  line_item_resource_id,
  line_item_operation,
  6 -- refers to case_line_item_usage_type
ORDER BY sum_line_item_unblended_cost DESC
LIMIT 10;
Running the query returned the following result \( 'ω')/
Since this is my personal sandbox AWS account there's barely any cost, but being able to drill down per resource and per operation is really handy ❗️
bill_payer_account_id │ line_item_usage_account_id │ day_line_item_usage_start_date │ line_item_resource_id        │ line_item_operation │ case_line_item_usage_type │ sum_line_item_usage_amount │ sum_line_item_unblended_cost
000000000000          │ 000000000000               │ 2025-01-01                     │                              │ ListAllMyBuckets    │ Requests Standard         │ 41.0                       │ 0.00020500
000000000000          │ 000000000000               │ 2025-01-12                     │ kakakakakku-sandbox-tfstates │ ListBucket          │ Requests Standard         │ 39.0                       │ 0.00018330
000000000000          │ 000000000000               │ 2025-01-01                     │ kakakakakku-s3-presign       │ ListBucket          │ Requests Standard         │ 28.0                       │ 0.00013160
000000000000          │ 000000000000               │ 2025-01-11                     │ kakakakakku-sandbox-tfstates │ ListBucket          │ Requests Standard         │ 27.0                       │ 0.00012690
000000000000          │ 000000000000               │ 2025-01-01                     │ kakakakakku-s3-presign       │ PutObject           │ Requests Standard         │ 25.0                       │ 0.00011750
000000000000          │ 000000000000               │ 2025-01-12                     │ kakakakakku-sandbox-tfstates │ PutObject           │ Requests Standard         │ 18.0                       │ 0.00008460
000000000000          │ 000000000000               │ 2025-01-01                     │ kakakakakku-cur              │ ListBucket          │ Requests Standard         │ 16.0                       │ 0.00008000
000000000000          │ 000000000000               │ 2025-01-12                     │ kakakakakku-sandbox-tfstates │ ListBucketVersions  │ Requests Standard         │ 14.0                       │ 0.00006580
000000000000          │ 000000000000               │ 2025-01-01                     │ kakakakakku-s3-presign       │ WriteBucketPolicy   │ Requests Standard         │ 10.0                       │ 0.00004700
000000000000          │ 000000000000               │ 2025-01-03                     │ kakakakakku-cur              │ ListBucket          │ Requests Standard         │ 8.0                        │ 0.00004000
(10 rows, 8 columns)
2. Amazon CloudWatch
Next, let's try the Amazon CloudWatch cost analysis query from AWS Well-Architected Cost & Usage Report Library - Management & Governance. It lets you analyze Amazon CloudWatch usage in detail 👌
Below is a summary of the changes I made to the Legacy CUR query from the query library 📝 (I also added LIMIT 10 to trim the result for this blog post)

- Used STRFTIME(), which DuckDB supports, instead of DATE_FORMAT()
- Replaced the ${table_name} placeholder with the Parquet-format CUR 2.0 data file 'kakakakakku-cur-00001.snappy.parquet'
- Replaced the ${date_filter} placeholder with a STRFTIME() condition covering 2025/01/01 to 2025/01/14
- Replaced the product_product_name column with the CUR 2.0 column line_item_product_code
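When adapting other library queries, a quick way to check whether a column from a Legacy CUR query still exists in your CUR 2.0 export is to filter the output of DESCRIBE, which DuckDB lets you use as a subquery. A sketch, assuming the same file name as above:

```sql
-- List the columns starting with 'product' in the CUR 2.0 export;
-- product_product_name from Legacy CUR will not appear here
SELECT column_name, column_type
FROM (DESCRIBE SELECT * FROM 'kakakakakku-cur-00001.snappy.parquet')
WHERE column_name LIKE 'product%';
```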
SELECT
  bill_payer_account_id,
  line_item_usage_account_id,
  STRFTIME(line_item_usage_start_date, '%Y-%m-%d') AS day_line_item_usage_start_date,
  CASE
    WHEN line_item_usage_type LIKE '%%Requests%%' THEN 'Requests'
    WHEN line_item_usage_type LIKE '%%DataProcessing-Bytes%%' THEN 'DataProcessing'
    WHEN line_item_usage_type LIKE '%%TimedStorage-ByteHrs%%' THEN 'Storage'
    WHEN line_item_usage_type LIKE '%%DataScanned-Bytes%%' THEN 'DataScanned'
    WHEN line_item_usage_type LIKE '%%AlarmMonitorUsage%%' THEN 'AlarmMonitors'
    WHEN line_item_usage_type LIKE '%%DashboardsUsageHour%%' THEN 'Dashboards'
    WHEN line_item_usage_type LIKE '%%MetricMonitorUsage%%' THEN 'MetricMonitor'
    WHEN line_item_usage_type LIKE '%%VendedLog-Bytes%%' THEN 'VendedLogs'
    WHEN line_item_usage_type LIKE '%%GMD-Metrics%%' THEN 'GetMetricData'
    ELSE 'Others'
  END AS line_item_usage_type,
  -- if uncommenting, also uncomment one other occurrence of line_item_resource_id in GROUP BY
  -- SPLIT_PART(line_item_resource_id, ':', 7) AS ResourceID,
  line_item_operation,
  SUM(CAST(line_item_usage_amount AS DOUBLE)) AS sum_line_item_usage_amount,
  SUM(CAST(line_item_unblended_cost AS DECIMAL(16,8))) AS sum_line_item_unblended_cost
FROM 'kakakakakku-cur-00001.snappy.parquet'
WHERE STRFTIME(line_item_usage_start_date, '%Y-%m-%d') BETWEEN '2025-01-01' AND '2025-01-14'
  AND line_item_product_code = 'AmazonCloudWatch'
  AND line_item_line_item_type IN ('DiscountedUsage', 'Usage', 'SavingsPlanCoveredUsage')
GROUP BY
  bill_payer_account_id,
  line_item_usage_account_id,
  STRFTIME(line_item_usage_start_date, '%Y-%m-%d'),
  line_item_usage_type,
  -- line_item_resource_id,
  line_item_operation
ORDER BY sum_line_item_unblended_cost DESC
LIMIT 10;
Running the query returned the following result \( 'ω')/
Since this is my personal sandbox AWS account there's barely any cost, but being able to drill down per usage type is really handy ❗️
Incidentally, looking at the result I noticed an unneeded Amazon CloudWatch Dashboard was still hanging around 💦 and was able to delete it right away 👌
bill_payer_account_id │ line_item_usage_account_id │ day_line_item_usage_start_date │ line_item_usage_type │ line_item_operation   │ sum_line_item_usage_amount │ sum_line_item_unblended_cost
000000000000          │ 000000000000               │ 2025-01-05                     │ Dashboards           │ DashboardHour         │ 0.032258064                │ 0.00000000
000000000000          │ 000000000000               │ 2025-01-05                     │ AlarmMonitors        │ Unknown               │ 0.032258064                │ 0.00000000
000000000000          │ 000000000000               │ 2025-01-10                     │ AlarmMonitors        │ Unknown               │ 0.032258064                │ 0.00000000
000000000000          │ 000000000000               │ 2025-01-01                     │ Storage              │ HourlyStorageMetering │ 8.702999999999998e-07      │ 0.00000000
000000000000          │ 000000000000               │ 2025-01-09                     │ Storage              │ HourlyStorageMetering │ 8.702999999999998e-07      │ 0.00000000
000000000000          │ 000000000000               │ 2025-01-12                     │ Storage              │ HourlyStorageMetering │ 8.702999999999998e-07      │ 0.00000000
000000000000          │ 000000000000               │ 2025-01-06                     │ Storage              │ HourlyStorageMetering │ 9.5485e-06                 │ 0.00000000
000000000000          │ 000000000000               │ 2025-01-09                     │ Storage              │ HourlyStorageMetering │ 9.5485e-06                 │ 0.00000000
000000000000          │ 000000000000               │ 2025-01-09                     │ Dashboards           │ DashboardHour         │ 0.032258064                │ 0.00000000
000000000000          │ 000000000000               │ 2025-01-14                     │ Dashboards           │ DashboardHour         │ 0.032258064                │ 0.00000000
(10 rows, 7 columns)
Installing DuckDB on AWS CloudShell
In this post I downloaded the Parquet-format CUR 2.0 data from Amazon S3 to my local machine before loading it with DuckDB, but if you want to read from Amazon S3 directly, installing DuckDB on AWS CloudShell gives you a handy ad-hoc query environment. Recommended ❗️
$ curl --fail --location --progress-bar --output duckdb_cli-linux-amd64.zip \
    https://github.com/duckdb/duckdb/releases/download/v1.1.3/duckdb_cli-linux-amd64.zip \
  && unzip duckdb_cli-linux-amd64.zip
$ ./duckdb
D CREATE SECRET ( TYPE S3, PROVIDER CREDENTIAL_CHAIN );
┌─────────┐
│ Success │
│ boolean │
├─────────┤
│ true    │
└─────────┘
D .maxrows 200
D DESCRIBE SELECT * FROM read_parquet('s3://kakakakakku-cur/cur/kakakakakku-cur/data/BILLING_PERIOD=2025-01/kakakakakku-cur-00001.snappy.parquet');

column_name │ column_type
bill_bill_type │ VARCHAR
bill_billing_entity │ VARCHAR
bill_billing_period_end_date │ TIMESTAMP
bill_billing_period_start_date │ TIMESTAMP
bill_invoice_id │ VARCHAR
bill_invoicing_entity │ VARCHAR
bill_payer_account_id │ VARCHAR
bill_payer_account_name │ VARCHAR
cost_category │ MAP(VARCHAR, VARCHAR)
discount │ MAP(VARCHAR, DOUBLE)
discount_bundled_discount │ DOUBLE
discount_total_discount │ DOUBLE
identity_line_item_id │ VARCHAR
identity_time_interval │ VARCHAR
line_item_availability_zone │ VARCHAR
line_item_blended_cost │ DOUBLE
line_item_blended_rate │ VARCHAR
line_item_currency_code │ VARCHAR
line_item_legal_entity │ VARCHAR
line_item_line_item_description │ VARCHAR
line_item_line_item_type │ VARCHAR
line_item_net_unblended_cost │ DOUBLE
line_item_net_unblended_rate │ VARCHAR
line_item_normalization_factor │ DOUBLE
line_item_normalized_usage_amount │ DOUBLE
line_item_operation │ VARCHAR
line_item_product_code │ VARCHAR
line_item_tax_type │ VARCHAR
line_item_unblended_cost │ DOUBLE
line_item_unblended_rate │ VARCHAR
line_item_usage_account_id │ VARCHAR
line_item_usage_account_name │ VARCHAR
line_item_usage_amount │ DOUBLE
line_item_usage_end_date │ TIMESTAMP
line_item_usage_start_date │ TIMESTAMP
line_item_usage_type │ VARCHAR
pricing_currency │ VARCHAR
pricing_lease_contract_length │ VARCHAR
pricing_offering_class │ VARCHAR
pricing_public_on_demand_cost │ DOUBLE
pricing_public_on_demand_rate │ VARCHAR
pricing_purchase_option │ VARCHAR
pricing_rate_code │ VARCHAR
pricing_rate_id │ VARCHAR
pricing_term │ VARCHAR
pricing_unit │ VARCHAR
product │ MAP(VARCHAR, VARCHAR)
product_comment │ VARCHAR
product_fee_code │ VARCHAR
product_fee_description │ VARCHAR
product_from_location │ VARCHAR
product_from_location_type │ VARCHAR
product_from_region_code │ VARCHAR
product_instance_family │ VARCHAR
product_instance_type │ VARCHAR
product_instancesku │ VARCHAR
product_location │ VARCHAR
product_location_type │ VARCHAR
product_operation │ VARCHAR
product_pricing_unit │ VARCHAR
product_product_family │ VARCHAR
product_region_code │ VARCHAR
product_servicecode │ VARCHAR
product_sku │ VARCHAR
product_to_location │ VARCHAR
product_to_location_type │ VARCHAR
product_to_region_code │ VARCHAR
product_usagetype │ VARCHAR
reservation_amortized_upfront_cost_for_usage │ DOUBLE
reservation_amortized_upfront_fee_for_billing_period │ DOUBLE
reservation_availability_zone │ VARCHAR
reservation_effective_cost │ DOUBLE
reservation_end_time │ VARCHAR
reservation_modification_status │ VARCHAR
reservation_net_amortized_upfront_cost_for_usage │ DOUBLE
reservation_net_amortized_upfront_fee_for_billing_period │ DOUBLE
reservation_net_effective_cost │ DOUBLE
reservation_net_recurring_fee_for_usage │ DOUBLE
reservation_net_unused_amortized_upfront_fee_for_billing_period │ DOUBLE
reservation_net_unused_recurring_fee │ DOUBLE
reservation_net_upfront_value │ DOUBLE
reservation_normalized_units_per_reservation │ VARCHAR
reservation_number_of_reservations │ VARCHAR
reservation_recurring_fee_for_usage │ DOUBLE
reservation_reservation_a_r_n │ VARCHAR
reservation_start_time │ VARCHAR
reservation_subscription_id │ VARCHAR
reservation_total_reserved_normalized_units │ VARCHAR
reservation_total_reserved_units │ VARCHAR
reservation_units_per_reservation │ VARCHAR
reservation_unused_amortized_upfront_fee_for_billing_period │ DOUBLE
reservation_unused_normalized_unit_quantity │ DOUBLE
reservation_unused_quantity │ DOUBLE
reservation_unused_recurring_fee │ DOUBLE
reservation_upfront_value │ DOUBLE
resource_tags │ MAP(VARCHAR, VARCHAR)
savings_plan_amortized_upfront_commitment_for_billing_period │ DOUBLE
savings_plan_end_time │ VARCHAR
savings_plan_instance_type_family │ VARCHAR
savings_plan_net_amortized_upfront_commitment_for_billing_period │ DOUBLE
savings_plan_net_recurring_commitment_for_billing_period │ DOUBLE
savings_plan_net_savings_plan_effective_cost │ DOUBLE
savings_plan_offering_type │ VARCHAR
savings_plan_payment_option │ VARCHAR
savings_plan_purchase_term │ VARCHAR
savings_plan_recurring_commitment_for_billing_period │ DOUBLE
savings_plan_region │ VARCHAR
savings_plan_savings_plan_a_r_n │ VARCHAR
savings_plan_savings_plan_effective_cost │ DOUBLE
savings_plan_savings_plan_rate │ DOUBLE
savings_plan_start_time │ VARCHAR
savings_plan_total_commitment_to_date │ DOUBLE
savings_plan_used_commitment │ DOUBLE
line_item_resource_id │ VARCHAR
BILLING_PERIOD │ VARCHAR
(115 rows, 6 columns)
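One nice property of this setup: read_parquet() also accepts glob patterns, so (assuming the same S3 path layout as above) you can query across multiple billing periods in one statement and have the BILLING_PERIOD Hive partition exposed as a column:

```sql
-- Query all billing periods at once; hive_partitioning makes
-- BILLING_PERIOD available as a regular column (path layout as in this post)
SELECT BILLING_PERIOD, SUM(line_item_unblended_cost) AS total_cost
FROM read_parquet(
    's3://kakakakakku-cur/cur/kakakakakku-cur/data/BILLING_PERIOD=*/*.parquet',
    hive_partitioning = true
)
GROUP BY BILLING_PERIOD
ORDER BY BILLING_PERIOD;
```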
Summary
Use DuckDB for your CUR 2.0 analysis ❗️
Related
Setting up a Cost and Usage Reports 2.0 (CUR 2.0) export with AWS CDK 📝
To check what each column in the CUR 2.0 data means, see the documentation 📝