Monday, November 11, 2024

Using BigQuery and Cloud Logging to Analyze BigQuery Usage




Introduction

Cloud Logging lets you store, search, analyze, monitor, and alert on log data and events from Google Cloud, including BigQuery. Cloud Operations also provides the ability to export specific logs to sinks such as Cloud Pub/Sub, Cloud Storage, or BigQuery.

Hands-on Lab

Task 1. Open BigQuery

  • Open the Navigation menu > BigQuery.

Task 2. Create a dataset

  • Under Explorer
  • Click Create dataset
  • Set the Dataset ID to bq_logs
  • Click Create
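If you prefer the command line, the same dataset can be created with the bq CLI in Cloud Shell (a sketch; assumes you are already authenticated and a default project is set):

```
# Create the dataset that will later receive the exported audit logs
bq mk --dataset bq_logs
```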

Task 3. Run a query

SELECT current_date

Task 4. Set up log export from Cloud Logging

  • Go to Navigation menu > Logging > Logs Explorer
  • Under Resource, select BigQuery, then click Apply.
  • Click Run query
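Equivalently, you can type a filter directly into the Logs Explorer query box. A minimal filter that matches completed BigQuery jobs might look like the following (the methodName value assumes the older BigQuery audit-log format, which is what the view in Task 6 queries):

```
resource.type="bigquery_resource"
protoPayload.methodName="jobservice.jobcompleted"
```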

Create sink

Fill in the fields:

Sink name: JobComplete and click NEXT.
Select sink service: BigQuery dataset
Select BigQuery dataset (Destination): bq_logs (the dataset you set up previously)
Leave the rest of the options at the default settings.
  • Click Create
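The same sink can also be created from the command line (a sketch; the log filter is an assumption matching the Logs Explorer query above, and <YOUR-PROJECT-ID> is a placeholder as elsewhere in this post). Note that after creating a sink this way, you must also grant its writer service account permission to write to the dataset (e.g. the BigQuery Data Editor role):

```
gcloud logging sinks create JobComplete \
  bigquery.googleapis.com/projects/<YOUR-PROJECT-ID>/datasets/bq_logs \
  --log-filter='resource.type="bigquery_resource" AND protoPayload.methodName="jobservice.jobcompleted"'
```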

Task 5. Run example queries

bq query --location=us --use_legacy_sql=false --use_cache=false \
'SELECT fullName, AVG(CL.numberOfYears) avgyears
 FROM `qwiklabs-resources.qlbqsamples.persons_living`, UNNEST(citiesLived) as CL
 GROUP BY fullName'
bq query --location=us --use_legacy_sql=false --use_cache=false \
'select month, avg(mean_temp) as avgtemp from `qwiklabs-resources.qlweather_geo.gsod`
 where station_number = 947680
 and year = 2010
 group by month
 order by month'
bq query --location=us --use_legacy_sql=false --use_cache=false \
'select CONCAT(departure_airport, "-", arrival_airport) as route, count(*) as numberflights
 from `bigquery-samples.airline_ontime_data.airline_id_codes` ac,
 `qwiklabs-resources.qlairline_ontime_data.flights` fl
 where ac.code = fl.airline_code
 and regexp_contains(ac.airline ,  r"Alaska")
 group by 1
 order by 2 desc
 LIMIT 10'

Task 6. Viewing the logs in BigQuery

  • Go to Navigation menu > BigQuery
  • Click Compose New Query in the Query editor
CREATE OR REPLACE VIEW
  bq_logs.v_querylogs AS
SELECT
  resource.labels.project_id,
  protopayload_auditlog.authenticationInfo.principalEmail,
  protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobConfiguration.query.query,
  protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobConfiguration.query.statementType,
  protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatus.error.message,
  protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.startTime,
  protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.endTime,
  TIMESTAMP_DIFF(protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.endTime, protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.startTime, MILLISECOND)/1000 AS run_seconds,
  protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.totalProcessedBytes,
  protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.totalSlotMs,
  ARRAY(SELECT AS STRUCT datasetId, tableId FROM UNNEST(protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.referencedTables)) AS tables_ref,
  protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.totalTablesProcessed,
  protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.queryOutputRowCount,
  severity
FROM
  `<YOUR-PROJECT-ID>.bq_logs.cloudaudit_googleapis_com_data_access_*`
ORDER BY
  startTime
  • To test the view, run:
SELECT * FROM bq_logs.v_querylogs
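Once log rows have been exported, the view can answer usage questions directly. For example, a hypothetical follow-up query that lists the five longest-running jobs and who ran them:

```
-- Hypothetical example: the five slowest jobs recorded by the sink
SELECT
  principalEmail,
  query,
  run_seconds,
  totalProcessedBytes
FROM bq_logs.v_querylogs
ORDER BY run_seconds DESC
LIMIT 5
```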

Closing

Friends of the Learning & Doing blog, that wraps up this walkthrough of Using BigQuery and Cloud Logging to Analyze BigQuery Usage. I hope it was useful. See you in the next post.

