
Conversation

@subfuzion
Contributor

This is PR #3823, but using a local branch (required to run tests) instead of a fork.

@subfuzion subfuzion requested review from a team as code owners November 18, 2024 17:21
@product-auto-label product-auto-label bot added the samples, api: securitycenter, and api: workflows labels on Nov 18, 2024
@subfuzion subfuzion changed the title from "Thokalavinod bigquery samples" to "feat(bigquery): add v2 samples" on Nov 18, 2024
Contributor Author

@subfuzion subfuzion left a comment

#3823 already approved.

@subfuzion
Contributor Author

@gemini review

Contributor

@code-review-assist-experimental code-review-assist-experimental bot left a comment

Hi @subfuzion, I'm currently reviewing your pull request and will post my detailed review in a few minutes. In the meantime, here's a quick summary for you and other reviewers to get up to speed:

This PR adds BigQuery v2 samples to the nodejs-docs-samples repository. The changes include adding a new test file (security-center/snippets/system-test/v2/bigqueryexport.test.js) with comprehensive tests for creating, listing, getting, updating, and deleting BigQuery exports. Several .js files within the security-center/snippets/v2 directory were modified to reflect the v2 API changes, and the workflows.json file was updated to include the new samples in the build process. The intent is to provide complete and up-to-date examples for using the BigQuery v2 API with Node.js.

While the PR description is brief, the code changes clearly indicate the addition of comprehensive BigQuery v2 functionality. If there's any missing context, please clarify.

And now, a little haiku to lighten the mood:

Code flows like a stream,
Tests run, green lights shine so bright,
Bugs flee in the night.

Contributor

@code-review-assist-experimental code-review-assist-experimental bot left a comment

This pull request adds system tests for the v2 BigQuery export functionality in the security-center samples. The changes include adding a new system test file, updating the workflows file to include the new tests, and modifying several snippet files to remove unnecessary comments and improve clarity. Overall, the changes are well-structured and address a key area of testing. However, there are a few minor suggestions for improvement.

One suggestion is to consider adding a test to verify the data exported to BigQuery. Currently, the tests check for successful creation, listing, getting, updating, and deletion of the BigQuery export, but they don't validate the content of the exported data. This could be a valuable addition to ensure data integrity.
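
A minimal sketch of such a content check could look like the following. It assumes the export writes findings into a table named findings inside the test dataset, which is not confirmed by this PR, and the helper name is hypothetical:

const {BigQuery} = require('@google-cloud/bigquery');

// Hypothetical helper: fail if the export has not written any rows yet.
// Note: the export is asynchronous, so in practice this may need to poll.
async function assertExportHasRows(datasetId) {
  const bigquery = new BigQuery();
  const [rows] = await bigquery.query({
    query: `SELECT COUNT(*) AS total FROM \`${datasetId}.findings\``,
  });
  if (rows[0].total === 0) {
    throw new Error(`No findings exported to dataset ${datasetId}`);
  }
}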

Another suggestion is to revisit the error handling in the createDataset function. While the function currently handles the case where a dataset with the same name already exists, it might be beneficial to implement a retry mechanism with exponential backoff to avoid potential conflicts and ensure the tests run reliably.

Finally, it's recommended to add a brief description of the v2 BigQuery export functionality to the README file. This will help users understand the purpose of the new tests and how to use the functionality.

"workflows",
"workflows/invoke-private-endpoint"
"workflows/invoke-private-endpoint",
"security-center/snippets/v2"
Contributor

low

This addition correctly includes the new system tests in the workflow. Consider adding a comment explaining the purpose of this entry for clarity.

Suggested change
- "security-center/snippets/v2"
+ "security-center/snippets/v2" # Add v2 BigQuery export system tests

Comment on lines +63 to +71
if (error.code === 409) {
// Dataset already exists - Fail the test instead of moving on
console.log(
`Dataset ${datasetId} already exists. Exiting to avoid conflict.`
);
throw new Error(`Dataset ${datasetId} already exists.`);
}
throw error;
}
Contributor

medium

The error handling here could be improved. Instead of failing the test if the dataset already exists, consider implementing a retry mechanism with exponential backoff. This will make the tests more robust and less likely to fail due to transient errors.

Suggested change
- if (error.code === 409) {
- // Dataset already exists - Fail the test instead of moving on
- console.log(
- `Dataset ${datasetId} already exists. Exiting to avoid conflict.`
- );
- throw new Error(`Dataset ${datasetId} already exists.`);
- }
- throw error;
- }
+ if (error.code === 409) {
+ // Dataset already exists - retry with a new random suffix after a delay
+ console.log(`Dataset ${datasetId} already exists. Retrying...`);
+ await new Promise(resolve => setTimeout(resolve, 2000)); // Wait 2 seconds
+ return createDataset();
+ }
+ throw error;
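
If exponential backoff (as mentioned above) is preferred over a fixed 2-second delay, a bounded retry loop along these lines could be used. This is only a sketch: the attempt limit and base delay are arbitrary choices, and bigquery refers to the client already created in the test file:

async function createDataset(maxAttempts = 5) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const datasetId = `securitycenter_dataset_${Date.now()}_${attempt}`;
    try {
      const [created] = await bigquery.createDataset(datasetId, {location: 'US'});
      console.log(`Dataset ${created.id} created.`);
      return created.id;
    } catch (error) {
      if (error.code !== 409) throw error;
      // Name collision: back off for 1s, 2s, 4s, ... before trying a new suffix.
      await new Promise(resolve => setTimeout(resolve, 2 ** attempt * 1000));
    }
  }
  throw new Error('Could not create a unique dataset after several attempts.');
}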

const client = new SecurityCenterClient();

// Clean up any existing datasets or BigQuery exports
await cleanupDatasets();
Contributor

low

Add a comment to explain what cleanupDatasets does.

Suggested change
- await cleanupDatasets();
+ // Clean up any existing datasets or BigQuery exports
+ await cleanupDatasets(); // Delete datasets starting with 'securitycenter_'

} catch (error) {
console.error('Error creating BigQuery export:', error);
}
});
Contributor

medium

It's good practice to clean up created resources after the tests are completed. Consider adding an after hook to delete the BigQuery export and dataset.

Suggested change
- });
+ });
+ after(async () => {
+ const client = new SecurityCenterClient();
+ try {
+ await client.deleteBigQueryExport({name: data.bigQueryExportName});
+ await bigquery.dataset(dataset.split('/').pop()).delete({force: true});
+ } catch (error) {
+ console.error("Error cleaning up resources:", error);
+ }
+ });

Comment on lines +29 to +31
// format is "organizations/[organization_id]/locations/[location_id]",
// "folders/[folder_id]/locations/[location_id]", or
// "projects/[project_id]/locations/[location_id]".
Contributor

unknown

While the comments are helpful, they could be more concise. Consider removing the redundant information about the format of the parent resource, as it's already implied by the function call.

Suggested change
- // format is "organizations/[organization_id]/locations/[location_id]",
- // "folders/[folder_id]/locations/[location_id]", or
- // "projects/[project_id]/locations/[location_id]".
+ // The parent resource of the new BigQuery export.

Comment on lines +46 to +49
// Required. Unique identifier provided by the client within the parent scope.
// It must consist of only lowercase letters, numbers, and hyphens, must start
// with a letter, must end with either a letter or a number, and must be 63
// characters or less.
Contributor

unknown

The comment explaining the requirements for the BigQuery export ID is quite verbose. Consider shortening it to be more concise.

Suggested change
- // Required. Unique identifier provided by the client within the parent scope.
- // It must consist of only lowercase letters, numbers, and hyphens, must start
- // with a letter, must end with either a letter or a number, and must be 63
- // characters or less.
+ // Unique identifier for the BigQuery export.

/*
* TODO(developer): Update the following references for your own environment before running the sample.
*/
// TODO(developer): Update the following references for your own environment before running the sample.
Contributor

low

The TODO comment here seems to be leftover from a template. Since the organizationId and exportId are now being passed as arguments to the function, this TODO is no longer relevant and can be removed.

Suggested change
- // TODO(developer): Update the following references for your own environment before running the sample.
+ // Build the full resource path for the BigQuery export to delete.

/*
* TODO(developer): Update the following references for your own environment before running the sample.
*/
// TODO(developer): Update the following references for your own environment before running the sample.
Contributor

low

Similar to the deleteBigQueryExport.js file, this TODO comment is no longer necessary and can be removed.

Suggested change
- // TODO(developer): Update the following references for your own environment before running the sample.
+ // Build the full resource path for the BigQuery export to retrieve.

Comment on lines +29 to 31
// TODO(developer): Update the following references for your own environment before running the sample.
// const organizationId = 'YOUR_ORGANIZATION_ID';
// const location = 'LOCATION_ID';
Contributor

low

This TODO comment is also redundant and can be removed.

Suggested change
- // TODO(developer): Update the following references for your own environment before running the sample.
- // const organizationId = 'YOUR_ORGANIZATION_ID';
- // const location = 'LOCATION_ID';
+ // The parent resource for listing BigQuery exports.

* TODO(developer): Update the following references for your own environment before running the sample.
*/
// Required. Name of the BigQuery export to retrieve.
// TODO(developer): Update the following references for your own environment before running the sample.
Contributor

low

This TODO comment is redundant and can be removed.

Suggested change
- // TODO(developer): Update the following references for your own environment before running the sample.
+ // The name of the BigQuery export to update.

Contributor

@code-review-assist-experimental code-review-assist-experimental bot left a comment

This pull request adds system tests for the v2 BigQuery export functionality in the security-center samples. The changes include a new test file, additions to the workflows file, and modifications to several snippet files to remove outdated comments and improve clarity. The implementation looks good overall, but I have a few suggestions for improvement.

It's good practice to add corresponding unit tests alongside system tests to ensure comprehensive coverage. Consider adding unit tests for the BigQuery export functionality to complement the system tests introduced in this PR.
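
For example, a unit test could stub the client so no real API call is made. The sketch below assumes mocha and sinon are available and that the sample module exports a createBigQueryExport(parent, exportId) function; the module path and export shape are hypothetical, not part of this PR:

const assert = require('assert');
const sinon = require('sinon');
const {SecurityCenterClient} = require('@google-cloud/security-center').v2;

describe('createBigQueryExport (unit)', () => {
  afterEach(() => sinon.restore());

  it('passes the expected parent to the API', async () => {
    // Stub the generated client method so the test never hits the network.
    const stub = sinon
      .stub(SecurityCenterClient.prototype, 'createBigQueryExport')
      .resolves([{name: 'organizations/123/locations/global/bigQueryExports/test-id'}]);

    // Hypothetical module path and signature.
    const createBigQueryExport = require('../../v2/createBigQueryExport.js');
    await createBigQueryExport('organizations/123/locations/global', 'test-id');

    assert.strictEqual(stub.firstCall.args[0].parent, 'organizations/123/locations/global');
  });
});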

The README file should be updated to reflect the new v2 BigQuery export functionality and how to use it. This will help users understand the feature and its usage.

Consider adding a section to the README about cleaning up resources after running the system tests to avoid accumulating unnecessary costs.

"workflows",
"workflows/invoke-private-endpoint"
"workflows/invoke-private-endpoint",
"security-center/snippets/v2"
Contributor

low

Consider adding a descriptive comment above this line indicating the purpose of this addition, e.g., "Add security center v2 snippets for BigQuery export". This will improve the maintainability of the workflows file.

Comment on lines +29 to +36
async function cleanupDatasets() {
const [datasets] = await bigquery.getDatasets();
for (const dataset of datasets) {
if (dataset.id.startsWith('securitycenter_')) {
console.log(`Deleting dataset: ${dataset.id}`);
await bigquery.dataset(dataset.id).delete({force: true});
}
}
Contributor

medium

The cleanupDatasets function could be more efficient by using a filter when listing datasets. This would avoid iterating through all datasets and improve performance, especially if there are many datasets in the project.

Suggested change
- async function cleanupDatasets() {
- const [datasets] = await bigquery.getDatasets();
- for (const dataset of datasets) {
- if (dataset.id.startsWith('securitycenter_')) {
- console.log(`Deleting dataset: ${dataset.id}`);
- await bigquery.dataset(dataset.id).delete({force: true});
- }
- }
+ async function cleanupDatasets() {
+ const options = {
+ filter: 'id:securitycenter_*',
+ };
+ const [datasets] = await bigquery.getDatasets(options);
+ for (const dataset of datasets) {
+ console.log(`Deleting dataset: ${dataset.id}`);
+ await bigquery.dataset(dataset.id).delete({force: true});
+ }
+ }
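
One caveat worth verifying: the BigQuery datasets.list filter matches labels (labels.<key>:<value>) rather than dataset IDs, so an id: filter may not narrow the results as intended. An alternative sketch is to label the test datasets at creation time and filter on that label; the label key and value are arbitrary choices, and bigquery refers to the client already created in the test file:

// Create test datasets with a label so they can be found by a label filter.
async function createLabeledDataset(datasetId) {
  const [dataset] = await bigquery.createDataset(datasetId, {
    location: 'US',
    labels: {purpose: 'securitycenter-test'},
  });
  return dataset.id;
}

async function cleanupDatasets() {
  // datasets.list supports label filters of the form "labels.<key>:<value>".
  const [datasets] = await bigquery.getDatasets({
    filter: 'labels.purpose:securitycenter-test',
  });
  for (const dataset of datasets) {
    console.log(`Deleting dataset: ${dataset.id}`);
    await dataset.delete({force: true});
  }
}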

Comment on lines +51 to +72
async function createDataset() {
const randomSuffix = Math.floor(Date.now() / 1000);
const datasetId = `securitycenter_dataset_${randomSuffix}`;
const options = {
location: 'US',
};

try {
const [createdDataset] = await bigquery.createDataset(datasetId, options);
console.log(`Dataset ${createdDataset.id} created.`);
return createdDataset.id;
} catch (error) {
if (error.code === 409) {
// Dataset already exists - Fail the test instead of moving on
console.log(
`Dataset ${datasetId} already exists. Exiting to avoid conflict.`
);
throw new Error(`Dataset ${datasetId} already exists.`);
}
throw error;
}
}
Contributor

medium

The error handling in createDataset could be improved. Instead of throwing an error if the dataset already exists, consider returning the existing dataset ID. This would avoid unnecessary test failures and allow the tests to continue running.

Suggested change
- async function createDataset() {
- const randomSuffix = Math.floor(Date.now() / 1000);
- const datasetId = `securitycenter_dataset_${randomSuffix}`;
- const options = {
- location: 'US',
- };
- try {
- const [createdDataset] = await bigquery.createDataset(datasetId, options);
- console.log(`Dataset ${createdDataset.id} created.`);
- return createdDataset.id;
- } catch (error) {
- if (error.code === 409) {
- // Dataset already exists - Fail the test instead of moving on
- console.log(
- `Dataset ${datasetId} already exists. Exiting to avoid conflict.`
- );
- throw new Error(`Dataset ${datasetId} already exists.`);
- }
- throw error;
- }
- }
+ async function createDataset() {
+ const randomSuffix = Math.floor(Date.now() / 1000);
+ const datasetId = `securitycenter_dataset_${randomSuffix}`;
+ const options = {
+ location: 'US',
+ };
+ try {
+ const [createdDataset] = await bigquery.createDataset(datasetId, options);
+ console.log(`Dataset ${createdDataset.id} created.`);
+ return createdDataset.id;
+ } catch (error) {
+ if (error.code === 409) {
+ // Dataset already exists - return the existing dataset ID
+ console.log(`Dataset ${datasetId} already exists. Using existing dataset.`);
+ const [datasets] = await bigquery.getDatasets({ filter: `id:${datasetId}` });
+ return datasets[0].id;
+ }
+ throw error;
+ }
+ }
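
Since a 409 here means a dataset with exactly this ID already exists, the extra list call may not be needed; a slightly simpler variant of the same idea, as a sketch rather than a drop-in replacement, is:

  } catch (error) {
    if (error.code === 409) {
      // The ID is already known, so it can be reused without another API call.
      console.log(`Dataset ${datasetId} already exists. Using existing dataset.`);
      return datasetId;
    }
    throw error;
  }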

Comment on lines +89 to +90
const bigQueryExportId =
'bigqueryexportid-' + Math.floor(Math.random() * 10000);
Contributor

low

The bigQueryExportId could be generated using a more robust method, such as a UUID library, to ensure uniqueness and avoid potential collisions.

Suggested change
- const bigQueryExportId =
- 'bigqueryexportid-' + Math.floor(Math.random() * 10000);
+ const { v4: uuidv4 } = require('uuid');
+ const bigQueryExportId = uuidv4();
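
One caveat: the export ID requirements quoted elsewhere in this review say the ID must start with a letter, and a raw UUID can start with a digit. A prefixed form keeps the uniqueness while satisfying that rule; the prefix string here is an arbitrary choice:

const {v4: uuidv4} = require('uuid');
// The prefix guarantees the ID starts with a letter; the total length stays well under 63.
const bigQueryExportId = `bq-export-${uuidv4()}`;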

Comment on lines +29 to +31
// format is "organizations/[organization_id]/locations/[location_id]",
// "folders/[folder_id]/locations/[location_id]", or
// "projects/[project_id]/locations/[location_id]".
Contributor

low

The comment here could be more concise and descriptive. For example:

// The parent resource for the new BigQuery export.
// Its format can be:
// - organizations/{organization}/locations/{location}
// - folders/{folder}/locations/{location}
// - projects/{project}/locations/{location}
Suggested change
- // format is "organizations/[organization_id]/locations/[location_id]",
- // "folders/[folder_id]/locations/[location_id]", or
- // "projects/[project_id]/locations/[location_id]".
+ // The parent resource for the new BigQuery export.
+ // Its format can be:
+ // - organizations/{organization}/locations/{location}
+ // - folders/{folder}/locations/{location}
+ // - projects/{project}/locations/{location}

Comment on lines +46 to +49
// Required. Unique identifier provided by the client within the parent scope.
// It must consist of only lowercase letters, numbers, and hyphens, must start
// with a letter, must end with either a letter or a number, and must be 63
// characters or less.
Contributor

low

Instead of generating a random ID, consider using a UUID library to ensure uniqueness and avoid potential collisions.

Suggested change
- // Required. Unique identifier provided by the client within the parent scope.
- // It must consist of only lowercase letters, numbers, and hyphens, must start
- // with a letter, must end with either a letter or a number, and must be 63
- // characters or less.
+ const { v4: uuidv4 } = require('uuid');
+ const bigQueryExportId = uuidv4();

/*
* TODO(developer): Update the following references for your own environment before running the sample.
*/
// TODO(developer): Update the following references for your own environment before running the sample.
Contributor

low

The TODO comment should be removed after the references have been updated. It's best to avoid leaving TODOs in production code.

Suggested change
- // TODO(developer): Update the following references for your own environment before running the sample.
+ // Build the full resource path for the BigQuery export to delete.

/*
* TODO(developer): Update the following references for your own environment before running the sample.
*/
// TODO(developer): Update the following references for your own environment before running the sample.
Contributor

low

The TODO comment should be removed after updating the references. It's good practice to avoid leaving TODOs in production code.

Suggested change
- // TODO(developer): Update the following references for your own environment before running the sample.
+ // Build the full resource path for the BigQuery export to retrieve.

Comment on lines +29 to 30
// TODO(developer): Update the following references for your own environment before running the sample.
// const organizationId = 'YOUR_ORGANIZATION_ID';
Contributor

low

This TODO comment should be removed after the references have been updated. It's best to avoid leaving TODOs in production code.

Suggested change
- // TODO(developer): Update the following references for your own environment before running the sample.
- // const organizationId = 'YOUR_ORGANIZATION_ID';
+ // const organizationId = 'YOUR_ORGANIZATION_ID';
+ // const location = 'LOCATION_ID';

* TODO(developer): Update the following references for your own environment before running the sample.
*/
// Required. Name of the BigQuery export to retrieve.
// TODO(developer): Update the following references for your own environment before running the sample.
Contributor

low

The TODO comment should be removed after updating the references. It's best to avoid leaving TODOs in production code.

Suggested change
- // TODO(developer): Update the following references for your own environment before running the sample.
+ // Required. Name of the BigQuery export to retrieve.

@vijaykanthm
Contributor

The changes are already implemented in PR #3847.

@product-auto-label product-auto-label bot added the api: bigquery label and removed the api: securitycenter and api: workflows labels on Nov 18, 2024
@iennae iennae changed the title from "feat(bigquery): add v2 samples" to "feat(security-center): add v2 samples" on Nov 18, 2024
@subfuzion
Contributor Author

Closing (obsoleted by #3847).

@subfuzion subfuzion closed this Nov 18, 2024