feat(security-center): add v2 samples #3919
Conversation
…dejs-docs-samples into bigquery-samples
subfuzion left a comment
#3823 already approved.
@gemini review
Hi @subfuzion, I'm currently reviewing your pull request and will post my detailed review in a few minutes. In the meantime, here's a quick summary for you and other reviewers to get up to speed:
This PR adds BigQuery v2 samples to the nodejs-docs-samples repository. The changes include adding a new test file (security-center/snippets/system-test/v2/bigqueryexport.test.js) with comprehensive tests for creating, listing, getting, updating, and deleting BigQuery exports. Several .js files within the security-center/snippets/v2 directory were modified to reflect the v2 API changes, and the workflows.json file was updated to include the new samples in the build process. The intent is to provide complete and up-to-date examples for using the BigQuery v2 API with Node.js.
While the PR description is brief, the code changes clearly indicate the addition of comprehensive BigQuery v2 functionality. If there's any missing context, please clarify.
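For readers who want a concrete picture of what these samples exercise, here is a minimal sketch of creating a v2 BigQuery export. The import path, parent format, filter, and dataset name are illustrative assumptions, not code from this PR:

```javascript
// Minimal sketch (illustration only, not code from this PR).
// Assumes @google-cloud/security-center exposes a v2 SecurityCenterClient.
const {SecurityCenterClient} = require('@google-cloud/security-center').v2;

async function createBigQueryExportExample() {
  const client = new SecurityCenterClient();

  // All identifiers below are placeholders; replace them with your own values.
  const [response] = await client.createBigQueryExport({
    parent: 'organizations/ORGANIZATION_ID/locations/global',
    bigQueryExport: {
      description: 'Export of Security Command Center findings',
      filter: 'severity="LOW" OR severity="MEDIUM"',
      dataset: 'projects/PROJECT_ID/datasets/DATASET_ID',
    },
    bigQueryExportId: 'my-bigquery-export',
  });
  console.log('Created BigQuery export:', response.name);
}
```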
And now, a little haiku to lighten the mood:
Code flows like a stream,
Tests run, green lights shine so bright,
Bugs flee in the night.
This pull request adds system tests for the v2 BigQuery export functionality in the security-center. The changes include adding a new system test file, updating the workflows file to include the new tests, and modifying several snippet files to remove unnecessary comments and improve clarity. Overall, the changes are well-structured and address a key area of testing. However, there are a few minor suggestions for improvement.
One suggestion is to consider adding a test to verify the data exported to BigQuery. Currently, the tests check for successful creation, listing, getting, updating, and deletion of the BigQuery export, but they don't validate the content of the exported data. This could be a valuable addition to ensure data integrity.
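A content check along those lines might look like the following sketch; the dataset and table identifiers and the use of @google-cloud/bigquery here are assumptions for illustration, not part of this PR:

```javascript
// Hypothetical sketch of a data-integrity check (illustration only).
// The dataset/table identifiers are assumed; the real export schema may differ.
const {BigQuery} = require('@google-cloud/bigquery');
const assert = require('assert');

async function verifyExportedFindings(datasetId, tableId) {
  const bigquery = new BigQuery();
  const [rows] = await bigquery.query({
    query: `SELECT COUNT(*) AS findingCount FROM \`${datasetId}.${tableId}\``,
  });
  const count = Number(rows[0].findingCount);
  // Exports are written asynchronously, so assert only that the table is queryable.
  assert.ok(Number.isInteger(count), `Expected a numeric row count, got ${count}`);
  return count;
}
```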
Another suggestion is to revisit the error handling in the createDataset function. While the function currently handles the case where a dataset with the same name already exists, it might be beneficial to implement a retry mechanism with exponential backoff to avoid potential conflicts and ensure the tests run reliably.
Finally, it's recommended to add a brief description of the v2 BigQuery export functionality to the README file. This will help users understand the purpose of the new tests and how to use the functionality.
| "workflows", | ||
| "workflows/invoke-private-endpoint" | ||
| "workflows/invoke-private-endpoint", | ||
| "security-center/snippets/v2" |
```
if (error.code === 409) {
  // Dataset already exists - Fail the test instead of moving on
  console.log(
    `Dataset ${datasetId} already exists. Exiting to avoid conflict.`
  );
  throw new Error(`Dataset ${datasetId} already exists.`);
}
throw error;
}
```
The error handling here could be improved. Instead of failing the test if the dataset already exists, consider implementing a retry mechanism with exponential backoff. This will make the tests more robust and less likely to fail due to transient errors.
Suggested change:
```
if (error.code === 409) {
  // Dataset already exists - retry with a new random suffix after a delay
  console.log(`Dataset ${datasetId} already exists. Retrying...`);
  await new Promise(resolve => setTimeout(resolve, 2000)); // Wait 2 seconds
  return createDataset();
}
throw error;
```
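For comparison, an exponential-backoff variant of this retry might look like the sketch below; the attempt count, base delay, and BigQuery client setup are assumptions, not code from this PR:

```javascript
// Sketch of an exponential-backoff retry (illustration only).
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function createDatasetWithBackoff(maxAttempts = 5, baseDelayMs = 1000) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    // A new timestamp-based suffix is generated on every attempt.
    const datasetId = `securitycenter_dataset_${Math.floor(Date.now() / 1000)}`;
    try {
      const [createdDataset] = await bigquery.createDataset(datasetId, {location: 'US'});
      console.log(`Dataset ${createdDataset.id} created.`);
      return createdDataset.id;
    } catch (error) {
      if (error.code !== 409) throw error;
      // Name collision: back off exponentially, then retry with a fresh suffix.
      const delayMs = baseDelayMs * 2 ** attempt;
      console.log(`Dataset ${datasetId} already exists. Retrying in ${delayMs} ms...`);
      await new Promise(resolve => setTimeout(resolve, delayMs));
    }
  }
  throw new Error('Failed to create a unique dataset after multiple attempts.');
}
```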
```
const client = new SecurityCenterClient();

// Clean up any existing datasets or BigQuery exports
await cleanupDatasets();
```
```
} catch (error) {
  console.error('Error creating BigQuery export:', error);
}
});
```
It's good practice to clean up created resources after the tests are completed. Consider adding an after hook to delete the BigQuery export and dataset.
Suggested change:
```
});

after(async () => {
  const client = new SecurityCenterClient();
  try {
    await client.deleteBigQueryExport({name: data.bigQueryExportName});
    await bigquery.dataset(dataset.split('/').pop()).delete({force: true});
  } catch (error) {
    console.error('Error cleaning up resources:', error);
  }
});
```
```
// format is "organizations/[organization_id]/locations/[location_id]",
// "folders/[folder_id]/locations/[location_id]", or
// "projects/[project_id]/locations/[location_id]".
```
While the comments are helpful, they could be more concise. Consider removing the redundant information about the format of the parent resource, as it's already implied by the function call.
Suggested change:
```
// The parent resource of the new BigQuery export.
```
```
// Required. Unique identifier provided by the client within the parent scope.
// It must consist of only lowercase letters, numbers, and hyphens, must start
// with a letter, must end with either a letter or a number, and must be 63
// characters or less.
```
The comment explaining the requirements for the BigQuery export ID is quite verbose. Consider shortening it to be more concise.
Suggested change:
```
// Unique identifier for the BigQuery export.
```
```
/*
 * TODO(developer): Update the following references for your own environment before running the sample.
 */
// TODO(developer): Update the following references for your own environment before running the sample.
```
The TODO comment here seems to be leftover from a template. Since the organizationId and exportId are now being passed as arguments to the function, this TODO is no longer relevant and can be removed.
Suggested change:
```
// Build the full resource path for the BigQuery export to delete.
```
```
/*
 * TODO(developer): Update the following references for your own environment before running the sample.
 */
// TODO(developer): Update the following references for your own environment before running the sample.
```
```
// TODO(developer): Update the following references for your own environment before running the sample.
// const organizationId = 'YOUR_ORGANIZATION_ID';
// const location = 'LOCATION_ID';
```
This TODO comment is also redundant and can be removed.
Suggested change:
```
// The parent resource for listing BigQuery exports.
```
```
 * TODO(developer): Update the following references for your own environment before running the sample.
 */
// Required. Name of the BigQuery export to retrieve.
// TODO(developer): Update the following references for your own environment before running the sample.
```
This pull request adds system tests for the v2 BigQuery export functionality in the security-center. The changes include a new test file, additions to the workflows file, and modifications to several snippet files to remove outdated comments and improve clarity. The implementation looks good overall, but I have a few suggestions for improvement.
It's good practice to add corresponding unit tests alongside system tests to ensure comprehensive coverage. Consider adding unit tests for the BigQuery export functionality to complement the system tests introduced in this PR.
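As one possible shape for such a unit test, the sketch below stubs the client so no real API calls are made; the mocha/sinon setup and all names are illustrative assumptions, not part of this PR:

```javascript
// Hypothetical unit-test sketch (illustration only), assuming mocha and sinon.
const sinon = require('sinon');
const assert = require('assert');
const {SecurityCenterClient} = require('@google-cloud/security-center').v2;

describe('createBigQueryExport (unit)', () => {
  afterEach(() => sinon.restore());

  it('returns the export name from the stubbed client', async () => {
    const exportName =
      'organizations/123/locations/global/bigQueryExports/test-export';
    const stub = sinon
      .stub(SecurityCenterClient.prototype, 'createBigQueryExport')
      .resolves([{name: exportName}]);

    const client = new SecurityCenterClient();
    const [response] = await client.createBigQueryExport({
      parent: 'organizations/123/locations/global',
      bigQueryExportId: 'test-export',
      bigQueryExport: {dataset: 'projects/p/datasets/d'},
    });

    assert.ok(stub.calledOnce);
    assert.strictEqual(response.name, exportName);
  });
});
```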
The README file should be updated to reflect the new v2 BigQuery export functionality and how to use it. This will help users understand the feature and its usage.
Consider adding a section to the README about cleaning up resources after running the system tests to avoid accumulating unnecessary costs.
| "workflows", | ||
| "workflows/invoke-private-endpoint" | ||
| "workflows/invoke-private-endpoint", | ||
| "security-center/snippets/v2" |
```
async function cleanupDatasets() {
  const [datasets] = await bigquery.getDatasets();
  for (const dataset of datasets) {
    if (dataset.id.startsWith('securitycenter_')) {
      console.log(`Deleting dataset: ${dataset.id}`);
      await bigquery.dataset(dataset.id).delete({force: true});
    }
  }
```
The cleanupDatasets function could be more efficient by using a filter when listing datasets. This would avoid iterating through all datasets and improve performance, especially if there are many datasets in the project.
Suggested change:
```
async function cleanupDatasets() {
  const options = {
    filter: 'id:securitycenter_*',
  };
  const [datasets] = await bigquery.getDatasets(options);
  for (const dataset of datasets) {
    console.log(`Deleting dataset: ${dataset.id}`);
    await bigquery.dataset(dataset.id).delete({force: true});
  }
}
```
```
async function createDataset() {
  const randomSuffix = Math.floor(Date.now() / 1000);
  const datasetId = `securitycenter_dataset_${randomSuffix}`;
  const options = {
    location: 'US',
  };

  try {
    const [createdDataset] = await bigquery.createDataset(datasetId, options);
    console.log(`Dataset ${createdDataset.id} created.`);
    return createdDataset.id;
  } catch (error) {
    if (error.code === 409) {
      // Dataset already exists - Fail the test instead of moving on
      console.log(
        `Dataset ${datasetId} already exists. Exiting to avoid conflict.`
      );
      throw new Error(`Dataset ${datasetId} already exists.`);
    }
    throw error;
  }
}
```
The error handling in createDataset could be improved. Instead of throwing an error if the dataset already exists, consider returning the existing dataset ID. This would avoid unnecessary test failures and allow the tests to continue running.
Suggested change:
```
async function createDataset() {
  const randomSuffix = Math.floor(Date.now() / 1000);
  const datasetId = `securitycenter_dataset_${randomSuffix}`;
  const options = {
    location: 'US',
  };
  try {
    const [createdDataset] = await bigquery.createDataset(datasetId, options);
    console.log(`Dataset ${createdDataset.id} created.`);
    return createdDataset.id;
  } catch (error) {
    if (error.code === 409) {
      // Dataset already exists - return the existing dataset ID
      console.log(`Dataset ${datasetId} already exists. Using existing dataset.`);
      const [datasets] = await bigquery.getDatasets({filter: `id:${datasetId}`});
      return datasets[0].id;
    }
    throw error;
  }
}
```
```
const bigQueryExportId =
  'bigqueryexportid-' + Math.floor(Math.random() * 10000);
```
The bigQueryExportId could be generated using a more robust method, such as a UUID library, to ensure uniqueness and avoid potential collisions.
Suggested change:
```
const {v4: uuidv4} = require('uuid');
const bigQueryExportId = uuidv4();
```
```
// format is "organizations/[organization_id]/locations/[location_id]",
// "folders/[folder_id]/locations/[location_id]", or
// "projects/[project_id]/locations/[location_id]".
```
The comment here could be more concise and descriptive. Suggested change:
```
// The parent resource for the new BigQuery export.
// Its format can be:
// - organizations/{organization}/locations/{location}
// - folders/{folder}/locations/{location}
// - projects/{project}/locations/{location}
```
```
// Required. Unique identifier provided by the client within the parent scope.
// It must consist of only lowercase letters, numbers, and hyphens, must start
// with a letter, must end with either a letter or a number, and must be 63
// characters or less.
```
Instead of generating a random ID, consider using a UUID library to ensure uniqueness and avoid potential collisions.
Suggested change:
```
const {v4: uuidv4} = require('uuid');
const bigQueryExportId = uuidv4();
```
```
/*
 * TODO(developer): Update the following references for your own environment before running the sample.
 */
// TODO(developer): Update the following references for your own environment before running the sample.
```
The TODO comment should be removed after the references have been updated. It's best to avoid leaving TODOs in production code.
Suggested change:
```
// Build the full resource path for the BigQuery export to delete.
```
```
/*
 * TODO(developer): Update the following references for your own environment before running the sample.
 */
// TODO(developer): Update the following references for your own environment before running the sample.
```
The TODO comment should be removed after updating the references. It's good practice to avoid leaving TODOs in production code.
Suggested change:
```
// Build the full resource path for the BigQuery export to retrieve.
```
```
// TODO(developer): Update the following references for your own environment before running the sample.
// const organizationId = 'YOUR_ORGANIZATION_ID';
```
This TODO comment should be removed after the references have been updated. It's best to avoid leaving TODOs in production code.
Suggested change:
```
// const organizationId = 'YOUR_ORGANIZATION_ID';
// const location = 'LOCATION_ID';
```
```
 * TODO(developer): Update the following references for your own environment before running the sample.
 */
// Required. Name of the BigQuery export to retrieve.
// TODO(developer): Update the following references for your own environment before running the sample.
```
The changes are already implemented in PR #3847.
Closing (obsoleted by #3847).
This is PR #3823, but using a local branch (required to run tests) instead of a fork.