
ci: update of files from global .github repo #3695

Open: wants to merge 34 commits into base branch master

Conversation

@asyncapi-bot (Contributor) commented Feb 17, 2025

Summary by CodeRabbit

  • New Features

    • Introduced a new command /transfer-issue {repo-name} or /ti {repo-name} for transferring issues between repositories.
    • Added a new workflow to automate the transfer of issues between repositories based on specific conditions.
  • Chores

    • Improved the automated merging workflow by refining how merge commit messages are generated, ensuring the process accurately interprets structured data for enhanced logging precision.

@coderabbitai bot (Contributor) commented Feb 17, 2025

Walkthrough

This change updates multiple GitHub Actions workflows. It modifies the MERGE_COMMIT_MESSAGE environment variable in the automerge-for-humans job to use the fromJSON function for parsing the output of the create-commit-message step. Additionally, a new command for transferring issues between repositories is introduced in the help command workflow, and a new workflow file is added to automate the issue transfer process.

Changes

File Path Change Summary
.github/workflows/automerge-for-humans-merging.yml Modified MERGE_COMMIT_MESSAGE to use fromJSON(steps.create-commit-message.outputs.result) instead of a direct string output, thereby parsing the output as JSON.
.github/workflows/help-command.yml Added new command /transfer-issue {repo-name} or /ti {repo-name} for transferring issues, with example usage included.
.github/workflows/transfer-issue.yml Introduced a new workflow to automate transferring issues between repositories, triggered by issue comments, with checks for repository validity and execution of the transfer command.
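Based on the change summary above, the automerge modification presumably amounts to a one-line workflow fragment like the following (the step id and output name are taken from the summary; the surrounding context is assumed):

```yaml
# Sketch of the modified env entry in automerge-for-humans-merging.yml
env:
  # Previously a direct string output; fromJSON parses the step output as JSON
  MERGE_COMMIT_MESSAGE: ${{ fromJSON(steps.create-commit-message.outputs.result) }}
```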

Possibly related PRs

  • docs(generator): update latest generator documentation #3863: both PRs address a syntax error in the createAsyncapiFile function within the generator documentation, ensuring the function is implemented correctly.
  • chore: update slack invite link #3443: both PRs update the Slack invite link in the _redirects file, modifying the same redirection rule for the /slack-invite path.
  • fix: pr testing workflow for bot PRs #3841: both PRs add conditional checks to the "Install dependencies" workflow step based on the existence of the package.json file.

Suggested reviewers

  • akshatnema

Poem

I’m a rabbit bouncing through pipelines bright,
Hopping over code with a joyful insight.
From strings to JSON, I’ve made my quick leap,
Parsing these changes from the data so deep.
With whiskers a-twitch and ears standing tall,
I celebrate this merge—hoot, hop, and all!
🐰✨

✨ Finishing Touches
  • 📝 Generate Docstrings

🪧 Tips

Chat

There are 3 ways to chat with CodeRabbit:

  • Review comments: Directly reply to a review comment made by CodeRabbit. Example:
    • I pushed a fix in commit <commit_id>, please review it.
    • Generate unit testing code for this file.
    • Open a follow-up GitHub issue for this discussion.
  • Files and specific lines of code (under the "Files changed" tab): Tag @coderabbitai in a new review comment at the desired location with your query. Examples:
    • @coderabbitai generate unit testing code for this file.
    • @coderabbitai modularize this function.
  • PR comments: Tag @coderabbitai in a new PR comment to ask questions about the PR branch. For the best results, please provide a very specific query, as very limited context is provided in this mode. Examples:
    • @coderabbitai gather interesting stats about this repository and render them as a table. Additionally, render a pie chart showing the language distribution in the codebase.
    • @coderabbitai read src/utils.ts and generate unit testing code.
    • @coderabbitai read the files in the src/scheduler package and generate a class diagram using mermaid and a README in the markdown format.
    • @coderabbitai help me debug CodeRabbit configuration file.

Note: Be mindful of the bot's finite context window. It's strongly recommended to break down tasks such as reading entire modules into smaller chunks. For a focused discussion, use review comments to chat about specific files and their changes, instead of using the PR comments.

CodeRabbit Commands (Invoked using PR comments)

  • @coderabbitai pause to pause the reviews on a PR.
  • @coderabbitai resume to resume the paused reviews.
  • @coderabbitai review to trigger an incremental review. This is useful when automatic reviews are disabled for the repository.
  • @coderabbitai full review to do a full review from scratch and review all the files again.
  • @coderabbitai summary to regenerate the summary of the PR.
  • @coderabbitai generate docstrings to generate docstrings for this PR.
  • @coderabbitai resolve to resolve all the CodeRabbit review comments.
  • @coderabbitai configuration to show the current CodeRabbit configuration for the repository.
  • @coderabbitai help to get help.

Other keywords and placeholders

  • Add @coderabbitai ignore anywhere in the PR description to prevent this PR from being reviewed.
  • Add @coderabbitai summary to generate the high-level summary at a specific location in the PR description.
  • Add @coderabbitai anywhere in the PR title to generate the title automatically.

Documentation and Community

  • Visit our Documentation for detailed information on how to use CodeRabbit.
  • Join our Discord Community to get help, request features, and share feedback.
  • Follow us on X/Twitter for updates and announcements.

netlify bot commented Feb 17, 2025

Deploy Preview for asyncapi-website ready!

Name Link
🔨 Latest commit d6344d3
🔍 Latest deploy log https://app.netlify.com/sites/asyncapi-website/deploys/67d0d883adb34300085e4cd8
😎 Deploy Preview https://deploy-preview-3695--asyncapi-website.netlify.app

@coderabbitai bot (Contributor) left a comment

Actionable comments posted: 0

🧹 Nitpick comments (1)
.github/workflows/automerge-for-humans-merging.yml (1)

92-92: Ensure Valid JSON Processing for Commit Message

The modification correctly switches to using fromJSON with the steps.create-commit-message.outputs.result output. Please ensure that this output is always a valid JSON string so that the commit message is parsed correctly. If the string isn’t properly JSON-formatted, the workflow could fail or produce unintended results.

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 7376ef8 and 32b9a8d.

📒 Files selected for processing (1)
  • .github/workflows/automerge-for-humans-merging.yml (1 hunks)
⏰ Context from checks skipped due to timeout of 180000ms (4)
  • GitHub Check: Redirect rules - asyncapi-website
  • GitHub Check: Header rules - asyncapi-website
  • GitHub Check: Pages changed - asyncapi-website
  • GitHub Check: Lighthouse CI

@coderabbitai bot (Contributor) left a comment

Actionable comments posted: 1

🧹 Nitpick comments (3)
.github/workflows/transfer-issue.yml (3)

13-13: Simplify Complex Job Condition
The conditional expression for the job is rather complex:

if: ${{ (!github.event.issue.pull_request && github.event.issue.state != 'closed' && github.actor != 'asyncapi-bot') && (startsWith(github.event.comment.body, '/transfer-issue') || startsWith(github.event.comment.body, '/ti')) }}

Consider refactoring or breaking it into multiple logical parts for better readability and maintainability.
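One way to make such a condition easier to scan, without changing its logic, is YAML folded style with one clause per line (a sketch of a possible layout, not the actual file):

```yaml
if: >-
  ${{ !github.event.issue.pull_request &&
      github.event.issue.state != 'closed' &&
      github.actor != 'asyncapi-bot' &&
      (startsWith(github.event.comment.body, '/transfer-issue') ||
       startsWith(github.event.comment.body, '/ti')) }}
```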


38-47: Enhance Error Handling in Repository Check
Within the "Check Repo" step’s catch block, the code creates a comment and then calls process.exit(1) without logging detailed error information. Consider logging the caught error or adding additional context so that debugging failures in repository verification becomes easier.


57-57: Remove Trailing Spaces
Trailing spaces were detected on line 57. Removing them keeps the file compliant with YAML lint rules and improves readability.

🧰 Tools
🪛 YAMLlint (1.35.1)

[error] 57-57: trailing spaces

(trailing-spaces)

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 32b9a8d and 1ed4fa5.

📒 Files selected for processing (2)
  • .github/workflows/help-command.yml (1 hunks)
  • .github/workflows/transfer-issue.yml (1 hunks)
🧰 Additional context used
🪛 actionlint (1.7.4)
.github/workflows/transfer-issue.yml

20-20: "github.event.comment.body" is potentially untrusted. avoid using it directly in inline scripts. instead, pass it through an environment variable. see https://docs.github.com/en/actions/security-guides/security-hardening-for-github-actions for more details

(expression)

🪛 YAMLlint (1.35.1)
.github/workflows/transfer-issue.yml

[error] 57-57: trailing spaces

(trailing-spaces)

⏰ Context from checks skipped due to timeout of 180000ms (1)
  • GitHub Check: Automerge PR autoapproved by a bot
🔇 Additional comments (4)
.github/workflows/help-command.yml (2)

1-3: Header Notice on Central Management
The header correctly reminds readers that this workflow is maintained in the global repository. Please ensure any future modifications are done in the upstream repo rather than locally.


61-62: New Transfer Issue Command Description
The new command documentation for /transfer-issue {repo-name} (and its alias /ti {repo-name}) is clear and includes helpful examples. Verify that these examples exactly reflect the workflow behavior in the corresponding transfer process.

.github/workflows/transfer-issue.yml (2)

1-3: Header Notice on Central Management
The header informs users that this file is centrally managed. Ensure any changes propagate from the central repository to avoid divergence.


50-57: Verify Transfer Issue Command Execution
The "Transfer Issue" step uses the command:

gh issue transfer ${{github.event.issue.number}} asyncapi/${{steps.extract_step.outputs.repo}}

Please ensure that the GitHub CLI is pre-installed in the runner environment and that any failure of this command is properly captured and logged for troubleshooting.

🧰 Tools
🪛 YAMLlint (1.35.1)

[error] 57-57: trailing spaces

(trailing-spaces)
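A hedged sketch of what a fail-fast guard in that step could look like (the step name and expressions mirror the snippet above; the guard itself is an illustrative addition, not the actual file):

```yaml
- name: Transfer Issue
  run: |
    # Fail fast with a visible annotation if the GitHub CLI is missing on the runner
    if ! command -v gh >/dev/null 2>&1; then
      echo "::error::GitHub CLI (gh) is not available on this runner"
      exit 1
    fi
    gh issue transfer ${{ github.event.issue.number }} asyncapi/${{ steps.extract_step.outputs.repo }}
```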

Comment on lines +21 to +23
COMMENT="${{github.event.comment.body}}"
REPO=$(echo $COMMENT | awk '{print $2}')
echo repo=$REPO >> $GITHUB_OUTPUT
@coderabbitai bot (Contributor) commented Feb 27, 2025

⚠️ Potential issue

Security Improvement: Handle Untrusted Input Safely
The "Extract Input" step uses the untrusted github.event.comment.body directly in the shell script:

COMMENT="${{github.event.comment.body}}"
REPO=$(echo $COMMENT | awk '{print $2}')
echo repo=$REPO >> $GITHUB_OUTPUT

This can risk command injection. Instead, pass github.event.comment.body as an environment variable using the env: field (e.g., COMMENT: ${{ github.event.comment.body }}) and then reference $COMMENT in the script. This approach complies with GitHub’s security-hardening recommendations.
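Applied to the step above, the hardened version might look like the following (the step id is assumed from the later `steps.extract_step` reference; this is a sketch, not the actual file):

```yaml
- name: Extract Input
  id: extract_step
  env:
    # Untrusted input enters the script only via an environment variable,
    # never through inline ${{ }} template expansion
    COMMENT: ${{ github.event.comment.body }}
  run: |
    REPO=$(echo "$COMMENT" | awk '{print $2}')
    echo "repo=$REPO" >> "$GITHUB_OUTPUT"
```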

A repository Member commented:
@derberg @thulieblack Have we looked into this issue seriously?

The coderabbitai bot (Contributor) replied:
Seems like the humans are having a chat. I'll hop back into my burrow for now. If you need me again, just tag @coderabbitai in a new comment, and I'll come hopping out!

A repository Member commented:
@Souvikns hey man, can you have a look - seems to be a valid hint from coderabbit

@sambhavgupta0705 (Member) commented:

@akshatnema @anshgoyalevil can you please have a look on this

@coderabbitai bot (Contributor) left a comment

Caution

Inline review comments failed to post. This is likely due to GitHub's limits when posting large numbers of comments. If you are seeing this consistently, it is likely a permissions issue. Please check "Moderation" -> "Code review limits" under your organization settings.

Actionable comments posted: 10

🔭 Outside diff range comments (2)
dashboard.json (1)

236-237: ⚠️ Potential issue

Resolve duplicate "title" keys in the same object.

JSON only retains the last key’s value, causing confusion or potential parsing errors. Remove one of the duplicate keys to ensure the entry is correct.

{
  "id": "I_kwDOCoBobc6nflHk",
- "title": "[BUG] Missing server.summary() function",
  "title": "fix: Update code to support @octokit/request v9 and @octokit/graphql v8",
  "isAssigned": false,
  ...
}
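The last-key-wins behavior is easy to demonstrate (a standalone sketch, not code from this repository):

```javascript
// JSON.parse silently keeps only the last occurrence of a duplicate key
const entry = JSON.parse('{"title": "first title", "title": "second title"}');
console.log(entry.title); // "second title"
```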
config/tools-automated.json (1)

53-85: 💡 Verification agent

🧩 Analysis chain

Multiple Filters Blocks in Zod Sockets Tool Entry
The "Zod Sockets" tool entry defines two separate filters objects—one specifying
"language": "TypeScript", with technology [ "Node.js", "TypeScript" ] and categories [ "code-first", "dsl", "framework" ], and another specifying
"language": "Python", with technology [ "Socket.IO", "Flask" ] and categories [ "code-first", "api" ].
Having multiple filters keys in the same object is not valid in JSON and creates ambiguity. Please merge these conditions into one filters object or confirm which set is correct.


I'll start by verifying the JSON structure for the "Zod Sockets" tool entry to check whether there are duplicate filters keys. The following shell script will search for the block containing "title": "Zod Sockets" and display its context:

Let's run this and review the output before finalizing the rewritten comment.


🏁 Script executed:

#!/bin/bash
rg -C 15 '"title": "Zod Sockets"' config/tools-automated.json

Length of output: 5055


Critical Issue: Duplicate JSON Key in "Zod Sockets" Filters

Our verification shows that one of the "Zod Sockets" tool entries in config/tools-automated.json contains duplicate "categories" keys within its filters object. Since JSON does not allow duplicate keys, only the last occurrence is reliably processed—which can lead to ambiguity or unintended behavior.

  • Action Required: Merge the values from both "categories" keys into a single array or confirm which set of values should be retained.
  • Location: In at least one "Zod Sockets" entry under the filters object, as evidenced by the duplicate keys in the output.

Please address this to ensure the JSON file is valid and the intended filters are correctly applied.

🧹 Nitpick comments (45)
components/navigation/MobileNavMenu.tsx (2)

118-119: Consider using Next.js Link component for consistency.

For consistency with the other navigation items, consider replacing this <a> tag with a Next.js <Link> component if this is navigating to an internal page. If this is not a navigation link but just used for toggling the dropdown, consider using a <button> element instead for better accessibility.


141-143: Consider using Next.js Link component for consistency.

Similarly, for consistency with the updated navigation items, consider using a Next.js <Link> component here if this is navigating to an internal page. If this is only used for toggling the language dropdown, a <button> element would be more appropriate for accessibility.

types/packages/node-fetch-2.d.ts (1)

1-3: Declaration looks correct but may need additional type imports.

The TypeScript declaration for the node-fetch-2 module is syntactically correct. However, it references RequestInit and Response types without importing them. If this code is used in a browser context where these DOM types are available by default, this should work fine. If used in Node.js, you might need to either:

  1. Add reference to DOM lib in your tsconfig.json:

    {
      "compilerOptions": {
        "lib": ["es2015", "dom"]
      }
    }
  2. Or explicitly import these types:

    declare module 'node-fetch-2' {
      import { RequestInit, Response } from 'node-fetch';
      export default function fetch(url: string, options?: RequestInit): Promise<Response>;
    }
types/scripts/tools.ts (3)

48-49: Consider updating comment to reflect current implementation.

The comment "Instead of extending BaseFilters, create a separate interface" indicates a change in implementation approach, but there's no BaseFilters interface defined in the file.

-// Instead of extending BaseFilters, create a separate interface
+// FinalFilters provides a complete, non-optional version of the Filters interface

57-61: Consider exporting non-exported types that might be used elsewhere.

The types BaseAsyncAPITool, Repository, ToolItem, and ToolsList are not exported. If these types are intended to be used outside this file, consider exporting them.

-type BaseAsyncAPITool = {
+export type BaseAsyncAPITool = {
  title: string;
  description?: string;
  links?: Links;
};

// Repository and tools data types
-type Repository = {
+export type Repository = {
  full_name: string;
  html_url: string;
  owner: {
    login: string;
  };
  description: string;
};

-type ToolItem = {
+export type ToolItem = {
  name: string;
  url: string;
  path: string;
  html_url: string;
  repository: Repository;
};

// Tools list types
-type ToolsList<T> = {
+export type ToolsList<T> = {
  [key: string]: {
    description: string;
    toolsList: T[];
  };
};

Also applies to: 73-80, 82-88, 95-100


49-54: Consider more descriptive naming for "Final" prefixed interfaces.

The naming convention with "Final" prefix for FinalFilters and FinalAsyncAPITool could be more descriptive to indicate their purpose, such as ResolvedFilters or ProcessedFilters to better reflect that these are complete or processed versions of their counterparts.

-export interface FinalFilters {
+export interface ResolvedFilters {
  language: LanguageColorItem[];
  technology: LanguageColorItem[];
  categories: Array<Category>;
  hasCommercial: boolean;
}

-export interface FinalAsyncAPITool extends BaseAsyncAPITool {
+export interface ResolvedAsyncAPITool extends BaseAsyncAPITool {
  description: string; // Make required in final
-  filters: FinalFilters;
+  filters: ResolvedFilters;
}

-export type FinalToolsListObject = ToolsList<FinalAsyncAPITool>;
+export type ResolvedToolsListObject = ToolsList<ResolvedAsyncAPITool>;

Also applies to: 67-70

markdown/docs/tools/generator/usage.md (1)

11-14: Refined CLI Command Heading
The new concise command signature (### asyncapi generate fromTemplate ASYNCAPI TEMPLATE) effectively streamlines the usage information and directs users to the official AsyncAPI CLI documentation for complete details. This simplification minimizes duplication and maintenance overhead while still providing a clear introduction.

markdown/docs/tools/generator/versioning.md (1)

21-25: Command Update and Clarification Note Enhancement
The updated command line:

asyncapi generate fromTemplate asyncapi.yaml @asyncapi/[email protected] --use-new-generator

explicitly specifies the template version and includes the necessary flag for the latest generator. The newly added note further clarifies the importance of the flag when multiple generator versions exist. One minor nitpick: consider inserting a comma after "For more details" in the note for improved readability (i.e. "For more details, you can also check...").

🧰 Tools
🪛 LanguageTool

[uncategorized] ~24-~24: Possible missing comma found.
Context: ...he --use-new-generator flag. For more details you can also check [asyncapi generate f...

(AI_HYDRA_LEO_MISSING_COMMA)

scripts/tools/tools-schema.json (3)

13-16: Addition of the Custom Description Property

The new "description" property provides a valuable override to the repository’s default description. Consider a slight rewording for clarity and grammar. For example, changing

- "description": "By default scripts read description of repository there project is stored. You can override this behaviour by providing custom description."
+ "description": "By default, scripts read the repository’s description where the project is stored. You can override this behavior by providing a custom description."

improves readability.


43-110: Complexity in the Language Property Schema

The "language" property now employs a nested anyOf construct to support both a fixed enumeration and free-form strings, for both singular and array values. Although functionally correct, the schema could be simplified by using oneOf instead of nesting anyOf rules. For example:

-          "anyOf": [
-            {
-              "type": "string",
-              "anyOf": [
-                {
-                  "type": "string",
-                  "enum": [ "Go", "Java", "JavaScript", "HTML", "C/C++", "C#", "Python", "TypeScript", "Kotlin", "Scala", "Markdown", "YAML", "R", "Ruby", "Rust", "Shell", "Groovy" ]
-                },
-                {
-                  "type": "string"
-                }
-              ]
-            },
-            {
-              "type": "array",
-              "items": {
-                "type": "string",
-                "anyOf": [
-                  {
-                    "type": "string",
-                    "enum": [ "Go", "Java", "JavaScript", "HTML", "C/C++", "C#", "Python", "TypeScript", "Kotlin", "Scala", "Markdown", "YAML", "R", "Ruby", "Rust", "Shell", "Groovy" ]
-                  },
-                  {
-                    "type": "string"
-                  }
-                ]
-              }
-            }
-          ]
+          "anyOf": [
+            {
+              "oneOf": [
+                {
+                  "type": "string",
+                  "enum": ["Go", "Java", "JavaScript", "HTML", "C/C++", "C#", "Python", "TypeScript", "Kotlin", "Scala", "Markdown", "YAML", "R", "Ruby", "Rust", "Shell", "Groovy"]
+                },
+                { "type": "string" }
+              ]
+            },
+            {
+              "type": "array",
+              "items": {
+                "oneOf": [
+                  {
+                    "type": "string",
+                    "enum": ["Go", "Java", "JavaScript", "HTML", "C/C++", "C#", "Python", "TypeScript", "Kotlin", "Scala", "Markdown", "YAML", "R", "Ruby", "Rust", "Shell", "Groovy"]
+                  },
+                  { "type": "string" }
+                ]
+              }
+            }
+          ]

This refactor can improve readability and maintainability.


145-200: Refinement Needed for the Categories Description

The "categories" property has a descriptive text that could benefit from improved grammar and clarity. For example, you might refine it as follows:

- "description": "Categories are used to group tools by different use case, like documentation or code generation. If have a list of fixed categories. If you use different one that your tool lands under \"other\" category. Feel free to add your category through a pull request to AsyncAPI website repository."
+ "description": "Categories are used to group tools by their use case, such as documentation or code generation. If your tool does not fit any of the fixed categories, use the 'other' category. You can also propose a new category via a pull request to the AsyncAPI website repository."

This makes the intent clearer for the users.

markdown/blog/2025-february-summary.md (3)

1-16: YAML Front Matter Review

The YAML header is well-structured with clear metadata including title, date, type, tags, cover image, authors, excerpt, and featured status. Please double-check that the author’s photo URL (currently /img/avatars/thulieblack.webp) correctly corresponds to "Thulisile Sibanda" to avoid any potential naming discrepancies.


49-67: TSC Members Profiles Component

The TSC Members section uses a <Profiles> component with a property named profiles to render member details. Note that static analysis flagged a possible repetition of the word "profiles." If this naming is intentional to align with component conventions, it can be left as is; otherwise, consider adjusting the nomenclature for clarity.

🧰 Tools
🪛 LanguageTool

[duplication] ~51-~51: Possible typo: you repeated a word.
Context: ...api.com/community/tsc) (TSC) members. <Profiles profiles={[ { name: 'Aayush Saini', avat...

(ENGLISH_WORD_REPEAT_RULE)


69-70: Newsletter Content Tone

In the "Coming in March" section, the phrase “we have some amazing content you don't want to miss” could be refined for a more formal tone. Consider replacing "amazing" with alternatives such as "engaging" or "exciting" to better match the professional style of the update.

🧰 Tools
🪛 LanguageTool

[style] ~70-~70: Consider using a more formal and expressive alternative to ‘amazing’.
Context: ....asyncapi.com/newsletter); we have some amazing content you don't want to miss.

(AWESOME)

scripts/build-meetings.ts (1)

62-62: Use optional chaining for clarity.

Replace the logical AND with optional chaining to simplify this expression and avoid potential null reference errors.

-        banner: e.extendedProperties?.private && e.extendedProperties.private.BANNER,
+        banner: e.extendedProperties?.private?.BANNER,
🧰 Tools
🪛 Biome (1.9.4)

[error] 62-62: Change to an optional chain.

Unsafe fix: Change to an optional chain.

(lint/complexity/useOptionalChain)

tests/tools/extract-tools-github.test.js (3)

5-7: Inconsistent path references
You’re importing logger.ts on line 3, but mocking logger without the .ts extension here. Consider aligning them for clarity and consistency.


12-14: Global environment variable usage
Defining process.env.GITHUB_TOKEN globally in beforeAll can leak state between tests. Consider mocking or isolating environment variables per test if needed.


89-95: Avoid using delete on environment variables
Using delete process.env.GITHUB_TOKEN can degrade performance and trigger linter errors. Consider setting it to undefined:

- delete process.env.GITHUB_TOKEN;
+ process.env.GITHUB_TOKEN = undefined;
🧰 Tools
🪛 Biome (1.9.4)

[error] 90-90: Avoid the delete operator which can impact performance.

Unsafe fix: Use an undefined assignment instead.

(lint/performance/noDelete)
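One caveat worth knowing about the suggested assignment (a standalone Node.js sketch): `process.env` coerces every assigned value to a string, so assigning `undefined` stores the literal string "undefined" rather than clearing the variable.

```javascript
// process.env coerces assigned values to strings
process.env.DEMO_TOKEN = undefined;
console.log(typeof process.env.DEMO_TOKEN); // "string"
console.log(process.env.DEMO_TOKEN);        // "undefined" (the string!)

// delete actually removes the key; the noDelete performance concern targets
// ordinary objects, and process.env is already a special exotic object
delete process.env.DEMO_TOKEN;
console.log(process.env.DEMO_TOKEN);        // undefined
```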

tests/readAndWriteJson.test.js (1)

12-12: Inconsistent path usage for utilities
You import ../scripts/utils.ts above, but mock ../scripts/utils here. Consider keeping the paths consistent.

scripts/compose.ts (1)

164-164: Error handling with logger
Catching and logging all errors is good practice. Consider whether you need a non-zero exit code in CI environments if a failure occurs.

Also applies to: 166-166, 168-168, 170-170

package.json (1)

119-120: Winston integration.
The addition of "winston": "^3.17.0" looks good, but confirm that the logger configuration is set up properly where Winston is intended to be used. Otherwise, you may want to remove it if it’s not being used.

scripts/markdown/check-edit-links.ts (1)

49-49: Preserve original error context.
When rejecting with a new Error, consider appending the original error object to maintain the full stack trace. For example:

- return Promise.reject(new Error(`Error checking ${editLink}: ${error}`));
+ return Promise.reject(new Error(`Error checking ${editLink}:\n${error.stack || error}`));
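An alternative that preserves the entire original error object is the `cause` option on Error (available since Node 16.9; a standalone sketch, not the actual check-edit-links code):

```javascript
// Wrap an error while keeping the original reachable via .cause
function wrapCheckError(editLink, error) {
  return new Error(`Error checking ${editLink}`, { cause: error });
}

const original = new Error('ECONNREFUSED');
const wrapped = wrapCheckError('https://example.com/edit/page.md', original);
console.log(wrapped.message);            // "Error checking https://example.com/edit/page.md"
console.log(wrapped.cause === original); // true
```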
scripts/markdown/check-markdown.ts (1)

16-25: Cautious URL validation.
isValidURL offers straightforward validation using the URL constructor. If you need to handle non-HTTP protocols or special patterns, consider an enhanced check or whitelisting approach.
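A minimal constructor-based check of the kind described might look like this (the function body and the optional protocol whitelist are illustrative assumptions, not the code from check-markdown.ts):

```javascript
// Validate a string with the URL constructor; optionally restrict protocols
function isValidURL(str, allowedProtocols = ['http:', 'https:']) {
  try {
    const url = new URL(str);
    return allowedProtocols.includes(url.protocol);
  } catch {
    return false; // the constructor throws on anything that is not a parseable URL
  }
}

console.log(isValidURL('https://asyncapi.com/docs')); // true
console.log(isValidURL('ftp://example.com'));         // false (protocol not whitelisted)
console.log(isValidURL('not a url'));                 // false
```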

scripts/adopters/index.ts (1)

12-14: Add error handling to buildAdoptersList function

The function currently lacks error handling for the writeJSON operation, which could fail for various reasons (file not found, permission issues, etc.).

 export async function buildAdoptersList() {
-  writeJSON('config/adopters.yml', resolve(currentDirPath, '../../config', 'adopters.json'));
+  try {
+    await writeJSON('config/adopters.yml', resolve(currentDirPath, '../../config', 'adopters.json'));
+    console.log('Successfully built adopters list');
+  } catch (error) {
+    console.error('Error building adopters list:', error);
+    throw error; // Re-throw to allow calling code to handle the error
+  }
 }

Also note that the writeJSON call should be awaited since it's likely an asynchronous operation.

scripts/tools/extract-tools-github.ts (1)

19-61: Remove unnecessary try/catch block

The current try/catch block only rethrows the original error without any additional handling, making it redundant.

-export async function getData(): Promise<ToolsData> {
-  // eslint-disable-next-line no-useless-catch
-  try {
-    if (!process.env.GITHUB_TOKEN) {
-      throw new Error('GITHUB_TOKEN environment variable is required');
-    }
+export async function getData(): Promise<ToolsData> {
+  if (!process.env.GITHUB_TOKEN) {
+    throw new Error('GITHUB_TOKEN environment variable is required');
+  }
    const allItems = [];
    let page = 1;

    const maxPerPage = 50;
    const getReqUrl = (PerPage: number, pageNo: number) =>
      `https://api.github.com/search/code?q=filename:.asyncapi-tool&per_page=${PerPage}&page=${pageNo}`;
    const headers = {
      accept: 'application/vnd.github.text-match+json',
      authorization: `token ${process.env.GITHUB_TOKEN}`
    };
    const result = await axios.get(getReqUrl(maxPerPage, page), {
      headers
    });
    const totalResults = result.data.total_count;

    allItems.push(...result.data.items);

    while (allItems.length < totalResults) {
      page++;

      logger.info(`Fetching page: ${page}`);
      // pause for 1 second to avoid rate limiting
      await pause(1000);
      const nextPageData = await axios.get(getReqUrl(maxPerPage, page), {
        headers
      });

      const { data } = nextPageData;

      allItems.push(...data.items);
    }

    result.data.items.push(...allItems);

    return result.data;
-  } catch (err) {
-    throw err;
-  }
}
🧰 Tools
🪛 Biome (1.9.4)

[error] 60-60: The catch clause that only rethrows the original error is useless.

An unnecessary catch clause can be confusing.
Unsafe fix: Remove the try/catch clause.

(lint/complexity/noUselessCatch)

scripts/tools/tags-color.ts (1)

87-173: Consider using distinct colors for some technologies

I noticed that Jenkins (159-162), Flask (164-167), and Scala (149-152) all use the same color scheme (bg-[#D7C7F2] and border-[#A387D2]). This might make them visually indistinguishable in the UI. Consider assigning unique color schemes to each technology to enhance visual differentiation.

tests/build-meetings.test.js (1)

74-74: Consider alternative to delete operator

The delete operator can impact performance according to static analysis. Note, however, that assigning undefined to a process.env property in Node.js stores the truthy string 'undefined'; if the test relies on the variable being unset, assign an empty string or scope the delete to a setup/teardown hook instead.

-    delete process.env.CALENDAR_SERVICE_ACCOUNT;
+    process.env.CALENDAR_SERVICE_ACCOUNT = undefined;
🧰 Tools
🪛 Biome (1.9.4)

[error] 74-74: Avoid the delete operator which can impact performance.

Unsafe fix: Use an undefined assignment instead.

(lint/performance/noDelete)

scripts/utils/readAndWriteJson.ts (1)

1-11: Consider including the file path in error messages and adding JSON formatting.

Currently, errors do not explicitly mention which file path caused the error, which may make debugging more difficult. Also, consider adding indentation to the output for better readability, e.g. JSON.stringify(jsonContent, null, 2).

-    throw new Error(`Error while reading file\nError: ${err}`);
+    throw new Error(
+      `Error reading file at path: "${readPath}". Original error: ${err}`
+    );
...
-    await writeFile(writePath, JSON.stringify(jsonContent));
+    await writeFile(writePath, JSON.stringify(jsonContent, null, 2));
scripts/build-tools.ts (2)

22-35: Validate combined tools output format.

While the code is correct in writing the automated tools JSON, ensure the combined tools (written by combineTools) are similarly structured or validated, to avoid subtle JSON formatting or schema mismatches.


37-48: Skip Istanbul ignore for CLI logic if feasible.

Since the CLI block is the main entry point, consider covering it with an integration test instead of excluding it from coverage. This improves overall test coverage consistency.

scripts/build-rss.ts (3)

24-35: Consider handling more HTML entities or using a specialized library.

The manual replacements in clean() handle common entities but might miss some or produce unexpected results in complex HTML. Evaluate using a dedicated library if robust HTML sanitization is needed.
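As an illustration of the gap, here is a minimal sketch of a generic decoder handling numeric entities plus a small named-entity table; the helper name and the entity map are assumptions, and a maintained library such as `he` would cover the full named-entity set.

```javascript
// Illustrative entity map — intentionally small, not exhaustive.
const ENTITIES = {
  '&amp;': '&',
  '&lt;': '<',
  '&gt;': '>',
  '&quot;': '"',
  '&#39;': "'"
};

function decodeEntities(html) {
  // Numeric entities (&#65; / &#x41;) are decoded generically;
  // named entities fall back to the lookup table.
  return html
    .replace(/&#x([0-9a-f]+);/gi, (_, hex) => String.fromCodePoint(parseInt(hex, 16)))
    .replace(/&#(\d+);/g, (_, dec) => String.fromCodePoint(parseInt(dec, 10)))
    .replace(/&[a-z]+;/gi, (m) => ENTITIES[m] ?? m);
}

console.log(decodeEntities('Fish &amp; Chips &#169; 2024')); // → Fish & Chips © 2024
```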


89-93: Include validation logs before throwing errors.

When you detect invalid posts, consider logging them for easier debugging without relying solely on the thrown error message. For example, you can log the invalid data or relevant post fields for quick reference.
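A sketch of what that logging could look like; `validatePost` and the slug/title fields are hypothetical stand-ins for the real validation:

```javascript
// Log each offending post before failing the build, so the error message
// alone is not the only clue to which entry is broken.
function checkPosts(posts, validatePost) {
  const invalid = posts.filter((post) => !validatePost(post));
  if (invalid.length > 0) {
    // Log enough context (slug/title) to find the bad entry quickly.
    invalid.forEach((post) => {
      console.error(`Invalid post: ${post.slug ?? post.title ?? '<unknown>'}`);
    });
    throw new Error(`${invalid.length} invalid post(s) found; see logs above`);
  }
}
```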


139-145: Ensure the output directory exists.

When writing the RSS XML to ./public/&lt;outputPath&gt;, confirm that the public directory is present or create it if missing. This prevents "ENOENT" errors on systems where the folder does not yet exist.

tests/build-newsroom-videos.test.js (1)

103-109: Consider avoiding the delete operator.
Using delete process.env.YOUTUBE_TOKEN can degrade performance, and assigning undefined is the usual lint suggestion. Note, however, that assigning undefined to a process.env property in Node.js stores the truthy string 'undefined'; if the test relies on the variable being unset, keep the delete in a setup/teardown hook or assign an empty string instead.

-    delete process.env.YOUTUBE_TOKEN;
+    process.env.YOUTUBE_TOKEN = undefined;
🧰 Tools
🪛 Biome (1.9.4)

[error] 104-104: Avoid the delete operator which can impact performance.

Unsafe fix: Use an undefined assignment instead.

(lint/performance/noDelete)

tests/tools/tools-object.test.js (1)

55-73: Typographical fix suggested in test description.
“should create a tool object one parameters” is slightly unclear. Consider revising to something like “should create a tool object with only the first parameter.”

-  it('should create a tool object one parameters', async () => {
+  it('should create a tool object when only the first parameter is provided', async () => {
scripts/tools/tools-object.ts (2)

1-24: Use consistent error handling for schema compilation and Ajv initialization.

Although the AJV setup and schema compilation here appear correct, consider adding error handling or logs if the schema fails to compile, to aid in diagnosing initialization issues (e.g., malformed JSON, etc.).
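A sketch of such a guard; `ajv` here stands for any object exposing a throwing compile(schema), which matches Ajv's behavior on malformed schemas:

```javascript
// Wrap schema compilation so a malformed schema produces an actionable log
// line instead of an opaque crash at module load.
function safeCompile(ajv, schema, name) {
  try {
    return ajv.compile(schema);
  } catch (err) {
    console.error(`Failed to compile schema "${name}": ${err.message}`);
    throw err; // still fail fast: a build without validation is worse
  }
}
```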


77-146: Validate concurrency behavior in the convertTools flow.

Using Promise.all on lines 93 and 120 is efficient and generally safe. However, be mindful of potential rate limits and large data sets: if the repository or category lists grow significantly, you may need request throttling or retry logic to handle partial failures gracefully.
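For reference, a dependency-free concurrency limiter that could replace an unbounded Promise.all; the names are illustrative:

```javascript
// At most `limit` calls to `fn` are in flight at once: each worker pulls the
// next index synchronously, so no two workers process the same item.
async function mapWithLimit(items, limit, fn) {
  const results = new Array(items.length);
  let next = 0;
  async function worker() {
    while (next < items.length) {
      const index = next++;
      results[index] = await fn(items[index], index);
    }
  }
  const workers = Array.from({ length: Math.min(limit, items.length) }, worker);
  await Promise.all(workers);
  return results;
}
```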

tests/build-tools.test.js (2)

37-45: Confirm environment variable usage.

Setting process.env.GITHUB_TOKEN within the test ensures mock authentication. Confirm that no real secrets are accidentally exposed, and consider resetting or sanitizing relevant environment variables after the tests to prevent unwanted side effects.
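A sketch of the env hygiene a beforeEach/afterEach pair could call; plain functions here so it stays framework-agnostic:

```javascript
// Snapshot process.env before a test and restore it afterwards, so mock
// credentials set inside the test never leak into later tests.
function snapshotEnv() {
  return { ...process.env };
}

function restoreEnv(snapshot) {
  // Remove any keys the test added, then restore original values.
  for (const key of Object.keys(process.env)) {
    if (!(key in snapshot)) delete process.env[key];
  }
  Object.assign(process.env, snapshot);
}
```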


77-85: Properly testing error handling.

The test adequately confirms that an axios fetch failure raises an error. Validate that partial tool data is still written if only one tool fails.
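If partial output is desired, Promise.allSettled is the usual tool; a hedged sketch in which `fetchOne` and the URL list are hypothetical:

```javascript
// Keep successfully fetched tools even when one request rejects, so partial
// data can still be written; rejections are logged instead of aborting.
async function fetchTools(urls, fetchOne) {
  const settled = await Promise.allSettled(urls.map((u) => fetchOne(u)));
  const tools = settled.filter((s) => s.status === 'fulfilled').map((s) => s.value);
  const failures = settled.filter((s) => s.status === 'rejected').map((s) => s.reason);
  failures.forEach((err) => console.error(`Tool fetch failed: ${err.message}`));
  return tools;
}
```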

scripts/tools/combine-tools.ts (3)

1-24: Check for schema compilation errors.

Similar to other files, if the schema fails to compile, the process might be partially blocked. Consider logging any compilation issues at initialization to simplify debugging.


25-33: Initialize finalTools more dynamically.

Currently, finalTools is pre-populated from categoryList. If categories become more dynamic or user-defined, you may need to regenerate or merge categories at runtime instead.


142-166: Validate partial failures in processManualTool.

You correctly log validation failures without halting the entire build. Confirm that logs are monitored or captured in CI to detect the occasional invalid tool.

scripts/dashboard/build-dashboard.ts (2)

179-187: Consider parallelizing batched processing.

The loop processes discussions in batches sequentially, waiting for each batch before proceeding. If feasible, you could run the batches in parallel (with a safe concurrency limit) to speed up execution without hitting rate limits.
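One possible shape for bounded batch parallelism, sketched with assumed names; windows of batches run concurrently while successive windows stay sequential, capping in-flight requests:

```javascript
// Group items into batches of `batchSize`, then run up to `parallelBatches`
// batches concurrently per window. `handle` receives one batch and returns
// an array of results.
async function processInBatches(items, batchSize, parallelBatches, handle) {
  const batches = [];
  for (let i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize));
  }
  const results = [];
  for (let i = 0; i < batches.length; i += parallelBatches) {
    const window = batches.slice(i, i + parallelBatches);
    // Each window runs concurrently; windows run sequentially, which keeps
    // a hard upper bound on simultaneous requests.
    const settled = await Promise.all(window.map((batch) => handle(batch)));
    results.push(...settled.flat());
  }
  return results;
}
```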


226-240: Remove the unused async keyword.

This function doesn’t use await or perform asynchronous operations. Simplify by removing the async keyword to reflect its synchronous nature.

- async function mapGoodFirstIssues(issues: GoodFirstIssues[]): Promise<MappedIssue[]> {
+ function mapGoodFirstIssues(issues: GoodFirstIssues[]): MappedIssue[] {
config/all-tags.json (1)

1-20: Consistent Color Standardization & Duplicate Scala Check
The changes standardize many technology entries to use the colors bg-[#61d0f2] and border-[#40ccf7], which improves visual consistency. However, please verify whether having a "Scala" entry in the languages array (with its original colors) and another in the technologies array (with the new standardized colors) is intentional. If they represent the same entity, consider consolidating them to avoid confusion.

🛑 Comments failed to post (10)
tests/markdown/check-markdown.test.js (2)

163-166: ⚠️ Potential issue

Remove duplicate test case.

This test case for checking missing frontmatter is identical to the one at lines 159-162 and should be removed.

-  it('should throw an error if frontmatter is missing', () => {
-    const errors = validateBlogs(undefined);
-    expect(errors).toEqual(['Frontmatter is missing']);
-  });
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.



150-157: ⚠️ Potential issue

Remove duplicate test case.

This test case for isValidURL is identical to the one at lines 141-148 and should be removed to avoid redundancy.

-  it('should return true or false for URLs', () => {
-    expect(isValidURL('http://example.com')).toBe(true);
-    expect(isValidURL('https://www.example.com')).toBe(true);
-    expect(isValidURL('ftp://ftp.example.com')).toBe(true);
-    expect(isValidURL('invalid-url')).toBe(false);
-    expect(isValidURL('/path/to/file')).toBe(false);
-    expect(isValidURL('www.example.com')).toBe(false);
-  });


tests/tools/extract-tools-github.test.js (1)

42-81: ⚠️ Potential issue

Potential logic mismatch in pagination test
The test sets total_count to 100 but expects result.items to have a length of 150, even though only two pages of 50 items each are fetched. Ensure the code or expectations align so the final item count matches the total.

scripts/tools/extract-tools-github.ts (1)

56-56: ⚠️ Potential issue

Fix possible duplication of items

This line potentially adds duplicate items since line 39 already adds the first page of results to allItems.

-    result.data.items.push(...allItems);
+    // We've already accumulated all items in allItems, so we should just update result.data.items
+    result.data.items = allItems;
tests/build-tools.test.js (1)

87-97: 🛠️ Refactor suggestion

Ensure robust file I/O handling.

This test verifies that write errors are caught. If you anticipate frequent ephemeral or permission-based file issues, consider reinforcing the user-facing error messages or adding retries for better resiliency.

scripts/tools/combine-tools.ts (1)

168-205: 🛠️ Refactor suggestion

Consider re-initializing finalTools inside combineTools.

Since finalTools is declared globally, you are appending and sorting previous results into it. If combineTools might be called multiple times in a single run, re-initialize or clone finalTools inside the function to avoid data overlap.
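A sketch of the re-initialization, with the categoryList and tool shapes assumed from the surrounding review:

```javascript
// Build the result object fresh on every call instead of mutating a
// module-level accumulator, so repeated invocations cannot overlap.
function createFinalTools(categoryList) {
  const finalTools = {};
  for (const category of categoryList) {
    finalTools[category.name] = { description: category.description, toolsList: [] };
  }
  return finalTools;
}

function combineTools(categoryList, tools) {
  const finalTools = createFinalTools(categoryList); // fresh per invocation
  for (const tool of tools) {
    const bucket = finalTools[tool.category];
    if (bucket) bucket.toolsList.push(tool);
  }
  return finalTools;
}
```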

scripts/build-docs.ts (3)

107-108: 🛠️ Refactor suggestion

Guard against missing items in .find(...) call.

Using ! after the .find(...) result can cause a runtime error if no matching array element is found. Consider handling the possibility of undefined to avoid hard crashes.

For example:

- allChildren[key].item.href = childrenOfAllChildren.find((c) => c.isPrerelease === undefined)!.slug;
+ const foundChild = childrenOfAllChildren.find((c) => c.isPrerelease === undefined);
+ if (foundChild) {
+   allChildren[key].item.href = foundChild.slug;
+ } else {
+   // Handle the case where an appropriate child is not found
+ }

148-150: ⚠️ Potential issue

Fix the error message concatenation.

The Error constructor's second parameter is an options object (for the ES2022 cause property), not an additional message string, so passing the error as a second argument silently drops it from the message and stack trace. Use string interpolation, or attach the original error explicitly, to preserve its details.

- throw new Error('Error in convertDocPosts:', err as Error);
+ const originalError = err as Error;
+ const newError = new Error(`Error in convertDocPosts: ${originalError.message}`);
+ newError.stack = originalError.stack;
+ throw newError;
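Alternatively, the ES2022 cause option (Node 16.9+) preserves the original error without copying stacks by hand; the function body below is a stand-in:

```javascript
// Rethrow with context while keeping the original error reachable via
// `err.cause`, so stack traces and details survive intact.
function convertDocPosts() {
  try {
    throw new Error('underlying failure'); // stand-in for the real work
  } catch (err) {
    throw new Error(`Error in convertDocPosts: ${err.message}`, { cause: err });
  }
}
```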

204-214: ⚠️ Potential issue

Add boundary checks for index + 2.

In cases where you skip to structuredPosts[index + 2], ensure that index + 2 is within the array’s bounds to avoid out-of-range errors.

if (!structuredPosts[index + 1].isRootElement && !structuredPosts[index + 1].isSection) {
  nextPage = {
    title: structuredPosts[index + 1].title,
    href: structuredPosts[index + 1].slug
  };
} else {
+ if (index + 2 < countDocPages) {
    nextPage = {
      title: `${structuredPosts[index + 1].title} - ${structuredPosts[index + 2].title}`,
      href: structuredPosts[index + 2].slug
    };
+ }
}
dashboard.json (1)

179-187: 🛠️ Refactor suggestion

Avoid duplicating the same issue ID in the "goodFirstIssues" array.

The issue with ID "I_kwDOBW5R_c6qAXn5" appears twice, leading to ambiguity and potential data conflicts. Remove or update the duplicate entry.

Also applies to: 245-253
