
[Feature][spark] Add sparkmeasure support to the Spark engine #5200

Open
2 tasks done
sjgllgh opened this issue Nov 7, 2024 · 2 comments
@sjgllgh
Contributor

sjgllgh commented Nov 7, 2024

Search before asking

  • I have searched the issues and found no similar feature request.

Problem Description

  1. Add sparkmeasure support to the Spark engine, to make it easier to analyze and profile Spark jobs (see the usage sketch below).
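
For context, sparkmeasure (the CERN library this request refers to) exposes a `StageMetrics` API that wraps a Spark action and reports aggregated stage-level metrics. The sketch below shows typical standalone usage of that API, not a Linkis integration; the app name, SQL query, and dependency version are illustrative assumptions.

```scala
// Dependency (version is an assumption): ch.cern.sparkmeasure:spark-measure_2.12:0.24
import ch.cern.sparkmeasure.StageMetrics
import org.apache.spark.sql.SparkSession

object SparkMeasureSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("sparkmeasure-sketch").getOrCreate()

    // StageMetrics registers a Spark listener and aggregates stage-level metrics.
    val stageMetrics = StageMetrics(spark)

    // runAndMeasure executes the enclosed action, then prints a metrics report
    // (elapsed time, executor CPU time, shuffle bytes, etc.).
    stageMetrics.runAndMeasure {
      spark.sql("SELECT count(*) FROM range(1000) CROSS JOIN range(1000)").show()
    }

    spark.stop()
  }
}
```

sparkmeasure also has a "flight recorder" mode that needs no code changes, e.g. `--conf spark.extraListeners=ch.cern.sparkmeasure.FlightRecorderStageMetrics`; whether the Linkis Spark engine would integrate via the API or via such a listener is left open by this issue.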

Description

No response

Use case

No response

Solutions

No response

Anything else

No response

Are you willing to submit a PR?

  • Yes, I am willing to submit a PR!
@sjgllgh sjgllgh added the feature label Nov 7, 2024

github-actions bot commented Nov 7, 2024

😊 Welcome to the Apache Linkis community!!

We are glad that you are contributing by opening this issue.

Please make sure to include all the relevant context.
We will get back to you shortly.

If you are interested in contributing to our website project, please let us know!
You can check out our contributing guide on
👉 How to Participate in Project Contribution.

Community

WeChat Assistant QR code · WeChat Public Account QR code

Mailing Lists

Name: [email protected]
Description: community activity information
Actions: subscribe / unsubscribe / archive

@peacewong
Contributor

Good features
