
Support config yaml #848

Draft

ctrlaltdilj wants to merge 6 commits into base: release-1.9

Conversation

ctrlaltdilj

Support Flink 1.19 config.yaml configuration

Flink 1.19 introduced the new config.yaml format. Some dependencies on this new format in 1.19 prevent the old flink-conf.yaml from being able to configure certain properties.

This PR adds support for setting a config.yaml file that can be consumed by the operator as well as by deployments.

Brief change log

  • added support for setting config.yaml by replacing flink-conf.yaml in the defaultConfiguration

Verifying this change

This change added tests and can be verified as follows:

  • Build image
  • create a values.yaml similar to
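For the values.yaml step above, a minimal sketch of what such a file might look like. The `standardYaml` flag under `defaultConfiguration` is the option added by this PR; the image keys and tag value are assumptions for local verification, not part of the change:

```yaml
# Hypothetical values.yaml for verifying a locally built operator image.
# defaultConfiguration.standardYaml is the new flag from this PR; the
# image repository/tag values are placeholders for the local build.
image:
  repository: flink-kubernetes-operator
  tag: local                # the image built in the previous step

defaultConfiguration:
  create: true
  append: true
  standardYaml: true        # write config.yaml instead of flink-conf.yaml
```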

Does this pull request potentially affect one of the following parts:

  • Dependencies (does it add or upgrade a dependency): yes
  • The public API, i.e., any changes to the CustomResourceDescriptors: no
  • Core observer or reconciler logic that is regularly executed: no

Documentation

  • Does this pull request introduce a new feature? yes
  • If yes, how is the feature documented? not documented, will add

@ctrlaltdilj
Author

I do think this would be better supported via a flag in the configuration of FlinkDeployment, as that enables updating the operator independently from FlinkDeployments.

I am happy to make that change and would prefer that. Any thoughts?

@mateczagany
Contributor

Hi, first of all, thank you for the contribution, but for better observability please try to always create a JIRA before opening a PR.
Unless trying to push a fix for a release candidate, please use the main branch for your PRs. Flink version is already 1.19.1 on the main branch.

@ctrlaltdilj
Author

Thanks @mateczagany for all the info! I have requested a JIRA account and will file a ticket.

@ctrlaltdilj
Author

@mateczagany
Created following ticket: https://issues.apache.org/jira/browse/FLINK-35744

@ctrlaltdilj ctrlaltdilj marked this pull request as draft July 2, 2024 14:57
@@ -154,6 +154,9 @@ defaultConfiguration:
# If set to false, loads just the overrides as in (2).
# This option has no effect if create is equal to false.
append: true
# If set to true, supports YAML 1.2 syntax through use of a `config.yaml` file.
# If set to false, makes use of the deprecated flink-conf.yaml file.
standardYaml: false
Contributor
Would be great if this would be "automatic" based on whether the user specified flink-conf.yaml vs conf.yaml. Do you think that's possible?
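One way the "automatic" behavior suggested above could work is to prefer `config.yaml` whenever the user has mounted one, falling back to the legacy file otherwise. A minimal sketch; the class and method names are assumptions, only the two file names are real Flink conventions:

```java
import java.io.File;

// Hypothetical sketch of auto-detecting which Flink config format to use,
// based on which file the user actually provided in the conf directory.
public class ConfFileDetector {
    static final String LEGACY_CONF = "flink-conf.yaml"; // pre-1.19 format
    static final String STANDARD_CONF = "config.yaml";   // Flink 1.19+ YAML 1.2 format

    /** Returns the config file name to use, preferring the new format if present. */
    public static String detectConfFile(File confDir) {
        if (new File(confDir, STANDARD_CONF).exists()) {
            return STANDARD_CONF;
        }
        return LEGACY_CONF;
    }

    public static void main(String[] args) {
        File confDir = new File(args.length > 0 ? args[0] : ".");
        System.out.println(detectConfFile(confDir));
    }
}
```

This would remove the need for an explicit `standardYaml` flag, at the cost of making the behavior depend on the mounted file names.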

@@ -83,7 +83,7 @@ under the License.
<lombok.version>1.18.30</lombok.version>
<commons-lang3.version>3.12.0</commons-lang3.version>
<commons-io.version>2.11.0</commons-io.version>
<flink.version>1.18.1</flink.version>
<flink.version>1.19.0</flink.version>


Can we also change the version to 1.19.1 as it was already released?
