
Default AWS accounts unusable for KKP? #1462

@xrstf

Description


We created a fresh AWS account to test KKP. After installing KKP 2.23 into it, we started creating the first user cluster, also on AWS, in the same account.

However, the nodes never joined the user cluster. It turned out this was because we had not enabled "Assign Public IP" when creating the MachineDeployment via the KKP dashboard. Once we enabled it, new machines joined the cluster successfully.
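For reference, the dashboard toggle corresponds (as far as I can tell) to the `assignPublicIP` field in the machine-controller's AWS `cloudProviderSpec`. A minimal, abbreviated sketch of what a working MachineDeployment would carry — all names, the region, and the instance type are placeholders, not values from our setup:

```yaml
apiVersion: cluster.k8s.io/v1alpha1
kind: MachineDeployment
metadata:
  name: example-aws-workers      # hypothetical name
  namespace: kube-system
spec:
  replicas: 2
  template:
    spec:
      providerSpec:
        value:
          cloudProvider: aws
          cloudProviderSpec:
            region: eu-central-1       # example region
            instanceType: t3.medium    # example instance type
            assignPublicIP: true       # the setting that was off by default for us
```

With `assignPublicIP: false` (or unset), instances in a default VPC setup had no route to reach the user cluster's control plane, so they never registered as nodes.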

Our VPC was set up according to the current documentation (i.e. "Enable DNS Hostnames" was turned on).

It's unfortunate that KKP's default settings result in a broken cluster.

Labels: lifecycle/frozen (indicates that an issue or PR should not be auto-closed due to staleness)
