
Deployment objects using optional envs with the same name are unexpectedly removed during updates - but exist out of the box #93266

Closed
@erharb

Description


What happened:
After working around an issue (#93265) that required us to rename a set of envs used to obtain values from either an internal or an external set of database secrets, we ran into yet another problem when updating them: the original env names get deleted even though they are present in the manifest being applied for the update.

To set the stage: this is allowed out of the box. spring_datasource_username and spring_datasource_password are each declared twice as optional, once for each possible source (internal-db-secret or external-db-secret), as in this example snippet:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: for-science
spec:
  ...
  template:
    ...
    spec:
      containers:
      - ...
        env:
        - name: spring_datasource_username
          valueFrom:
            secretKeyRef:
              key: username
              name: internal-db-secret
              optional: true
        - name: spring_datasource_password
          valueFrom:
            secretKeyRef:
              key: password
              name: internal-db-secret
              optional: true
        - name: spring_datasource_username
          valueFrom:
            secretKeyRef:
              key: username
              name: external-db-secret
              optional: true
        - name: spring_datasource_password
          valueFrom:
            secretKeyRef:
              key: password
              name: external-db-secret
              optional: true

When described, all four optional properties are present as expected after an out-of-the-box kubectl apply:

kubectl describe deployment for-science
...
    Environment:
      application_schema:          science
      spring_datasource_username:  <set to the key 'username' in secret 'internal-db-secret'>  Optional: true
      spring_datasource_password:  <set to the key 'password' in secret 'internal-db-secret'>  Optional: true
      spring_datasource_username:  <set to the key 'username' in secret 'external-db-secret'>  Optional: true
      spring_datasource_password:  <set to the key 'password' in secret 'external-db-secret'>  Optional: true
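Note that describe only echoes what is declared on the Pod template. To see which of the duplicate declarations actually wins inside a running container, something along these lines can be used (the label selector and pod name are placeholders, not taken from the attached manifests):

# Find a pod belonging to the Deployment (label selector is hypothetical):
kubectl get pods -l app=for-science

# Print the effective process environment; duplicates collapse to a single
# value here, so this shows which declaration actually took effect:
kubectl exec <for-science-pod-name> -- printenv | grep -i spring_datasource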

From what we learned in #93265, we knew we needed to give at least one set unique names, so we first appended "_external" to the external set (spring_datasource_username_external and spring_datasource_password_external) and let the entrypoint sort out what to do with them.

        ...
        env:
        - name: spring_datasource_username
          valueFrom:
            secretKeyRef:
              key: username
              name: internal-db-secret
              optional: true
        - name: spring_datasource_password
          valueFrom:
            secretKeyRef:
              key: password
              name: internal-db-secret
              optional: true
        - name: spring_datasource_username_external
          valueFrom:
            secretKeyRef:
              key: username
              name: external-db-secret
              optional: true
        - name: spring_datasource_password_external
          valueFrom:
            secretKeyRef:
              key: password
              name: external-db-secret
              optional: true

Yet again this worked fine out of the box, but applying it as an update over the previous version caused the bizarre removal of the original, unrenamed set of envs:

kubectl describe deployment for-science
...
    Environment:
      application_schema:                   science
      spring_datasource_username_external:  <set to the key 'username' in secret 'external-db-secret'>  Optional: true
      spring_datasource_password_external:  <set to the key 'password' in secret 'external-db-secret'>  Optional: true
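For anyone triaging this, the state that kubectl apply diffs against can be inspected before re-applying; we suspect the duplicate names confuse the client-side three-way merge (both commands below are standard kubectl, referencing our attached manifest):

# Show the last-applied-configuration annotation recorded by the previous apply:
kubectl apply view-last-applied deployment/for-science

# Preview the change an apply would make, without touching the cluster:
kubectl diff -f for-science-update-external-only-broken.yaml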

Applying the same updated manifest a second time put them back. Since we didn't want to have to tell our users to apply the same manifest twice just to update properly, we found we needed to rename both sets of properties to completely resolve the update issue, appending _internal and _external suffixes (spring_datasource_username_internal, spring_datasource_password_internal, spring_datasource_username_external, spring_datasource_password_external):

        ...
        env:
        - name: spring_datasource_username_internal
          valueFrom:
            secretKeyRef:
              key: username
              name: internal-db-secret
              optional: true
        - name: spring_datasource_password_internal
          valueFrom:
            secretKeyRef:
              key: password
              name: internal-db-secret
              optional: true
        - name: spring_datasource_username_external
          valueFrom:
            secretKeyRef:
              key: username
              name: external-db-secret
              optional: true
        - name: spring_datasource_password_external
          valueFrom:
            secretKeyRef:
              key: password
              name: external-db-secret
              optional: true
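Our working theory (an assumption on our part, not something we have confirmed in the code): the container env list is merged by its name field during a strategic merge, and deletions are keyed by that same field, so a deletion directive aimed at one renamed entry can match every entry sharing that name. For illustration only, this is what a hand-written strategic-merge-patch deletion keyed by env name looks like (the container name for-science is a placeholder; this is not the actual patch kubectl computed):

# Hand-written illustration: delete an env entry by its merge key ("name"):
kubectl patch deployment for-science --type=strategic -p '
spec:
  template:
    spec:
      containers:
      - name: for-science
        env:
        - name: spring_datasource_username
          $patch: delete'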

What you expected to happen:
The update operation should not have deleted the set of optional envs that was still present in the manifest; alternatively, #93265 may point to the root cause, i.e. duplicate sets of optional envs should never have been allowed out of the box in the first place.

How to reproduce it (as minimally and precisely as possible):
for-science-update-optional-deleted.zip
Extract the yaml files from the attached zip, then run the following commands (in a clean new namespace if desired).

Reproduce the problem by running these commands, noting the contents of the described Pod template's container environment vars:

kubectl apply -f for-science-ootb.yaml
kubectl apply -f for-science-update-external-only-broken.yaml
kubectl describe deployment for-science
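As a possible cross-check while reproducing (untested by us, just a suggestion): kubectl replace submits the complete object instead of a client-side computed patch, so if the renamed manifest updates cleanly via replace but not via apply, that further implicates the client-side merge:

# Hypothetical cross-check; replace bypasses apply's three-way merge:
kubectl replace -f for-science-update-external-only-broken.yaml
kubectl describe deployment for-science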

Anything else we need to know?:
To work around the problem, we renamed both sets of optional envs away from the original names by appending extra characters: spring_datasource_username_internal / spring_datasource_password_internal for the internal set and spring_datasource_username_external / spring_datasource_password_external for the external set:

kubectl apply -f for-science-ootb.yaml
kubectl apply -f for-science-update-internal-external-ok.yaml
kubectl describe deployment for-science
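To confirm quickly that all four renamed envs survived the update, the declared names can be listed straight off the Deployment (a one-liner sketch; assumes the pod template has a single container):

# List the env var names declared on the first container:
kubectl get deployment for-science -o jsonpath='{.spec.template.spec.containers[0].env[*].name}'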

Environment:

  • Kubernetes version (use kubectl version): 1.16 (client and server)
  • Cloud provider or hardware configuration: k8s cluster on 5 openstack VMs (1 main, 4 workers)
  • OS (e.g: cat /etc/os-release): Ubuntu 18.04.4 LTS
  • Kernel (e.g. uname -a): (WSL) Linux 4.4.0-43-Microsoft #1-Microsoft Wed Dec 31 14:42:53 PST 2014 x86_64 x86_64 x86_64 GNU/Linux
  • Install tools:
  • Network plugin and version (if this is a network-related bug):
  • Others:

Labels: kind/bug, needs-sig
