
[BUG] StarRocks pod got error after scale-in of fe component #8752

Open
tianyue86 opened this issue Jan 6, 2025 · 1 comment

tianyue86 commented Jan 6, 2025

Describe the env
Kubernetes: v1.31.1-aliyun.1
KubeBlocks: 1.0.0-beta.21
kbcli: 1.0.0-beta.8

To Reproduce
Steps to reproduce the behavior:

  1. Create StarRocks cluster using the yaml below
    srce.yaml.txt
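The actual spec is in the srce.yaml.txt attachment above and is not reproduced here. For orientation only, a KubeBlocks Cluster object for StarRocks with fe and be components looks roughly like the sketch below; the API version, addon name, field names and replica counts are assumptions and may differ from the attachment.

apiVersion: apps.kubeblocks.io/v1alpha1   # assumed API version; KubeBlocks 1.0.0-beta may use a newer one
kind: Cluster
metadata:
  name: strsce-ixtqee
  namespace: default
spec:
  clusterDefinitionRef: starrocks         # assumed addon name
  terminationPolicy: Delete
  componentSpecs:
    - name: fe                            # StarRocks frontend
      componentDefRef: fe                 # assumed component definition reference
      replicas: 2                         # assumed initial count (fe-0/fe-1 remain after the final scale-in)
      volumeClaimTemplates:
        - name: data
          spec:
            accessModes: ["ReadWriteOnce"]
            resources:
              requests:
                storage: 20Gi
    - name: be                            # StarRocks backend
      componentDefRef: be
      replicas: 2
      volumeClaimTemplates:
        - name: data
          spec:
            accessModes: ["ReadWriteOnce"]
            resources:
              requests:
                storage: 20Gi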
  2. Scale out/in be and fe: succeeded
 kbcli cluster list-ops strsce-ixtqee --status all  --namespace default
NAME                                    NAMESPACE   TYPE                CLUSTER         COMPONENT   STATUS    PROGRESS   CREATED-TIME                 
strsce-ixtqee-horizontalscaling-zjrtg   default     HorizontalScaling   strsce-ixtqee   be          Succeed   3/3        Jan 06,2025 17:54  
strsce-ixtqee-horizontalscaling-wpwhd   default     HorizontalScaling   strsce-ixtqee   be          Succeed   3/3        Jan 06,2025  
strsce-ixtqee-horizontalscaling-mfn6h   default     HorizontalScaling   strsce-ixtqee   fe          Succeed   3/3        Jan 06,2025 17:58 
strsce-ixtqee-horizontalscaling-569bg   default     HorizontalScaling   strsce-ixtqee   fe          Succeed   3/3        Jan 06,2025 18:02 
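For reference, the four succeeded ops above would have come from scale-out/scale-in commands roughly like the ones below. This is a sketch mirroring the scale-in command shown in the next step; the scale-out subcommand and the reading of --replicas as the number of replicas changed per op (matching the 3/3 PROGRESS) are assumptions.

kbcli cluster scale-out strsce-ixtqee --components be --replicas 3 --namespace default
kbcli cluster scale-in  strsce-ixtqee --components be --replicas 3 --namespace default
kbcli cluster scale-out strsce-ixtqee --components fe --replicas 3 --namespace default
kbcli cluster scale-in  strsce-ixtqee --components fe --replicas 3 --namespace default   # the op after which fe-0 goes into Error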
  3. Check pod status
kbcli cluster scale-in strsce-ixtqee --auto-approve --force=true --components fe --replicas 3 --namespace default
===> pod got error after scaling in the fe component

k get pod | grep strs
strsce-ixtqee-be-0               1/1     Running    0               56m
strsce-ixtqee-be-1               1/1     Running    0               56m
strsce-ixtqee-fe-0               0/1     Error      0               57m
strsce-ixtqee-fe-1               1/1     Running    0               57m
  4. Describe pod
k describe pod strsce-ixtqee-fe-0

Events:
  Type    Reason                  Age                 From                     Message
  ----    ------                  ----                ----                     -------
  Normal  Scheduled               57m                 default-scheduler        Successfully assigned default/strsce-ixtqee-fe-0 to cn-zhangjiakou.10.0.0.145
  Normal  SuccessfulAttachVolume  57m                 attachdetach-controller  AttachVolume.Attach succeeded for volume "d-8vbg6dqw5lh0gptd9f8u"
  Normal  AllocIPSucceed          57m                 terway-daemon            Alloc IP 10.0.0.2/24 took 33.341592ms
  Normal  Pulled                  7m6s (x2 over 57m)  kubelet                  Container image "apecloud-registry.cn-zhangjiakou.cr.aliyuncs.com/apecloud/fe-ubuntu:3.3.0" already present on machine
  Normal  Created                 7m6s (x2 over 57m)  kubelet                  Created container fe
  Normal  Started                 7m6s (x2 over 57m)  kubelet                  Started container fe
  5. Logs

report-cluster-strsce-ixtqee-2025-01-06-18-22-14
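The attached report was presumably collected with kbcli's report command, and the output of the errored fe container can be pulled directly from the pod; the two commands below are a sketch rather than content taken from the report.

kbcli report cluster strsce-ixtqee --namespace default    # assumed source of the report-cluster-strsce-ixtqee-* archive
kubectl logs strsce-ixtqee-fe-0 --namespace default       # log of the errored fe container (add --previous after a restart)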

Expected behavior
The remaining fe pods should stay Running after the fe scale-in completes, instead of one of them entering the Error state.


tianyue86 added the kind/bug label on Jan 6, 2025
tianyue86 added this to the Release 1.1 milestone on Jan 6, 2025
shanshanying (Contributor) commented:

Known issue. It is recommended to set FE with 3 replicas for HA.
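
A minimal sketch of that recommendation, assuming the kbcli flags behave as in the commands above: bring fe back to three replicas and verify that all fe pods are Running.

kbcli cluster scale-out strsce-ixtqee --components fe --replicas 1 --namespace default   # assumed: add one fe so the cluster runs 3 FE replicas
kubectl get pod | grep strsce-ixtqee-fe                                                  # expect three fe pods in Running state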
