Guest name 'TestNode-node' is already in use.
Description
Activity

Former user January 10, 2017 at 2:22 PM
Adding snapshot removal seems like a great idea.

Former user January 10, 2017 at 12:59 PM
I logged into the builder to check - the VM does indeed have a snapshot:
[root@vm0085 ~]# virsh snapshot-list TestNode-node
 Name                 Creation Time                 State
------------------------------------------------------------
 1479982346           2016-11-24 10:12:26 +0000     running
Following [1], I deleted the snapshot and could then remove the VM itself:
[root@vm0085 ~]# virsh snapshot-delete TestNode-node 1479982346
Domain snapshot 1479982346 deleted
[root@vm0085 ~]# virsh undefine --remove-all-storage TestNode-node
Storage volume 'vda'(/var/tmp/TestNode-node.qcow2) is not managed by libvirt. Remove it manually.
Domain TestNode-node has been undefined
Can we add this to the pre- and post-run cleanups?
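A cleanup along those lines could look like the following hedged sketch (this is not the actual CI script, and the `cleanup_domain` helper name is hypothetical): delete every snapshot of a domain before undefining it, so that "cannot delete inactive domain with 1 snapshots" can no longer abort the cleanup.

```shell
#!/bin/sh
# Hedged sketch of the proposed cleanup step, not the actual CI script.
# Deletes all snapshots of a domain first, then destroys and undefines it,
# so `virsh undefine` no longer fails on inactive domains with snapshots.
cleanup_domain() {
    dom="$1"
    # snapshot-list --name prints one snapshot name per line (empty if none).
    for snap in $(virsh snapshot-list --name "$dom"); do
        virsh snapshot-delete "$dom" "$snap"
    done
    # Stop the domain if it is still running, then drop it with its storage.
    virsh destroy "$dom" 2>/dev/null || true
    virsh undefine --remove-all-storage "$dom"
}
```

In the pre-run step this would be driven by something like `for dom in $(virsh list --all --name); do cleanup_domain "$dom"; done`, mirroring the existing `virsh list | xargs` cleanup.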

Former user January 10, 2017 at 12:41 PM
It looks like we still fail to remove some VMs. In the case of the job in question, there is a cleanup step at the start of the run which also fails:
11:33:21 ++ virsh list --name
11:33:21 ++ xargs -rn1 virsh destroy
11:33:21 ++ virsh list --all --name
11:33:21 ++ xargs -rn1 virsh undefine --remove-all-storage
11:33:21 Storage volume 'hda'(/home/jenkins/workspace/ovirt-node-ng_ovirt-4.0_build-artifacts-el7-x86_64/ovirt-node-ng/build/diskPAVs0d.img) is not managed by libvirt. Remove it manually.
11:33:21 Storage volume 'hdb'(/home/jenkins/workspace/ovirt-node-ng_ovirt-4.0_build-artifacts-el7-x86_64/ovirt-node-ng/boot.iso) is not managed by libvirt. Remove it manually.
11:33:21 Domain LiveOS-8e80dfb0-6269-4691-8804-ecbbe9e2e582 has been undefined
11:33:21
11:33:21 Storage volume 'vda'(/var/tmp/TestNode-node.qcow2) is not managed by libvirt. Remove it manually.
11:33:21 error: Failed to undefine domain TestNode-node
11:33:21 error: Requested operation is not valid: cannot delete inactive domain with 1 snapshots
So a previous job failed to clean up, and the same thing happened again in this job. Is there any way to remove VMs with unmanaged volumes? Also, why might they be reported as unmanaged in the first place?
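On the second question: libvirt only manages volumes that live in a storage pool, so disks passed to virt-install as plain file paths (like /var/tmp/TestNode-node.qcow2 here) sit outside any pool and `--remove-all-storage` can only warn about them and leave the files behind. One hedged way to handle them (the `undefine_with_unmanaged` helper and the sed parsing are mine, not anything from the CI scripts) is to scrape the paths out of those warnings and delete the files manually:

```shell
#!/bin/sh
# Hedged sketch, not the actual CI cleanup: undefine a domain, then parse the
# "Storage volume '...'(<path>) is not managed by libvirt" warnings printed
# for volumes outside any storage pool, and remove those files by hand.
undefine_with_unmanaged() {
    dom="$1"
    # Capture both stdout and stderr so the warnings can be parsed either way.
    out=$(virsh undefine --remove-all-storage "$dom" 2>&1)
    echo "$out"
    # Pull the parenthesised path out of each warning line and delete it.
    echo "$out" |
        sed -n "s/^Storage volume '[^']*'(\([^)]*\)) is not managed by libvirt.*/\1/p" |
        while read -r path; do rm -f "$path"; done
}
```

This is best-effort: paths containing `)` or newlines would confuse the parsing, but for the qcow2/iso files under the Jenkins workspace it should be enough.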

Former user January 10, 2017 at 12:33 PM
Reopening this, as we are still seeing the failure, so the cleanup is apparently still incomplete:
http://jenkins.ovirt.org/job/ovirt-node-ng_ovirt-4.0_build-artifacts-el7-x86_64/208/console

Eyal Edri October 30, 2016 at 4:07 PM
Setting this as resolved, since the patch with the cleanup has been merged.
Please reopen if the problem wasn't fixed.

Sandro Bonazzola October 7, 2016 at 8:26 AM
On Fri, Oct 7, 2016 at 9:30 AM, Fabian Deutsch <fdeutsch@redhat.com> wrote:
> Feel free to review https://gerrit.ovirt.org/#/c/64511/
>
> Rebased, +1; please verify
> - fabian
>
> On Fri, Oct 7, 2016 at 9:26 AM, Sandro Bonazzola <sbonazzo@redhat.com>
> wrote:
>
>>
>> http://jenkins.ovirt.org/job/ovirt-node-ng_master_build-artifacts-el7-x86_64/138/
>>
>> ======================================================================
>> ERROR: test suite for <class 'testSanity.TestNode'>
>> ----------------------------------------------------------------------
>> Traceback (most recent call last):
>>   File "/usr/lib/python2.7/site-packages/nose/suite.py", line 208, in run
>>     self.setUp()
>>   File "/usr/lib/python2.7/site-packages/nose/suite.py", line 291, in setUp
>>     self.setupContext(ancestor)
>>   File "/usr/lib/python2.7/site-packages/nose/suite.py", line 314, in setupContext
>>     try_run(context, names)
>>   File "/usr/lib/python2.7/site-packages/nose/util.py", line 469, in try_run
>>     return func()
>>   File "/home/jenkins/workspace/ovirt-node-ng_master_build-artifacts-el7-x86_64/ovirt-node-ng/tests/testVirt.py", line 150, in setUpClass
>>     77)
>>   File "/home/jenkins/workspace/ovirt-node-ng_master_build-artifacts-el7-x86_64/ovirt-node-ng/tests/testVirt.py", line 88, in _start_vm
>>     dom = VM.create(name, img, ssh_port=ssh_port, memory_gb=memory_gb)
>>   File "/home/jenkins/workspace/ovirt-node-ng_master_build-artifacts-el7-x86_64/ovirt-node-ng/tests/virt.py", line 217, in create
>>     dom = sh.virt_install(*args, **kwargs)
>>   File "/usr/lib/python2.7/site-packages/sh.py", line 1021, in __call__
>>     return RunningCommand(cmd, call_args, stdin, stdout, stderr)
>>   File "/usr/lib/python2.7/site-packages/sh.py", line 486, in __init__
>>     self.wait()
>>   File "/usr/lib/python2.7/site-packages/sh.py", line 500, in wait
>>     self.handle_command_exit_code(exit_code)
>>   File "/usr/lib/python2.7/site-packages/sh.py", line 516, in handle_command_exit_code
>>     raise exc(self.ran, self.process.stdout, self.process.stderr)
>> ErrorReturnCode_1:
>>
>> RAN: '/bin/virt-install --import --print-xml --network=user,model=virtio --noautoconsole --memory=2048 --rng=/dev/random --memballoon=virtio --cpu=host --vcpus=4 --graphics=vnc --watchdog=default,action=poweroff --serial=pty --disk=path=/var/tmp/TestNode-node.qcow2,bus=virtio,format=qcow2,driver_type=qcow2,discard=unmap,cache=unsafe --check=all=off --channel=unix,target_type=virtio,name=local.test.0 --name=TestNode-node'
>>
>> STDOUT:
>>
>> STDERR:
>> ERROR    Guest name 'TestNode-node' is already in use.
>>
>> Seems to have been a run in an unclean environment. Not sure what caused this.
–
Sandro Bonazzola
Better technology. Faster innovation. Powered by community collaboration.
See how it works at redhat.com
<https://www.redhat.com/it/about/events/red-hat-open-source-day-2016>

Fabian Deutsch October 7, 2016 at 7:31 AM
Feel free to review https://gerrit.ovirt.org/#/c/64511/
fabian
–
Fabian Deutsch <fdeutsch@redhat.com>
RHEV Hypervisor
Red Hat