Ticket 13242

Summary: job_container/tmpfs and xauthority
Product: Slurm
Reporter: Manuel Holtgrewe <manuel.holtgrewe>
Component: Configuration
Assignee: Marcin Stolarek <cinek>
Status: RESOLVED FIXED
Severity: 4 - Minor Issue
CC: cinek, felip.moll, lyeager, mcmullan
Version: 22.05.x
Hardware: Linux
OS: Linux
See Also: https://bugs.schedmd.com/show_bug.cgi?id=13085
https://bugs.schedmd.com/show_bug.cgi?id=12361
https://bugs.schedmd.com/show_bug.cgi?id=14595
https://bugs.schedmd.com/show_bug.cgi?id=15074
Site: Berlin Institute of Health
Version Fixed: 22.05.0

Description Manuel Holtgrewe 2022-01-22 05:07:06 MST
I'm using job_container/tmpfs. Users have problems with X11 connections because they cannot see their xauthority file in /tmp (each job gets a private /tmp).

Can this be fixed with an init script?

Thanks,
Manuel
Comment 2 Marcin Stolarek 2022-01-26 05:57:44 MST
Manuel,

I'm looking into the details of what you actually asked, but just wanted to check if you are aware of:
>X11Parameters=home_xauthority[1] ?

This makes Slurm use the default location of the file (~/.Xauthority), mitigating the JobContainerType=job_container/tmpfs issue you ran into.

Obviously, if the home directory resides on shared storage, using --x11=all (the default) will result in multiple clients trying to lock the same file, which may be slow for jobs running on multiple hosts.
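For reference, a minimal sketch of the relevant slurm.conf settings discussed here (not a complete configuration; the PrologFlags line is included because Slurm's built-in X11 forwarding requires it, per the slurm.conf documentation):

```ini
# Sketch only -- merge into an existing slurm.conf.
# Built-in X11 forwarding requires this flag:
PrologFlags=X11
# Write the xauthority cookie to ~/.Xauthority instead of a file
# under /tmp, so it remains visible despite the private per-job /tmp:
X11Parameters=home_xauthority
# The container plugin that gives each job its own /tmp:
JobContainerType=job_container/tmpfs
```

If home is on shared storage, users can limit forwarding to a single node (e.g. srun --x11=first) to avoid several nodes contending for the lock on the same ~/.Xauthority file.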

cheers,
Marcin
[1]https://slurm.schedmd.com/slurm.conf.html#OPT_home_xauthority
Comment 27 Marcin Stolarek 2022-05-16 05:11:57 MDT
Manuel,

I'd like to let you know that a fix for the bug[1] was merged into our main repository. It will be part of the Slurm 22.05 release.

Let me know if you have any questions. In case of no reply, I'll close the bug report as fixed.

cheers,
Marcin
[1]https://github.com/SchedMD/slurm/commit/2b2a11dc8794234d3c452ba8021d0ca32811f3a4
Comment 28 Manuel Holtgrewe 2022-05-16 05:39:49 MDT
Dear sender, thank you for your email! I am out of office until June 6 without access to my email. I will read your email when I am back. Please contact hpc-helpdesk@bih-charite.de for anything related to HPC and cubi-helpdesk@bih-charite.de for everything else. In urgent cases please contact dieter.beule@bih-charite.de. Kind Regards, Manuel Holtgrewe
Comment 29 Marcin Stolarek 2022-05-16 05:41:40 MDT
OK, I'm closing the bug as fixed.

Should you have any questions please don't hesitate to reopen.

cheers,
Marcin