Ticket 13242 - job_container/tmpfs and xauthority
Summary: job_container/tmpfs and xauthority
Status: RESOLVED FIXED
Alias: None
Product: Slurm
Classification: Unclassified
Component: Configuration
Version: 22.05.x
Hardware: Linux
Severity: 4 - Minor Issue
Assignee: Marcin Stolarek
QA Contact:
URL:
Depends on:
Blocks:
 
Reported: 2022-01-22 05:07 MST by Manuel Holtgrewe
Modified: 2022-11-09 07:42 MST
CC List: 4 users

See Also:
Site: Berlin Institute of Health
Slinky Site: ---
Alineos Sites: ---
Atos/Eviden Sites: ---
Confidential Site: ---
Coreweave sites: ---
Cray Sites: ---
DS9 clusters: ---
Google sites: ---
HPCnow Sites: ---
HPE Sites: ---
IBM Sites: ---
NOAA Site: ---
NoveTech Sites: ---
Nvidia HWinf-CS Sites: ---
OCF Sites: ---
Recursion Pharma Sites: ---
SFW Sites: ---
SNIC sites: ---
Tzag Elita Sites: ---
Linux Distro: ---
Machine Name:
CLE Version:
Version Fixed: 22.05.0
Target Release: ---
DevPrio: ---
Emory-Cloud Sites: ---


Attachments

Description Manuel Holtgrewe 2022-01-22 05:07:06 MST
I'm using job_container/tmpfs. Users have problems with X11 connections because they cannot see their xauthority file in the per-job /tmp.

Can this be fixed with an init script?

Thanks,
Manuel
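
For context, the "init script" mentioned here refers to the InitScript hook in job_container.conf. A minimal sketch follows, assuming that option is available in the Slurm release in use; the paths are purely illustrative, and the thread below ends up recommending X11Parameters=home_xauthority instead:

    # /etc/slurm/job_container.conf -- hedged sketch; parameter names from the
    # job_container.conf documentation, paths are examples only
    AutoBasePath=true
    BasePath=/var/spool/slurm/containers
    InitScript=/etc/slurm/container_init.sh

    #!/bin/bash
    # /etc/slurm/container_init.sh -- hypothetical helper run by the
    # job_container/tmpfs plugin during per-job container setup; it only
    # demonstrates where site-specific setup could run and does not by
    # itself expose the user's xauthority file
    logger "job_container init script ran for job ${SLURM_JOB_ID:-unknown}"
    exit 0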
Comment 2 Marcin Stolarek 2022-01-26 05:57:44 MST
Manuel,

I'm looking into the details of what you asked, but first wanted to check whether you are aware of:
>X11Parameters=home_xauthority [1]

This makes Slurm use the default location of the xauthority file (~/.Xauthority), mitigating the JobContainerType=job_container/tmpfs issue you ran into.

Obviously, if the home directory resides on shared storage, using --x11=all (the default) will result in multiple clients trying to lock the same file, which may be slow for jobs running on multiple hosts.
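
For illustration, a minimal slurm.conf fragment combining these settings might look as follows (a sketch based on the parameters named in this ticket and the slurm.conf documentation, to be adapted to the local setup):

    # slurm.conf (excerpt) -- hedged example, not from this ticket
    JobContainerType=job_container/tmpfs
    PrologFlags=contain,x11        # Slurm's built-in X11 forwarding requires the x11 prolog flag
    X11Parameters=home_xauthority  # write the xauthority file to ~/.Xauthority instead of /tmp

A user would then request forwarding with something like srun --x11 xterm; as noted above, --x11=all with a shared home directory means every node contends for the same ~/.Xauthority lock.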

cheers,
Marcin
[1]https://slurm.schedmd.com/slurm.conf.html#OPT_home_xauthority
Comment 27 Marcin Stolarek 2022-05-16 05:11:57 MDT
Manuel,

I'd like to let you know that a fix for the bug[1] has been merged into our main repository. It will be part of the Slurm 22.05 release.

Let me know if you have any questions. If there is no reply, I'll close the ticket as fixed.

cheers,
Marcin
[1]https://github.com/SchedMD/slurm/commit/2b2a11dc8794234d3c452ba8021d0ca32811f3a4
Comment 28 Manuel Holtgrewe 2022-05-16 05:39:49 MDT
Dear sender, thank you for your email! I am out of office until June 6 without access to my email. I will read your email when I am back. Please contact hpc-helpdesk@bih-charite.de for anything related to HPC and cubi-helpdesk@bih-charite.de for everything else. In urgent cases please contact dieter.beule@bih-charite.de. Kind Regards, Manuel Holtgrewe
Comment 29 Marcin Stolarek 2022-05-16 05:41:40 MDT
OK, I'm closing the bug as fixed.

Should you have any questions, please don't hesitate to reopen.

cheers,
Marcin