I have another question I would like to ask you. Below is my attempt to optimize the SiC primitive cell, but the run fails with an error.
Input file:
&CONTROL
calculation='vc-relax', disk_io='low', prefix='pwscf',
pseudo_dir='//soft/pslibrary.1.0.0/pbe/PSEUDOPOTENTIALS', outdir='./tmp', verbosity='high'
tprnfor=.true., tstress=.true., forc_conv_thr=1.0d-5
/
&SYSTEM
ibrav= 0,
nat= 2, ntyp= 2,
occupations = 'smearing', smearing = 'gauss', degauss = 1.0d-9
ecutwfc= 50, ecutrho = 500,
/
&ELECTRONS
electron_maxstep = 100
conv_thr = 1.0d-9
mixing_mode = 'plain'
mixing_beta = 0.8d0
diagonalization = 'david'
/
&IONS
ion_dynamics='bfgs'
/
&CELL
press_conv_thr=0.1
/
CELL_PARAMETERS (angstrom)
3.0745003223 0.0000000000 0.0000000000
1.5372501612 2.6625953831 0.0000000000
1.5372501612 0.8875317944 2.5103190013
/
ATOMIC_SPECIES
Si 28.08550 Si.pbe-n-kjpaw_psl.1.0.0.UPF
C 12.01070 C.pbe-n-kjpaw_psl.1.0.0.UPF
/
ATOMIC_POSITIONS (angstrom)
Si 0.000000000 0.000000000 0.000000000
C 1.537250161 0.887531794 0.627579750
K_POINTS {automatic}
4 4 4 0 0 0
Output file:
[[57252,1],0]: A high-performance Open MPI point-to-point messaging module
was unable to find any relevant network interfaces:
Module: OpenFabrics (openib)
Host: hmgao
Another transport will be used instead, although this may result in
lower performance.
NOTE: You can disable this warning by setting the MCA parameter
btl_base_warn_component_unused to 0.
--------------------------------------------------------------------------
Program PWSCF v.6.0 (svn rev. 13079) starts on 20Mar2020 at 6:25: 1
This program is part of the open-source Quantum ESPRESSO suite
for quantum simulation of materials; please cite
"P. Giannozzi et al., J. Phys.:Condens. Matter 21 395502 (2009);
URL http://www.quantum-espresso.org",
in publications or presentations arising from this work. More details at
http://www.quantum-espresso.org/quote
Parallel version (MPI), running on 1 processors
Waiting for input...
Reading input from standard input
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Error in routine card_cell_parameters (2):
two occurrences
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
stopping ...
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
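One thing I notice on re-reading the input: the CELL_PARAMETERS and ATOMIC_SPECIES cards are each followed by a lone '/'. As far as I understand, only the namelists (&CONTROL ... /) are closed with a '/'; the cards (CELL_PARAMETERS, ATOMIC_SPECIES, ATOMIC_POSITIONS, K_POINTS) take no terminator at all. Since the error is raised in card_cell_parameters, my guess is that the stray '/' makes the card parser think CELL_PARAMETERS occurs a second time. If that guess is right, the card section would simply read (same data, '/' lines removed):
CELL_PARAMETERS (angstrom)
3.0745003223 0.0000000000 0.0000000000
1.5372501612 2.6625953831 0.0000000000
1.5372501612 0.8875317944 2.5103190013
ATOMIC_SPECIES
Si 28.08550 Si.pbe-n-kjpaw_psl.1.0.0.UPF
C 12.01070 C.pbe-n-kjpaw_psl.1.0.0.UPF
ATOMIC_POSITIONS (angstrom)
Si 0.000000000 0.000000000 0.000000000
C 1.537250161 0.887531794 0.627579750
K_POINTS {automatic}
4 4 4 0 0 0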
Could you tell me what is going on here? Thank you!