Problem with MPI_SCATTERV
Hi All, I ran into a problem with MPI_SCATTERV. When the sendcounts (scounts) are the same for every process it works fine, but if the sendcounts differ (e.g., 1, 2, 3, 4 for four processes) there...
View Article

mpiexec.hydra -ppn 1 and intel-mpi 4.1.2.040
I have just installed intel-mpi 4.1.2.040 on a cluster... If I use mpiexec.hydra to start jobs one per node... it still spawns processes on all available resources... mpiexec.hydra -ppn 1 hostname on...
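For reference, the intended usage looks like the sketch below (hypothetical host file; under a batch scheduler the allocation can override these options, which is a likely cause of the behaviour described):

```shell
# Hydra launcher: -ppn N (alias -perhost N) caps processes per node.
# Hypothetical host file path and process count.
mpiexec.hydra -ppn 1 -n 4 -f ./hostfile hostname

# The same per-node placement can be requested via the environment:
I_MPI_PERHOST=1 mpiexec.hydra -n 4 -f ./hostfile hostname
```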
View Article

Malfunctioning of some MPI subroutines with Ifort Compiler for Windows
Hi, I recently started programming with the MPICH2 library together with the Intel Visual Fortran Composer for Windows. Even though my test code recognizes the MPI library, many of the simple MPI subroutines...
View Article

Problem with MPI_FILE_OPEN on Windows
I'm sorry, this is the second time I have asked this question. I'm trying to use the MPI subroutine MPI_FILE_OPEN, and even with very simple code it doesn't work and I don't understand why! The other...
View Article

Intel Trace Collector Crashing with Large Number of Cores
Dear Support, I am currently running on Red Hat Linux 6.2 64-bit with Intel compilers 12.1.0 and Intel MPI 4.0.3.008 over QLogic InfiniBand QDR (PSM). I am also using Intel Trace...
View Article

Intel MPI: MPI_Comm_connect with I_MPI_FABRICS tmi results in an error
Hello, I have two programs which are connected at runtime via MPI_Comm_connect. If I use the dapl fabric everything works fine:
mpirun -genv I_MPI_FABRICS dapl -np 1 ping.exe
mpirun -genv I_MPI_FABRICS...
View Article

Multi-threading with Intel MPI
I am trying to establish communication between threads on different processes using Intel MPI. However, it does not seem to work as I would expect it to. Here's the code snippet:...
View Article

Dual-rail MPI binding
Dear experts, I would like to confirm that I am doing things properly. Here is my situation: the new cluster at my institution has two Mellanox Connect-IB cards in each node. Each node is a dual-socket six-core...
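For a two-rail setup like this, the Intel MPI 4.x reference manual documents multi-rail controls for the OFA fabric; a hedged sketch follows (variable names and values should be checked against your library version's manual):

```shell
# Assumed settings from the Intel MPI 4.x OFA multi-rail documentation;
# verify the exact names in your version's reference manual.
export I_MPI_FABRICS=shm:ofa
export I_MPI_OFA_NUM_ADAPTERS=2   # use both Connect-IB HCAs per node
mpirun -n 24 ./your_app           # hypothetical job size and binary
```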
View Article

Problem with the use of Intel MPI DAPL option on our Windows PC cluster
Hi, I cannot run the Intel MPI Benchmarks 4.0 on our Windows cluster when using DAPL. The Intel MPI Benchmarks 4.0 work well with the TCP option. In terms of the Intel MPI library for...
View Article

Problem with the use of Intel MPI DAPL fabric
Hi, on our Windows PC cluster, when I tried to run the Intel MPI Benchmarks 4.0 over the DAPL fabric, the following error always occurred. Could you tell me the reason? Thanks in advance. By the...
View Article

SMPD: create process failed
Hi, I'm trying to parallelize my Fortran code under Composer XE 2013 with the Intel MPI library. To create my processes I'm using the Simple Multipurpose Daemon (SMPD). The problem is:...
View Article

PMPI_Bcast: Message truncated
Hi, I am trying to debug some problems getting an exe developed by another group in our company to run on Intel MPI. I am using the Linux version 4.1. Debug output is below.... Does the error indicate a...
View Article

Problem with OFED dapl2test.exe
Hi, could someone tell me how to fix the following problem? On our Windows PC cluster, Intel MPI 5.0 and WinOFED 3.2 are installed. I am trying to test whether the DAPL fabric works well with Intel MPI....
View Article

Regarding the use of Intel MPI DAPL fabric under a Windows PC cluster
We have a Windows InfiniBand PC cluster using a Mellanox M2401G 24-port 20G IB switch. Under Windows, however, the Mellanox card does not support DAPL, and this point was further confirmed by the...
View Article

MPI_Comm_spawn fails if called concurrently
I am using Intel(R) MPI Library for Linux* OS, Version 4.1 Update 1 Build 20130522 on a Linux Cluster environment. Running the following script will produce a race condition. All used libraries are...
View Article

Erroneous [pmi_proxy] left behind
My application makes heavy use of MPI_Comm_spawn calls to dynamically create and abandon processes. I am using Intel(R) MPI Library for Linux* OS, Version 4.1 Update 1 Build 20130522 on a Linux Cluster...
View Article

MPI run crashes on more than one node
Hello, Kind Attn: James Tullos (Intel). I am facing a problem similar to http://software.intel.com/en-us/forums/topic/487208 and http://software.intel.com/en-us/forums/topic/280123#comment- My mpirun...
View Article

Ifort vectorization not efficient
Dear Intel developers, I have a piece of Fortran code where my program spends a lot of time:
k = 0
id = 1
do j = start, end
  do i = 1, ns(j)
    k = k + 1
    if (selectT(lx00(i), j, id) > 1.00) &
...
View Article

MPI_Init crash after setting I_MPI_STATS
After setting I_MPI_STATS=4, Intel MPI 4.1.3 on Linux crashes at MPI_Init:
*** glibc detected *** free(): invalid pointer: 0x0000000002933384 ***
======= Backtrace: =========...
View Article