Intel MPI 4.1.2 DAPL error
Dear Experts, after an update of our cluster I started receiving DAPL errors. I am compiling my Fortran code with Intel MPI 4.1.2. The error occurs if I try to run my code with complicated models having...
Error from mpiexec - readv failed - errno 9 (Bad file descriptor)
Hi, I am running an MPI application using Intel MPI. I could run this application with 1 process successfully. While running it with more than 1 process I get the following error: 1: Fatal error in...
Intel Cluster Studio XE on VMs
Hi everyone, I was wondering if anyone has experience running the Intel MPI Library on a cluster consisting entirely of VMs. Is this possible? Which library or product is appropriate for this need?...
Unable to generate trace file with mpirun
Hello. I've been attempting to follow the instructions on this page: https://software.intel.com/sites/products/documentation/hpc/ics/itac/81/Getting_Started.htm using Intel mpirun, but have either not...
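For context, the usual shortcut with Intel MPI is the -trace option of mpirun, which preloads the ITAC collector library. The install path below is an assumption for a typical layout and needs adjusting:

```shell
# Make the ITAC environment available first (install path is an assumption)
source /opt/intel/itac/8.1/bin/itacvars.sh

# -trace preloads the tracing library; an .stf trace file set is written at MPI_Finalize
mpirun -trace -n 4 ./my_app
```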
Incorrect program or MPI implementation bug?
Hi, below is a simple reproduction case for the issue we're facing:

#include "stdio.h"
#include "mpi.h"
#include "stdlib.h"

int main(int argc, char* argv[]) {
    int rank;
    MPI_Group group;
    ...
Problem launching Windows console application using CreateProcess from MPI...
All, I have a working console application (call it myApp) which is launched using mpiexec on Windows Server 2008. It works fine. However, I am now adding a feature wherein I want to launch a new...
Performance degradation with MPI_THREAD_MULTIPLE
Hello, we've been facing some scalability issues with our MPI-based distributed application, so we decided to write a simple microbenchmark program mimicking the same "all-to-all, pairwise"...
Memory leak in mpiu_shm_wrappers.h?
Hi all, I am working on an MPI application that appears to be leaking memory slowly. After running one of the MPI ranks under Inspector XE with the mi1 analysis type, I see a memory leak reported in...
Installation Problem - MOST of the Cluster is Good but Phi Nodes Not Working
I'm hoping you can help with the best way to fix this installation problem: we have a centrally-installed Intel Cluster Studio XE running on an HP DL380 server and 128 compute nodes - HP SL230s servers...
View Articlempiexec requires administrative privileges on Windows
Our organization has recently purchased Intel Cluster Studio. We have set up a few test cases to ensure that everything is working as expected and have come across an issue with using mpiexec on our...
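A frequent cause of this on Windows (an assumption, since the post is truncated) is that the process-manager service is not installed or user credentials are not registered. A sketch of the usual one-time setup, run from an elevated prompt:

```shell
hydra_service -install   :: install and start the process-manager service (admin, once)
mpiexec -register        :: cache the submitting user's credentials (prompts for password)
mpiexec -validate        :: verify that the stored credentials work
```

After registration, ordinary users should be able to run mpiexec without elevation.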
NO-IF paradigm for high-performance code.
Hi there, Amici (friends). I'd like to discuss here how to reduce conditional branches in algorithms to speed up our code ;D Recently, I wrote a QSort-based sorting routine to research the possible...
New Intel® Xeon Phi™ Cluster Integration webinar
Intel® Xeon Phi™ Cluster Integration – A Hands-on Introduction. Join us for a Webinar on August 12. Space is limited. Reserve your Webinar seat now. In this 4-hour instructor-led course the...
MPI 4.1 fails to end gracefully when ranks > 2000
I am testing Intel MPI 4.1 with test.c (the provided test program). Whenever I run with more than 2000 ranks the program executes correctly but fails to end gracefully. Running: mpiexec.hydra -n 2001 -genv...
How to use Intel MPI 5.0?
Dear all, I noticed that Intel has released Intel MPI 5.0, and I downloaded the latest version to test it. The command I used is as follows (it works with the latest Intel MPI 4.1.x): mpiexec -wdir...
Asking for suggestions on configuring and running a parallel program on a cluster
Dear all, I have a cluster with two kinds of nodes participating in the parallel calculation: the first kind has 2 CPUs with 4 cores per CPU and 32 GB of memory per node; the second kind...
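One common way to spread ranks over such mixed nodes (a sketch; the host names and rank counts are placeholders) is a machine file that gives a per-host process count:

```shell
# One line per node, in the form hostname:ranks-to-start-there
cat > hosts <<'EOF'
node-small:8
node-big:16
EOF

mpirun -machinefile hosts -n 24 ./a.out
```

Giving the bigger nodes more ranks keeps the memory per rank roughly comparable across the two node types.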
INTEL-MPI-5.0: Bug in MPI-3 shared-memory allocation...
Dear developers of Intel MPI, first of all: congratulations that Intel MPI now also supports MPI-3! However, I found a bug in Intel MPI 5.0 when running the MPI-3 shared-memory feature (calling...
INTEL-MPI-5.0: -prepend-rank on the mpirun command line does not work
Dear developers of Intel MPI, I found that the helpful option -prepend-rank does not work when launching a parallelized Fortran code with mpirun under Intel MPI 5.0: mpirun -binding...
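As a possible stop-gap rather than a fix (option name per the Hydra process manager): the launcher also accepts -l, which labels each output line with the originating rank, much like -prepend-rank:

```shell
# -l prefixes every stdout/stderr line with the MPI rank of its origin
mpiexec.hydra -l -n 4 ./a.out
```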
Using Intel MPI
Hello everyone. First, I have to provide this information:
1- I have installed the latest version of Intel MPI.
2- I have to use it through ANSYS HFSS 15 x64, which is EM software.
3- HFSS doesn't have any...
MPI Rank Binding
Hello all, Intel MPI 4.1.3 on RHEL 6.4: trying to bind ranks in two simple fashions: (a) 2 ranks to the same processor socket and (b) 2 ranks to different processor sockets. Looking at the Intel MPI...
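For reference, the two placements can be sketched with the I_MPI_PIN_PROCESSOR_LIST control (the map values are from the Intel MPI pinning syntax; verify the resulting layout with the cpuinfo utility or I_MPI_DEBUG=4 output):

```shell
# (a) pack 2 ranks onto the same socket: fill cores close together
mpirun -genv I_MPI_PIN_PROCESSOR_LIST allcores:map=bunch -n 2 ./a.out

# (b) spread 2 ranks across different sockets: place ranks far apart
mpirun -genv I_MPI_PIN_PROCESSOR_LIST allcores:map=scatter -n 2 ./a.out
```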
qdel not killing all processes started under Intel MPI
Hi, when we run using Intel MPI with the Hydra process manager (in a script submitted with qsub -- this is with OGS/GE 2011.11p1 on ROCKS 6.1 on a small blade cluster), qdel does not fully kill the job...