Laptop - Science topic
Explore the latest questions and answers in Laptop, and find Laptop experts.
Questions related to Laptop
I'm trying to run transistor I-V measurements, in particular a Vg sweep. I've successfully programmed other instruments such as the Agilent 4156C and the Keithley 2400 SourceMeter, so now I'm trying to program an Agilent B1500A.
I connected my laptop to the B1500A via a GPIB-USB-HS adapter and installed all the required drivers. I confirmed that the B1500A responds to the *IDN? command using the NI-VISA Test Panel.
I also found a website that helped me program the B1500A in Python on my laptop.
Following that website, I wrote the code below, but running it raises the error 'VisaIOError: VI_ERROR_TMO (-1073807339): Timeout expired before operation completed.' at the line
b1500.data_format(21, mode=1) # set data output format (required!); call after SMUs are initialized to get names for the channels
Even with that line removed, the next call,
b1500.meas_mode('STAIRCASE_SWEEP', *b1500.smu_references) # order in smu_references determines order of measurement
fails with the same error.
Does anybody know what causes this timeout and how to fix it?
I'd really appreciate any help.
Thank you.
-----------------------------------------------------------------------------------------------
import numpy as np
import pyvisa
import matplotlib.pyplot as plt
from pymeasure.instruments.agilent import AgilentB1500

# Define your instrument's VISA resource string
instrument_visa_address = "GPIB0::17::INSTR"

# Initialize the PyVISA resource manager and verify the connection
rm = pyvisa.ResourceManager()
try:
    instrument = rm.open_resource(instrument_visa_address, timeout=100000)
    print(f"Connected to instrument: {instrument.query('*IDN?')}")
except pyvisa.VisaIOError as e:
    print(f"Failed to connect to instrument: {e}")
    exit()
instrument.close()  # release the session; pymeasure opens its own below

# Initialize the Agilent B1500A from the VISA address; explicitly define
# read/write terminations if needed and set a sufficiently large timeout
# in milliseconds (or None)
b1500 = AgilentB1500(instrument_visa_address)

# query SMU config from instrument and initialize all SMU instances
b1500.initialize_all_smus()

# set data output format (required!)
b1500.data_format(21, mode=1)  # call after SMUs are initialized to get names for the channels

# choose measurement mode
b1500.meas_mode('STAIRCASE_SWEEP', *b1500.smu_references)  # order in smu_references determines order of measurement

# settings for individual SMUs
for smu in b1500.smu_references:
    smu.enable()  # enable SMU
    smu.adc_type = 'HRADC'  # set ADC to high-resolution ADC
    smu.meas_range_current = '10uA'
    smu.meas_op_mode = 'COMPLIANCE_SIDE'  # other choices: Current, Voltage, FORCE_SIDE, COMPLIANCE_AND_FORCE_SIDE

# general instrument settings
# b1500.adc_averaging = 1
# b1500.adc_auto_zero = True
b1500.adc_setup('HRADC', 'AUTO', 6)
# b1500.adc_setup('HRADC', 'PLC', 1)

# sweep settings
b1500.sweep_timing(0, 5, step_delay=0.1)  # hold, delay
b1500.sweep_auto_abort(False, post='STOP')  # disable auto abort; output the stop value of the sweep after measurement

# sweep source
nop = 11
Vg_start = -0.5
Vg_stop = 2.0
Vg_steps = 20
Vds_values = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0, 1.2, 1.4, 1.6, 1.8, 2.0]
Vg_values = []
Ids_values = []

for Vg in np.linspace(Vg_start, Vg_stop, Vg_steps):
    b1500.smu3.ramp_source('VOLTAGE', 'Auto Ranging', Vg, stepsize=0.1, pause=20e-3)
    for Vds in Vds_values:
        b1500.smu4.ramp_source('VOLTAGE', 'Auto Ranging', Vds, stepsize=0.1, pause=20e-3)

    # start measurement
    b1500.check_errors()
    b1500.clear_buffer()
    b1500.clear_timer()
    b1500.send_trigger()

    # read measurement data all at once
    b1500.check_idle()  # wait until measurement is finished
    data = b1500.read_data(2 * nop)  # factor 2 because of double sweep

    # append gate voltage and drain current values to lists
    Vg_values.append(Vg)
    Ids_values.append(data[0][3])

# plot Vg-Ids graph
plt.figure(figsize=(8, 6))
plt.plot(Vg_values, Ids_values, marker='o', linestyle='-')
plt.xlabel('Gate Voltage (V)')
plt.ylabel('Drain Current (A)')
plt.title('Vg-Ids Characteristics')
plt.grid(True)
plt.show()

'''
# alternatively: read measurement data live
meas = []
for i in range(nop * 2):
    read_data = b1500.read_channels(4 + 1)  # 4 measurement channels, 1 sweep source (returned due to mode=1 of data_format)
    # process live data for plotting etc.
    # data format for every channel: (status code, channel name e.g. 'SMU1', data name e.g. 'Current Measurement (A)', value)
    meas.append(read_data)
'''

# sweep constant sources back to 0 V
b1500.smu3.ramp_source('VOLTAGE', 'Auto Ranging', 0, stepsize=0.1, pause=20e-3)
b1500.smu4.ramp_source('VOLTAGE', 'Auto Ranging', 0, stepsize=0.1, pause=20e-3)
-----------------------------------------------------------------------------------------------
When I load a .pcr file into FullProf, it always shows 'Re-load PCR file'.
I have entered all the values correctly, but it still shows this error.
The software is also slow on Windows 10 (latest 2021 version).
I am using the latest version of FullProf.
It works on Windows 7, but not on my new laptop.
Please help me.
Molecular Organic Framework, Laptop, Prototyping
Hey everyone!
I can't run any calculations in Gaussian 03 or 09. They stop while running the l302.exe module on my new computer.
Processor: 13th Gen Intel(R) Core(TM) i9-13900HX 2.20 GHz
RAM 32.0 GB
On other computers, it runs fine with the same software.
Thank you very much!!
I have installed the Gaussian 09W software on my laptop, a Eurocom Nightsky ARX15 with a Ryzen 9 3950X 16-core processor. When I submit a calculation through GaussView, it gets stuck, even for a small calculation with 5 atoms. Please give me your suggestions.
details:
MIPS USB Cameras provide a quick and easy means of displaying and capturing high-quality video and images on any USB 2.0-equipped desktop or laptop computer running a supported Microsoft® OS.
please send me.
thanks
Karthick
Hi!
I am facing technical problems on my MacBook Pro with editing and typing mathematical content in files created in Microsoft Word. Until now I used MathType on Windows, but the mathematical content of these files is no longer accessible. I have bought MathType compatible with Mac and Windows, but I am still not able to work with it. Please help me and guide me to any other software I can use for mathematics on a Mac that is also accessible on Windows, or that my students can use on Windows while remaining accessible on a Mac.
Test the hypothesis: "DIGITAL DENKEN", based on the basic software "PLAI" (polytrope linguistic artificial intelligence), complies with EU recommendations and EU rules for multiple AI software systems within one system.
We will make the software system available for examination. PLAI runs on a commercially available computer. DIGITAL DENKEN includes human workshops combined with software usage down to laptops. PLAI offers at least 12 different usage approaches. Examine one of them now as a first step.
Our lab has a Leica DCF290 HD camera hooked up to our scope. We used to have a desktop computer to power it and to run the Leica software for taking pictures. The desktop has died, and we are trying to find a way to use the camera with laptops. The camera is powered over a FireWire connection, and we used to have a cord that plugged into a monitor-like port on the desktop. I have contacted Leica, and they do not have any adapter for using a laptop or a wall socket as a power source. Does anyone have an idea of how we can power this camera without buying an entire desktop computer?
time, lack of computer, laptop, desktop, etc.
I am trying to run a protein-ligand molecular dynamics simulation for 100 nanoseconds. Could it be run on a laptop with 8 GB of RAM, or do we need more powerful computers?
Hey everybody, how can I translate the CPU time measured on one computer to another? I ran a program on a laptop and have its execution time; I need to estimate how long it would take on another laptop, based on CPU speed and RAM. Is there a formula to estimate this?
I found a linear scaling based on CPU clock speeds, but I wasn't sure whether it is correct.
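A linear clock-speed scaling is a reasonable first-order estimate, but only that: it assumes both machines do the same amount of work per clock cycle, and it ignores differences in microarchitecture (IPC), cache, RAM speed, and I/O. A minimal sketch of that estimate, with made-up example numbers:

```python
def estimate_runtime(t_measured, freq_measured_ghz, freq_target_ghz):
    """First-order estimate: assume runtime scales inversely with clock speed.

    This ignores IPC, cache, RAM speed, and I/O differences, so treat the
    result as a rough guide rather than a prediction.
    """
    return t_measured * (freq_measured_ghz / freq_target_ghz)

# Example: a job that took 120 s on a 2.5 GHz laptop, estimated on a 3.5 GHz one
print(estimate_runtime(120.0, 2.5, 3.5))  # ~85.7 s
```

A more trustworthy approach is empirical: run a small representative benchmark (or a shortened version of your program) on both machines and use the measured ratio instead of the clock-speed ratio.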
Observations and case studies of university students, and the behavior of most young people and even the elderly, show a serious addiction to mobile phones, which have become a refuge for most people. The danger lies in the addiction itself and in the time and effort wasted browsing websites and entertainment, as well as in how these devices steer behavior in the wrong direction, especially for young people. This poses a serious threat to their sound behavior and even to their future, and it causes great social and economic damage. All of the above leads us to conclude that mobile phones and laptop computers, instead of being useful tools that raise scientific, social, and economic standards, have turned into harmful electronic drugs.
I have a single file of 3,827 molecules in PDB format. When I load the file into PyRx, it cannot load the molecules: the remaining time shows 699 hours and keeps increasing. My laptop has 8 processors. I tried many times and waited 10-15 minutes, but the system hangs.
I'm looking to upgrade my laptop to a new one, or to a desktop. I've found that I can fiddle around with structures on my current 6th-gen i7, 8 GB Lenovo 700 with its integrated graphics (https://www.techpowerup.com/gpu-specs/hd-graphics-520.c2783), but rendering final >600 dpi publication images or heavy electron density maps is not happening.
Many people prefer Macs and MacBooks for such graphical tasks, but I prefer staying on Linux. If someone could suggest a laptop, desktop, or graphics card model (Linux compatible) that makes molecular graphics an enjoyable work task, that would be much appreciated :)!
Hello everyone,
Below is the error I obtained in the log file of the Gaussian output.
"
Symmetrizing basis deriv contribution to polar:
IMax=3 JMax=2 DiffMx= 0.00D+00
G2DrvN: will do 41 centers at a time, making 2 passes doing MaxLOS=1.
Calling FoFCou, ICntrl= 3107 FMM=T I1Cent= 0 AccDes= 0.00D+00.
Calling FoFCou, ICntrl= 3107 FMM=T I1Cent= 0 AccDes= 0.00D+00.
FoFDir/FoFCou used for L=0 through L=1.
End of G2Drv Frequency-dependent properties file 721 does not exist.
End of G2Drv Frequency-dependent properties file 722 does not exist.
Internal input file was deleted!
Error termination via Lnk1e at Tue Dec 18 10:01:22 2018.
Error: segmentation violation
rax 0000000000000000, rbx 00000000011b1aa8, rcx ffffffffffffffff
rdx 00000000000066cd, rsp 00007ffca4542598, rbp 00007ffca45425f0
rsi 000000000000000b, rdi 00000000000066cd, r8 00002ae9f0a59180
r9 0000000000000000, r10 00007ffca4542320, r11 0000000000000202
r12 00007ffca4548a40, r13 00007ffca4548a40, r14 0000000000000000
r15 00002ae9f0a5a010
--- traceback not available "
What does this error mean? When I ran this calculation with the Gaussian09 package installed on my laptop, there was no issue and I obtained the expected output file. But when I submit the same file on the cluster, I receive the above error. Why is this so? Can anyone help solve this? Thank you very much in advance.
What laptop specifications are required to run long electronic-structure and optical-property calculations, e.g., hybrid functional methods? Kindly share your experience.
These legacy private networks were suitable for connecting laptops to the Internet and for other limited industrial IoT (IIoT) use cases. However, the coverage and security limitations of these networks, their incompatibility with public cellular networks, as well as their high costs of ongoing management have made it difficult for organizations to use these networks for many IIoT applications. How to reconcile the two? Any related papers? Thank you in advance! To see more: https://www.sierrawireless.com/iot-blog/what-are-private-lte-networks/
Dear researchers, I have installed the free version of Quartus (Quartus-lite-21.1.1.850-windows) on my laptop, but it only works for simulation.
I need it to program an FPGA board.
I did all the steps as shown in the attached picture:
1- Add file
2- Add a device,
but the Start button in the Programmer window is not active.
Was that because I don't have a license for this software?
Please, could anyone who has used this software help me, or suggest another solution?
Thank you for your help.

I am currently doing my final year project, so I am using AutoDock Tools. But the display on my laptop doesn't seem right: it is grainy and dark, and I can't see my receptor clearly.
Good day everyone,
How long does it take to run the analysis using SPH?
I am using a normal laptop.
Thank you.
I'm working on ZnSb. I have computed the band gap, Fermi energy, NSCF, and DOS, and I am running phonon calculations, but my laptop does not have enough space for them, so I moved all the folders to another partition where my Windows files have space. That partition turned out to be read-only. Now I'm trying to run a charge-density calculation, but I cannot create the input file there. All my downloads are in the Linux download folder, and I'm also unable to create the input file for the charge density there.
So should I change my pseudopotential path for this, and if so, how can I change the path for the program?
I was revising my manuscript, incorporating major comments from two reviewers, and had almost finished. The power kept going on and off, and because of a dead battery my laptop restarted several times. I transferred the file from the desktop to drive D and a memory stick. When I try to open the file, it gives me the message "We are sorry. We cannot open file.docx because we found a problem with its content". I tried several recovery tools, but none of them worked. What shall I do, please? I spent more than a month on the revision and completely changed the content; now I have lost all motivation and am starting from zero. I am also not in good condition. Would anybody assist me?
Thank you
I was working in the TeXstudio editor on my laptop. Suddenly my laptop had a problem and restarted automatically. When I reopened the .tex file, my entire work was lost; the file was totally blank. I tried to compile and run that blank file, which may have corrupted the output .pdf file. I tried several methods, like restoring the previous version, but nothing worked.
Anybody please help me to recover the files?
I am about to start working on 6D pose estimation for object grasping based on point clouds. In our lab we have the following:
- AUBO i5 industrial manipulator.
- COBOT 3D camera that gives us a point cloud of the scene; the camera will be attached to the manipulator in the eye-in-hand configuration (mounted on the manipulator's gripper/end effector).
A deep learning based method will be used for 6D pose estimation of the target object.
The 6D pose will be estimated on my laptop. How can I send the final pose estimate to the robot in order to control it and eventually pick and place the target object?
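In practice the robot side is usually driven through the vendor's SDK or a ROS driver, but the laptop-to-robot link itself can be as simple as one socket message per pose. A hedged sketch of that link: the JSON message format, host, and port below are hypothetical placeholders, and whatever server runs on the robot side (AUBO SDK wrapper, ROS bridge, etc.) defines the real protocol you must match.

```python
import json
import socket

def send_pose(host, port, pose):
    """Send a 6D pose (x, y, z in meters; rx, ry, rz in radians) as one JSON
    line over TCP.

    The message format here is a hypothetical example: a real AUBO i5 is
    normally commanded through its SDK or a ROS driver, so adapt the payload
    to whatever server actually listens on the robot side.
    """
    msg = (json.dumps({"pose": pose}) + "\n").encode("utf-8")
    with socket.create_connection((host, port), timeout=5.0) as sock:
        sock.sendall(msg)

# Example (hypothetical controller address and port):
# send_pose("192.168.1.10", 5000, [0.35, -0.12, 0.40, 0.0, 3.14, 0.0])
```

Note that the pose your network estimates is in the camera frame; with an eye-in-hand setup you must compose it with the hand-eye calibration transform and the current end-effector pose before sending a target to the robot.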
I have a dual-axis solar tracker using four light-detecting sensors, powered by an Arduino and positioned by two servo motors. The difference between the tracked and static panels is too low. Any ideas why? I am not using MPPT, just regular power recording through the Arduino to my laptop.
When I process analyzed data with the Xcalibur program, there is an error:
I cannot open the raw file in the chromatogram range.
I moved the data to my laptop, but I cannot read it there either.
How can I solve this problem?

Hi everyone,
I would like to know if MacBook laptops are able to run software commonly used in research (e.g., statistical software, MATLAB, etc.). Does anyone have recommendations to share?
Many thanks,
Is there a standardized questionnaire for qualitative research on laptop battery life or Li-ion batteries?
Our qualitative study, "Employed Methods in Maintaining Laptop Battery", needs a standardized questionnaire, but after searching almost everywhere, I cannot find one. Do you know any websites that contain standardized questionnaires? Thank you.
The error is shown in the attached photo. Is this a laptop issue or a software issue? I am a beginner user of Materials Studio. Please suggest what I should do to solve this problem. My laptop has a Core i5 at 2.4 GHz.

Hi,
I am working on real-time video captioning using deep neural networks, and I want to buy a laptop (not a desktop) to run my work. I need to know the minimum hardware specifications required, because PCs with a GPU are very expensive, and the GPU is the key component.
For example, are the following specifications sufficient or do I need more?
core i7 / 8th gen.
RAM 16 GB
SSD hard 256 GB
GPU GTX-1060 6GB RAM (key point)
thanks
I would like to remotely trigger capture start and stop in Vicon using MATLAB on an external laptop, so that I can synchronise code I have that records radar data with motion capture data from Vicon. Many sources online, including Vicon, suggest that this can be done using UDP packets. These packets are sent by a script I create, over a network cable from the MATLAB PC to the Vicon system PC. Vicon support provided me this link (https://docs.vicon.com/pages/viewpage.action?pageId=133824766), which gives a C++ example of how to do this (it is from the Shogun software, but the details there are the most up to date and will work with Nexus). Any ideas on how to do this in a MATLAB script?
Any help is much appreciated.
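For reference, the UDP trigger itself is small enough to sketch. Here is a Python version of what the script needs to do; the XML field names and the default port 30 follow Vicon's published remote-trigger examples, but treat them as assumptions and verify them against the documentation page linked above for your Nexus version:

```python
import socket

def vicon_trigger(event, trial_name, target_ip, port=30, packet_id=0):
    """Send a CaptureStart/CaptureStop XML packet over UDP to Vicon.

    Assumptions to verify against the Vicon docs: the XML schema (Name and
    PacketID fields) and port 30 as Nexus's remote-trigger port. PacketID
    should change between messages so duplicates can be detected.
    """
    assert event in ("CaptureStart", "CaptureStop")
    xml = (
        '<?xml version="1.0" encoding="UTF-8" standalone="no" ?>'
        f'<{event}>'
        f'<Name VALUE="{trial_name}"/>'
        f'<PacketID VALUE="{packet_id}"/>'
        f'</{event}>'
    )
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)  # allow broadcast addresses
    sock.sendto(xml.encode("utf-8"), (target_ip, port))
    sock.close()
    return xml
```

In MATLAB the equivalent is to build the same XML string and send its bytes with the `udpport` object (R2020b or later) or, on older releases, the Instrument Control Toolbox `udp` object.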
Is there any option to get the results faster for 1 million generations?
It takes one month to finish the phylogenetic analysis.
Hi. Below are the spec for my current laptop:
Processor: Intel(R) Core(TM) i5-7200U CPU @ 2.50GHz 2.70 GHz
Installed RAM: 8.00 GB
System type: 64-bit operating system, x64-based processor
Should I increase the RAM to 16 GB? What specs would make the analysis run faster? Thank you in advance.
Hello everybody,
Is it possible to control (adjust) the charge delivered by a PC or laptop USB port,
for instance using MATLAB code?
I followed some answers from this forum and went to the Control Panel to change the language and format on my laptop. I have English (USA) and mm/dd/yyyy; in my shoreline layer the dates I have been using are in the format dd/mm/yyyy, but the problem persists. Thanks for the help.
I have 44 projects, and I cannot enter 24 of them. I have been trying for 2 months now. I thought it was a technical problem on the RG site, so I changed devices: laptop, computer, cell phone.
I wrote to the RG support service but received no response. I would be grateful for your thoughts on this matter.

The software will be used for UX and marketing studies in which participants attend remotely from their laptop or PC (with a camera).
Which company offers the most comprehensive software for eye tracking and facial expression analysis?
When I click to show interactions, an error pops up from the Python shell window describing which line in the .py script has the error. Both MGLTools and the Python shell are compatible versions, and I asked someone else to try it on their laptop, where it works. How do I fix this?

Hi, I am facing a problem opening a .DTA (Stata) file in RStudio. The data file I am trying to open is around 6 GB in size. I am using a laptop with 4 GB of RAM and a Windows Core i3 processor. Do you think this is not enough to load a 6 GB file in R?
I am seeing the following error: Error: cannot allocate vector of size 16.8 Mb
Any suggestion would be much appreciated.
Best regards,
Mahbub
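A 6 GB file cannot fit in 4 GB of RAM all at once, so the usual fix is to load only part of it. In R, `haven::read_dta()` supports `col_select` and `n_max` for exactly this. If switching tools is an option, the same chunked idea in Python/pandas looks like the sketch below; the file name and variable name are placeholders, and selecting only the columns you need is usually the biggest memory win:

```python
import pandas as pd

def chunked_sum(path, column, chunksize=100_000):
    """Sum one variable from a large .dta file without loading it all at once.

    Passing chunksize makes read_stata return an iterator of DataFrames,
    and columns=[...] keeps only the requested variable in memory.
    """
    total = 0.0
    for chunk in pd.read_stata(path, columns=[column], chunksize=chunksize):
        total += chunk[column].sum()
    return total

# Example (placeholder names): total_income = chunked_sum("survey.dta", "income")
```

The same loop pattern works for any per-chunk computation (filtering rows into a smaller file, group counts, etc.), as long as the result you accumulate stays small.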
I have already generated files with the CHARMM-GUI online tool. When I try to run step 4 (equilibration) using NAMD, the run stops with a fatal error message asking me to increase pairlistdist. Is there something wrong with the ligand or protein, or with the laptop I've been using?
Dear researchers,
I have a huge .dta file (9 GB) that I want to open on my laptop (8 GB RAM, 64-bit system) using R, because I'm not familiar with Stata. How can I open it to start the analysis? Is there any code to avoid the memory-limit error?
I’m looking forward to hearing from you.
Hi Friends,
I'm using AutoDockTools from MGLTools 1.5.7 on an Ubuntu 20.04 workstation.
But every time I try "File -> Read Molecule"
to read any .pdb file from the RCSB PDB or saved from PyMOL, or a converted .pdbqt file,
it reports errors like:
Python 2.7.3 (default, Mar 28 2014, 14:28:59)
[GCC 4.6.3] on linux2
Type "copyright", "credits" or "license()" for more information.
== No Subprocess ==
>>> {'gui': None, 'cmd': <Pmv.selectionCommands.MVSelectCommand instance at 0x47deb90>, 'name': 'select'}
{'gui': None, 'cmd': <Pmv.selectionCommands.MVDeSelectCommand instance at 0x47e9098>, 'name': 'deselect'}
{'gui': <ViewerFramework.VFCommand.CommandGUI instance at 0x47de5f0>, 'cmd': <Pmv.selectionCommands.MVClearSelection instance at 0x47e9290>, 'name': 'clearSelection'}
{'gui': None, 'cmd': <Pmv.selectionCommands.MVExpandSelection instance at 0x47e94d0>, 'name': 'expandSelection'}
{'gui': None, 'cmd': <Pmv.selectionCommands.MVSelectAround instance at 0x47e9680>, 'name': 'selectAround'}
{'gui': <ViewerFramework.VFCommand.CommandGUI instance at 0x47de680>, 'cmd': <Pmv.selectionCommands.MVSaveSetCommand instance at 0x47e9830>, 'name': 'saveSet'}
{'gui': None, 'cmd': <Pmv.selectionCommands.MVCreateSetIfNeeded instance at 0x47e9998>, 'name': 'createSetIfNeeded'}
{'gui': <ViewerFramework.VFCommand.CommandGUI instance at 0x47dea28>, 'cmd': <Pmv.selectionCommands.MVInvertSelection instance at 0x47e9b48>, 'name': 'invertSelection'}
{'gui': <ViewerFramework.VFCommand.CommandGUI instance at 0x47de7a0>, 'cmd': <Pmv.selectionCommands.MVSelectSetCommand instance at 0x47e9cf8>, 'name': 'selectSet'}
{'gui': <ViewerFramework.VFCommand.CommandGUI instance at 0x47de7e8>, 'cmd': <Pmv.selectionCommands.MVSelectFromStringCommand instance at 0x47e9ea8>, 'name': 'selectFromString'}
{'gui': <ViewerFramework.VFCommand.CommandGUI instance at 0x47de878>, 'cmd': <Pmv.selectionCommands.MVDirectSelectCommand instance at 0x47ea170>, 'name': 'directSelect'}
{'gui': <ViewerFramework.VFCommand.CommandGUI instance at 0x47de908>, 'cmd': <Pmv.selectionCommands.MVSelectSphericalRegion instance at 0x47ea368>, 'name': 'selectInSphere'}
{'gui': <ViewerFramework.VFCommand.CommandGUI instance at 0x47deb00>, 'cmd': <Pmv.selectionCommands.SelectNoWaterHeteroAtomsCommand instance at 0x47ea518>, 'name': 'selectHeteroAtoms'}
ERROR *****************
Traceback (most recent call last):
File "/home/gxf/mgltools_x86_64Linux2_1.5.7/MGLToolsPckgs/ViewerFramework/VF.py", line 941, in tryto result = command( *args, **kw )
File "/home/gxf/mgltools_x86_64Linux2_1.5.7/MGLToolsPckgs/Pmv/fileCommands.py", line 810, in doit newparser = PdbParser(filename,modelsAs=modelsAs)
File "/home/gxf/mgltools_x86_64Linux2_1.5.7/MGLToolsPckgs/MolKit/pdbParser.py", line 105, in init MoleculeParser.init(self, filename, allLines)
File "/home/gxf/mgltools_x86_64Linux2_1.5.7/MGLToolsPckgs/MolKit/moleculeParser.py", line 36, in init self.filename = str(filename)
UnicodeEncodeError: 'ascii' codec can't encode characters in position 10-11: ordinal not in range(128)
Exception in Tkinter callback
Traceback (most recent call last):
File "/home/gxf/mgltools_x86_64Linux2_1.5.7/lib/python2.7/lib-tk/Tkinter.py", line 1410, in call return self.func(*args)
File "/home/gxf/mgltools_x86_64Linux2_1.5.7/MGLToolsPckgs/Pmv/fileCommands.py", line 614, in guiCallback mols.data.extend(mol.data)
AttributeError: 'str' object has no attribute 'data'
I guess this is probably an encoding issue, but I'm not sure how to solve it.
I think the files themselves are fine, because they can be read properly on my laptop, and there is no problem opening them in VS Code with UTF-8 encoding.
Also, AutoDockTools can read sessions (built on my laptop) without errors.
Has anybody met this situation before?
Help me if you can
Thanks a lot!!!
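The traceback points at the path rather than the file contents: MGLTools runs on Python 2, and `str(filename)` fails as soon as the path contains any non-ASCII character (here at positions 10-11, likely in a directory name). Assuming that is the cause, a quick check, plus the usual workaround of moving the file to an all-ASCII path, can be sketched as:

```python
def non_ascii_positions(path):
    """Return (index, character) for every non-ASCII character in a path.

    MGLTools effectively calls str(filename) under Python 2, which raises
    UnicodeEncodeError if the path contains any character outside ASCII.
    Copying the file to an all-ASCII path (e.g. /home/user/dock/protein.pdb)
    avoids the error without touching the file contents.
    """
    return [(i, ch) for i, ch in enumerate(path) if ord(ch) > 127]

# Example: an accented or CJK directory name triggers the error
print(non_ascii_positions(u"/home/gxf/分子/protein.pdb"))
```

If the offending characters sit in the home-directory path itself, a symlink from an ASCII-only location to the working directory achieves the same thing without moving files.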
The Windows platform is not good for molecular dynamics simulation of a protein-ligand complex. I need a machine with a powerful GPU that can run simulations up to 20-30 ns.
We tried to install the XRF software on an extra laptop but encountered some strange software errors. When we try to connect the instrument to the laptop and push the green light, an error message appears saying that no valid response was received from the instrument. The software was installed from the original CD of the instrument.
I would appreciate it if anyone can help me to solve the problem.


Hi. I'm working with 1,000 images of 256x256 pixels. For segmentation I'm using SegNet, U-Net, and DeepLabv3 layers. Training my algorithms takes nearly 10 hours. I'm using a laptop with 8 GB RAM and a 256 GB SSD, and MATLAB for coding. Is there any way to speed up training without a GPU?
Kindly share your suggestions please.
Hello to all.
I am in process of completing a “pilot study” to evaluate the utility of a novel “instrument cluster” for use in motor vehicles and other dynamic environments. The study design has two independent variables, including the novel instrument cluster and a “traditional” instrument cluster. The single dependent variable is participant response time.
A laptop with a 14” diagonal measure display is used to present stimuli to participants (see below). Participants provide responses to stimuli using the keyboard on the laptop. Software running on the laptop presents stimuli to participants and captures the speed and accuracy of their responses.
Instructions to participants include the request that they respond to stimuli (see below) as quickly as they can. They are also asked to respond accurately. This request is repeated twice during the instructions given to participants. (The rules for responding are also provided, and the participants are shown examples of the two different instrument clusters used in the task.)
The study simulates driving a vehicle. Participants see a road sign, followed two seconds later by an image of one of the instrument clusters. Participants are asked to respond “yes” if the combination of the road sign and the instrument cluster, or the instrument cluster alone, indicates they must “do something” as the driver of the vehicle. They are asked to respond “no” if the road sign and instrument cluster do not require action on their part.
A blank image appears on the laptop’s display for two seconds after the participant enters a response. Then the next road sign/instrument cluster pair is presented. This second pair uses the same instrument cluster as before, but with different “readings.” The second pair also includes a different road sign. This procedure for presenting stimuli and capturing responses continues until the participant has seen and responded to 16 different road sign/instrument cluster pairs. The participant is then allowed to take a short silent break. The process is then repeated, using the other instrument cluster in each pair.
Participants are assigned a “participant number” before they begin the study process. Participants who have an odd participant number see the novel instrument cluster in their first set of sixteen response pairs. Participants with an even participant number see the traditional instrument cluster in their first set of response pairs.
Each road sign is paired with instrument clusters that display the same readings. For example, when a “70 MPH” speed limit sign appears in either set, the novel instrument cluster and the traditional instrument cluster will show the same readings.
The correct response to eight of the item pairs is “yes.” Eight of the item pairs require a “no” response.
Item pairs appear in different orders for the two instrument clusters. These orders were developed using random number generation software.
It has been a number of years since I last needed to calculate statistics to evaluate the results of a study like this one. I believe this might be characterized as a “balanced” repeated measures design with, as above, two independent variables and a single dependent variable. This causes me to want to analyze the data using an appropriate ANOVA method, but I am not confident that that is the better method. I would appreciate input from the members of this learned community.
And then there is the other dependent variable I have not discussed – response accuracy. Obviously, there is an element of “shared” variance attached to this measure. That causes me to wonder if MANOVA is the better analysis to run.
Thank you for taking the time to read this. I will appreciate feedback that is offered.
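On the response-time question: since the instrument-cluster factor is a single within-subject variable with two levels, a one-way repeated-measures ANOVA on it reduces to a paired t-test (F = t²), which makes for an easy sanity check before committing to a fuller model (e.g. one that also includes presentation order as a between-subjects factor, or a MANOVA with accuracy). A sketch with hypothetical per-participant mean response times:

```python
import numpy as np
from scipy import stats

# Hypothetical per-participant mean response times in seconds: one value per
# participant for each instrument cluster (the same six participants in both
# rows, so the comparison is paired/within-subject).
rt_novel       = np.array([0.61, 0.55, 0.70, 0.64, 0.58, 0.66])
rt_traditional = np.array([0.67, 0.59, 0.74, 0.69, 0.60, 0.71])

# Paired t-test; with one within-subject factor at two levels this is
# equivalent to the one-way repeated-measures ANOVA (F = t**2).
t_stat, p_value = stats.ttest_rel(rt_novel, rt_traditional)
print(t_stat, p_value)
```

For the accuracy measure, trial-level accuracy is binary, so alongside (or instead of) MANOVA it is worth considering a mixed-effects logistic model, which handles the shared variance with response time more gracefully than averaging.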
Among the below options, which one is better for deep learning research?
- ASUS TUF Gaming F15 FX506HE Core i5 11th Gen RTX 3050 Ti 4GB Graphics
- Asus TUF Dash F15 FX516PE Core i5 11th Gen RTX 3050Ti 4GB Graphics
- Asus TUF Dash F15 FX516PE Core i7 11th Gen RTX 3050Ti 4GB Graphics
- Asus TUF Dash F15 FX516PM Core i5 11th Gen RTX3060 6GB Graphics
- MSI GP65 Leopard 10SEK Core i7 10th Gen RTX 2060 6GB Graphics
- MSI GF65 THIN 10UE Core i5 10th Gen RTX 3060 MAX-Q 6GB Graphics
I am an engineering management student at the American University of Beirut. For my thesis, I am researching vigilance decrements. The metrics to be collected include the following performance measures and eye-tracking metrics.
Today, during this pandemic, I want to run the experiments online using the laptop's webcam. I am seeking user-friendly software in which participants can watch the video and react (either by a mouse click or a key press).
I want one laptop to run a virtual network constructed by NS-3, and the other two laptops to be connected to two nodes of that virtual network so that they can communicate with each other. What should I do?
I am working with large quantitative models involving tabular data but also text data. I use a platform to manage my code and to access more compute capacity than my laptop can provide. Who is doing it similarly, or do you not see the need for this?
Hello everyone,
I am using NAMD for MD simulations, but running a simulation takes too much time.
I have a Windows 10 laptop with an AMD processor, 8 GB RAM, 1 TB storage, and a 4 GB NVIDIA GPU.
How much time would it take to complete a 100 ns simulation (50,000,000 steps) with a 2 fs time step?
Please do guide for the same.
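Wall time cannot be predicted from the hardware list alone, but the arithmetic is simple once you measure your own throughput: run a short simulation, read the ns/day figure from the benchmark/timing lines NAMD prints early in the log, then scale. A sketch with an assumed throughput (the 2 ns/day value is a placeholder, not a measurement):

```python
# Back-of-envelope: step count and wall time for an MD run.
total_ns = 100          # target simulation length
timestep_fs = 2         # integration time step
ns_per_day = 2.0        # ASSUMED throughput; replace with the value NAMD
                        # reports for your system in a short test run

steps = int(total_ns * 1e6 / timestep_fs)   # 1 ns = 1e6 fs
days = total_ns / ns_per_day

print(steps, days)  # 50000000 steps; 50 days at the assumed 2 ns/day
```

This also shows why throughput, not step count, is the number to chase: doubling ns/day (GPU-resident builds, smaller cutoffs where defensible, or running on a cluster) halves the wall time regardless of the step total.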
How do we install Caver 1.0 as a standalone application on our laptop? I tried the usual download, but the system keeps telling me the executable .jar file is corrupted or invalid after I extract it from the .zip file. What should I do about this?

I am trying to simulate the Charpy test using Abaqus. Could someone specify the boundary conditions that should be used and which material model should be followed?
Should I use the Johnson-Cook model?
How much time can the simulation take on a 4-core laptop?
I tried coarsening the mesh to reduce the simulation time; what effect will this have on the results?
Hi,
I have been quite a heavy user of the Hayes macro for some years. I recently installed SPSS 27 on my laptop, and the Hayes macro does not seem to work with this new version. When I go to the 'Extensions' tab and 'Install custom dialogs', I receive a message saying 'PROCESS_version_3.0_by_Andrew_F._Hayes cannot be installed because it does not have a valid syntax template.'
Has anyone faced this issue? Any way to deal with it?
Thanks for your help ;-)
Renaud
I am running Molecular Dynamic simulations in my laptop, for 50000000 steps using GROMACS software but it has terminated (after 36000000 steps) before completion. I have used various command lines to continue the run from the terminated point. But ends up in an error stating
Fatal error:
Type mismatch for state entry energy_sum, code precision is double, file
precision is int: incompatible checkpoint formats or corrupted checkpoint
file.
What can I do to continue the simulation?
Folks, can anyone please suggest if there is a technical possibility to connect a FireWire CCD camera (Zeiss HRM) to a laptop using a USB port? Normally, we use a desktop PC with a FireWire slot, however for some applications we would prefer connecting this camera to a laptop which does not have a FireWire slot. Are there any adaptors which can connect FireWire to USB? Many thanks in advance.
Are there any research studies or data on our daily usage of digital devices (which increased significantly during work from home), and on the point beyond which it may be harmful to our mental and physical health?
What is the ideal duration for which one should use digital devices in a day?
Kindly share your opinion.
I am looking to establish theoretical frameworks for such research on one-to-one computing projects.
We need to download or extract health data such as heart rate, blood pressure, calories burned, etc. from a smart health band. However, most bands are tied to their maker's app and do not provide access to one's own data. For example, we bought the "GoQii Run GPS Fitness Tracker with Heart Rate Monitor", but it does not permit downloading the data to a laptop; it only displays the health parameter values on the app's screen.
I need a device that tracks steps, distance, pace, calories burned, heart rate, and duration, and, most importantly, whose data can be downloaded to my laptop as a .csv/Excel worksheet.
Can you please help me resolve this issue?
I am working on education in Sub-Saharan Africa. I recently found some literature and primary data on the use of mobile phones in teacher education and enhanced student learning. On the one hand, mobile phones are cheaper, and access to the net is more widely and consistently available than with older computer technology (including the tablets disseminated via OLPC). Does anyone know of any research which provides evidence that mobile-phone-based learning is valuable, or a distraction from learning, in Africa? I have seen Savoirs communs n°17, Digital Services for Education in Africa. Does anyone have experience of teaching in Africa and might consider helping me understand things better?
Dear All,
I'm getting this error from time to time and don't know what is causing it:
When trying to run/debug an Android app on a real device (Galaxy Samsung S in my case) I'm getting the following error in the Console:
Failed to install *.apk on device *:
timeout Launch canceled!
This is all the Console is telling me. LogCat doesn't provide any information. Eclipse Problems view is not showing any issues.
I tried the following steps with no success:
1. Cleaning the project (Project->Clean)
2. Restarting device, Eclipse, laptop, all of the above...
3. Moving the project to a location without spaces, according to Failed to install apk on device 'emulator-5554': timeout
The app has been debugged in the past on that device many times (app is live on Market), but this problem happens every so often, and is VERY FRUSTRATING...
Any help would be greatly appreciated!
Thanks.
I am looking for a reasonable GPU for a mobile workstation (laptop) that can run most state-of-the-art neural networks for images (computer vision). Specifically, I would appreciate any information on the effect of VRAM on deep learning, and on the recently introduced Nvidia RTX 2080 Ti (11 GB) and RTX 2080 Max-Q (8 GB).
Hey guys,
How are you all?
I hope you're doing well.
Lately, I have been facing a problem on my HP laptop, which runs Windows 10.
First of all, I have a 320 GB hard disk divided into 3 partitions.
The C partition is running fine, but D and E are not.
Both D and E have turned RAW, and when I try to open them, the computer asks me to format them.
When I click format, a blue screen appears and the laptop restarts.
Unfortunately, I am now using only half of the storage in my laptop
and have no access to the rest of it.
I hope you guys can help me.
Thanks.
Dear colleagues,
Do you think that modern information and communication technologies affect our mental and physical health?
We need to convert the tilting movement of one finger into a scrolling function, so that the touchpad acts like a joystick. Is this possible? How can I implement it?
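For reference, joystick-like behaviour is usually implemented as rate control: the tilt sets a scroll *velocity* rather than a displacement. A minimal sketch of that mapping, assuming the touchpad firmware can report a signed tilt angle (the gain and deadzone values are made-up defaults):

```python
# Sketch of joystick-style ("rate control") scrolling: the finger's tilt
# angle sets a scroll velocity instead of a displacement. The tilt
# reading and the gain/deadzone values are assumptions; a real driver
# would read tilt from the touchpad hardware (e.g. via evdev on Linux).
DEADZONE_DEG = 3.0   # ignore tiny tilts so the view doesn't drift
GAIN = 2.5           # scroll lines per second per degree of tilt

def scroll_velocity(tilt_deg):
    """Map a tilt angle (degrees, signed) to a scroll speed (lines/s)."""
    if abs(tilt_deg) < DEADZONE_DEG:
        return 0.0
    # subtract the deadzone so the speed ramps up smoothly from zero
    magnitude = abs(tilt_deg) - DEADZONE_DEG
    return GAIN * magnitude * (1 if tilt_deg > 0 else -1)
```

The main loop would then poll the tilt at a fixed rate and emit scroll events at the velocity this function returns; the deadzone is what keeps the cursor from drifting when the finger rests flat.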
Where can a young researcher apply for a free laptop computer?
I'm a researcher who will be using UCSF Chimera, AutoDock, and AutoDock Vina. I've been using a Lenovo E480 ThinkPad with an Intel i5 8th Gen processor (i5-8250U) and 8 GB of RAM, and I'm planning to install the molecular docking applications there. The clock speed of this laptop is in the 1.60-1.80 GHz range. Thanks, everyone!
In this COVID situation, we start our classes in the morning on a laptop; our reading materials are provided in electronic media; our books are all available as e-books; our assignments are provided and submitted through e-media; our discussions with friends, and all programs and training, are conducted online; published journals are all studied online. So what is the way to deal with this situation?
By the way, what is parallel processing?
There are many obstacles (challenges) that limit the implementation of online (or blended) learning, especially in countries that lack the infrastructure for this type of learning. Some of these challenges are family culture, weak Wi-Fi, platform problems, no money to buy a laptop, etc. Dear RG group, I hope to gather more challenges, and methods to reduce them.
Dear Community,
May I ask you to participate in a survey to understand the behavioral peculiarities of smartphone usage?
The survey will take about 20 minutes. You will have the best experience with a computer/laptop.
Many thanks in advance!
I am investigating heat pipes used in laptops and am having some trouble determining the materials used. From initial research I have determined that copper-zinc and copper-nickel alloys, as well as some aluminium alloys, are used; however, I cannot determine the specific alloys. Can anyone provide some insight?
I challenge my students to put away their laptops in the classroom and use pencil/pen and paper instead. The laptop is often more of a distraction than a learning tool. My experience is that this strategy improves attention, the learning process, and collaboration between the students. What are your views on studies claiming that writing by hand seems to engage the brain in a better way?
Eva Ose Askvik, Ruud van der Weel and Audrey van der Meer: The Importance of Cursive Handwriting Over Typewriting for Learning in the Classroom: A High-Density EEG Study of 12-Year-Old Children and Young Adults. Frontiers in Psychology, 2020. doi.org/10.3389/fpsyg.2020.01810
Audrey van der Meer and Ruud van der Weel: Only Three Fingers Write, but the Whole Brain Works: A High-Density EEG Study Showing Advantages of Drawing Over Typing for Learning. Frontiers in Psychology, 2017. doi.org/10.3389/fpsyg.2017.00706
https://forskning.no/barn-og-ungdom-hjernen-ntnu/derfor-blir-barn-smartere-av-a-skrive-for-hand/1742792 Students writing by hand seem to get better results and get smarter - what are your views on this?
None of my input files with a Bravais lattice that contains an angle will run in my Quantum ESPRESSO. I even tried running them on my friend's laptop, but the crash file shows the same problem.
I hope you can help me rectify this issue. I will attach the input file and the output as well.
Thank you!
Sofwan Ibrahim
What will be the impact of the use of tablets, laptops or phones for online lessons among children?
Currently, lessons are 50% or 100% online for children. On the other hand, psychologists speak of the disadvantages of screens for young children, which affect the nervous system, affectivity, etc. Can we say that we are in an era of contradictions?
Laptops: yes, but only for the courses.
My current laptop is a Lenovo Flex 5 with an AMD Ryzen 4300G and AMD Radeon graphics. I have researched this, and most people recommend Nvidia graphics cards, but my budget doesn't allow me to purchase a high-powered desktop. I just want the simulation to run (even if it's slow: probably 20 ns/day).
Thanks for the help!!
From my experience and knowledge in this field, data and systems engineering is fundamental and will be very important in the coming years, as all our behaviour and personal devices become IT systems, revolve around data (smartphones, laptops, domotics), and depend on algorithms as well.
I want to obtain the running times of some cryptographic operations, such as bilinear pairings, point multiplication, elliptic curve scalar multiplication, hash functions, etc., using the MIRACL cryptographic library on both a laptop and a mobile device.
I need help on how to do this. Any useful assistance will be highly appreciated.
Thanks
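Whichever library ends up being timed, the benchmarking loop itself looks the same. Here is a minimal sketch of that methodology in Python, using a standard-library hash as a stand-in operation (MIRACL itself is a C/C++ library, so the same warm-up/repeat structure would be written around its calls there):

```python
# Generic micro-benchmark harness for timing a cryptographic primitive.
# The hash below is only a stand-in operation to make the sketch
# self-contained; substitute the call you actually want to measure.
import hashlib
import time

def time_operation(op, iterations=1000, repeats=5):
    """Return the best average time per call, in microseconds.

    Taking the best of several repeats reduces noise from the OS
    scheduler; the first repeat also serves as a warm-up.
    """
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        for _ in range(iterations):
            op()
        elapsed = time.perf_counter() - start
        best = min(best, elapsed / iterations)
    return best * 1e6

msg = b"benchmark message"
us = time_operation(lambda: hashlib.sha256(msg).digest())
print(f"SHA-256: {us:.2f} us per call")
```

On a mobile device the same structure applies, though a native timer (e.g. `clock_gettime` in C) would replace `perf_counter`.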
Dear Colleague and experts.
I am looking for a new laptop with a GPU card for deep learning work.
Are there any criteria to help me choose the proper GPU card based on my dataset size and image resolution? Is there any rough estimate of the expected training time based on card parameters and dataset size?
Thanks for any help
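As a rough heuristic for the question above, training time scales with (images × epochs) / throughput, and throughput is best measured by running a few batches on the actual card rather than read off spec sheets. A back-of-the-envelope sketch (all the numbers in the example are illustrative assumptions):

```python
# Back-of-the-envelope training-time estimate. The dataset size, epoch
# count, and throughput below are illustrative assumptions -- the only
# reliable throughput figure comes from benchmarking the actual card.
SECONDS_PER_DAY = 86_400

def training_days(num_images, epochs, images_per_second):
    """Wall-clock days to train, given a measured throughput."""
    seconds = num_images * epochs / images_per_second
    return seconds / SECONDS_PER_DAY

# Hypothetical example: 100k images, 90 epochs, 250 img/s measured
print(f"{training_days(100_000, 90, 250):.1f} days")
```

VRAM mostly limits the batch size and input resolution you can fit, while throughput (img/s) sets the wall-clock time, which is why benchmarking a few batches tells you more than the card's headline specs.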

I began my research pre-COVID-19. I was going to collect my data via focus groups, but lockdown restrictions were enforced and that could not happen. Online was not an option due to connectivity issues for some participants, so I had to revert to phone interviews. I did not have a dictaphone, so I resorted to recording with my laptop and put participants on speakerphone.
I encountered one difficulty where the laptop only recorded for 15 minutes. I can't seem to find any other studies that used this method and wondered if anyone else has used it, or found studies that have. It's pretty old school, but the best option considering the circumstances I was in :)
So, I have observed that even when I use the same DPF and GPF files for my ligands and protein respectively and run docking using ADT on two different laptops, the results generated vary slightly in terms of binding energy and interaction patterns. The differences are not significant, but I wanted to understand what might cause this.
Also, what is the most logical explanation for a particular amino acid residue being involved in various kinds of interactions?
Hello,
I am trying, without success, to install Discovery Studio Visualizer on some laptops with Ubuntu 18.04 LTS.
Is there a workaround for installing Discovery Studio Visualizer (2016 and up) on Ubuntu 18.04 LTS?
I've tested here, and it works great with Ubuntu 14, Ubuntu 16, and Debian 9...
If it's not possible to install it on Ubuntu 18, is there another software package for database management? (Create and delete attributes in SDF files, sort by a specific column, etc.)
Best,
Ricardo
Hi, currently I'm searching for an affordable laptop with minimum specs for bioinformatics.
My work will be NGS analysis (ChIP-seq & RNA-seq)
I believe that a MacBook is the best choice, but I have some budget limitations, so I am also open to a used laptop. My lab has a Mac desktop for bioinformatics, but I will return to my home country soon and have to continue the analysis at home.
For now, I am considering this laptop:
Lenovo ThinkPad W541 Workstation i7 4710MQ, 4-core processor, RAM 16GB, SSD 512 GB, Windows 10 Pro (used).
I think it is good enough for doing NGS analysis.
Hopefully, I can get some comments on this.
Thanks.
I am planning to develop a test rig to control valve opening and closing. The input signals for the control system are an SSI input from an absolute encoder and a voltage (up to 5 V) from a pressure sensor (or a 4-20 mA current). The output from the control system is an ON/OFF voltage signal (24 VDC) to open and close the valve. Can anyone advise me on suitable instruments and on how to link all those devices (sensors and valves) to the controller and a laptop/PC?
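As a sketch of the decision logic described above (not of any particular instrument): the controller reads the 0-5 V pressure signal and emits the ON/OFF valve command, with a small hysteresis band so the valve does not chatter near the threshold. All threshold values here are assumed placeholders; on real hardware this loop would run on a PLC or a DAQ-equipped PC:

```python
# Minimal sketch of the rig's control decision: map a 0-5 V pressure
# reading to an ON/OFF (24 VDC) valve command. The thresholds and the
# hysteresis band are assumed placeholder values.
OPEN_THRESHOLD_V = 3.5    # open the valve above this pressure signal
CLOSE_THRESHOLD_V = 3.0   # close it again below this (hysteresis)

def valve_command(pressure_v, currently_open):
    """Return True (24 V ON, valve open) or False (OFF, closed)."""
    if pressure_v >= OPEN_THRESHOLD_V:
        return True
    if pressure_v <= CLOSE_THRESHOLD_V:
        return False
    # inside the hysteresis band: hold the previous state
    return currently_open
```

The hysteresis gap between the two thresholds is the standard trick for keeping an ON/OFF output from toggling rapidly when the sensor reading hovers near a single threshold.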
Hi!
I have a script that works fine and produces the expected results. But when I add the two lines:
import os
os.chdir('D:\\...')
at the beginning of the script and run it, I receive the following error at the command line:
Global seeds have been assigned.
984 elements have been generated on instance: beam-1
Job job-1: Analysis Input File Processor completed successfully.
Error in job job-1: SIM database file is not a valid SIM file: C:\Users\AVAJANG\AppData\Local\Temp\MahdiAahi_job-1_14852\ScratchGenericSystem000001BE286972201.sim
Job job-1: Abaqus/Standard aborted due to errors.
I used to run the same script on another laptop with Windows 7, Python 2.7, and Abaqus 6.14.2 installed, and no errors would pop up. But now, on the current system with Windows 10 and Abaqus 2018, I have to deal with the mentioned error!
Is there any way to fix this?
It would really be helpful if someone told me what to do!
Thanks.
Let's face the fact that, up to the present time, the majority of students in public schools (state universities/colleges) cannot provide themselves with the laptops, tablets, or high-tech phones that have proved very useful in their studies. Others have cellphones, but not sophisticated ones; these serve only for immediate communication (most likely with family) as needed, and cannot be used for anything else. Others can only complete activities requiring technology with the help of their classmates' gadgets. How do these poor students comply with online activities when they have no means to use these technologies, when they are far from their better-equipped classmates, or when there are no internet shops near their homes?
Hello All,
I'm new to the bioinformatics field and I really need your help. I will be doing some sequence analysis, and my own laptop's specs would not suffice, so I will be using Google Colab.
My question is:
Is there a way to directly download my FASTQ dataset from NCBI to my Google Drive?
If yes, please advise how I can do that; if not, what is the best way to use Google Colab with FASTQ datasets, in your experience?
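One common workaround, if it fits this case, is to fetch the runs from the ENA mirror, which serves SRA runs as gzipped FASTQ over HTTPS; in Colab you can then download straight into a mounted Drive folder. A sketch that builds the URL from a run accession, following the ENA directory convention as I understand it (check the ENA documentation for paired-end `_1`/`_2` suffixes and edge cases):

```python
# Build an ENA download URL for an SRA run accession. ENA mirrors NCBI
# SRA runs as gzipped FASTQ; the directory layout assumed here is:
# fastq/<first 6 chars>[/<remaining digits zero-padded to 3>]/<accession>/
def ena_fastq_url(run_accession):
    """URL of the gzipped FASTQ for a run accession (e.g. SRR1234567)."""
    base = "https://ftp.sra.ebi.ac.uk/vol1/fastq"
    prefix = run_accession[:6]
    tail = run_accession[9:]                     # digits past position 9
    subdir = "/" + tail.zfill(3) if tail else ""  # e.g. "7" -> "/007"
    return f"{base}/{prefix}{subdir}/{run_accession}/{run_accession}.fastq.gz"

print(ena_fastq_url("SRR1234567"))
```

In a Colab cell you could then mount Drive and fetch the file with `!wget -P /content/drive/MyDrive <url>`, which keeps the data out of the ephemeral Colab VM.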
Thank you very much in advance,
I am working with undergraduate students in an online mode (due to COVID-19) instead of in person as in past semesters. A few are having trouble running their samples, which they obtained from the literature. So far, when I have shared input files, they cannot run them on their laptops. I think it may be because the files reference databases that are resident on my laptop in my download. Any suggestions?
Hello everyone,
How would I answer this question? Is it asking for a specific program or device?
"I will be storing the data on my password-locked laptop"?
Hi all, I have developed an image map with specific coordinates in HTML which works properly on a laptop, but it is not mobile-compatible. How can I make it fit a mobile-responsive template?
Thanks
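For context, an image map's `coords` are fixed pixel values, so they stop lining up as soon as CSS scales the image down on a phone. The usual fix is to rescale every coordinate by (displayed width / natural width) whenever the image resizes; that is normally done in JavaScript (libraries such as image-map-resizer automate it), but the arithmetic itself is sketched here:

```python
# An image map's coords are fixed pixels, so they break when CSS scales
# the image down on a phone. The standard fix is to rescale every
# coordinate by (displayed size / natural size) on each resize -- usually
# done in JavaScript; the arithmetic itself is sketched here.
def rescale_coords(coords, natural_w, displayed_w):
    """Scale an <area> coords list for the currently displayed width."""
    factor = displayed_w / natural_w
    return [round(c * factor) for c in coords]

# 800px-wide image shown at 400px: every coordinate halves
print(rescale_coords([100, 50, 300, 200], 800, 400))  # [50, 25, 150, 100]
```

The same factor applies to x and y as long as the image keeps its aspect ratio, which responsive `max-width: 100%; height: auto` styling does.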
Hi everyone, may I know what interfaces are suitable for connecting a dSPACE CLP1104 to a laptop? Can anyone suggest a compatible model and brand of interface, where I can get it, and how much it costs? Also, can anyone show me how to link the controller to the laptop? Thank you.
I'm trying to come up with a method for presenting white-noise signals to a participant during TMS-EEG without using a laptop. If you have experience with this, what devices do you use? Or should I just expect to run E-Prime/MATLAB?
Thanks.
I am attempting to run my SNP dataset in STRUCTURE with 10K burn-in and 10K MCMC reps. I have 7,864 loci and 84 individuals, so I ran K=1 to K=20 with 5 iterations each. Now I have been waiting days on my laptop for STRUCTURE to finish! I saw in another post that a faster version of STRUCTURE is in the works; do you have any recommendations on how to speed up the process?
I am planning to develop a test rig to control valve opening and closing. The input signals for the control system are an SSI input from an absolute encoder and a voltage (up to 5 V) from a pressure sensor (or a 4-20 mA current). The output from the control system is an ON/OFF voltage signal (24 VDC) to open and close the valve. Can anyone advise me on suitable instruments and on how to link all those devices (sensors and valves) to the controller and a laptop/PC?
I forgot to mention about the system requirement:
1) The control system needs to read the absolute encoder running at a pretty high speed, a maximum of 2000 rpm.
2) It also needs to capture/read data approximately every 0.1 ms.
Dear Colleague
Could you share your experiences of using mobile devices (smartphones, laptops, tablet computers, iPads, etc.) in your professional development activities?
I want to gain insights from your personal experiences.
Regards
Krishna Parajuli
Hello ,
I am running a 2D simulation using Fluent.
At first I ran the simulation on my i7 laptop (4 cores, 8 threads). Then I got a Z820 workstation with 2 processors (24 cores, 48 threads), ran the exact same simulation on it, and was shocked that it is only 2 times faster than my laptop.
The settings I chose for the laptop and workstation are:
1/ laptop - parallel - double precision - 6 processors
2/ workstation - parallel - double precision - 36 processors
How in the world is a workstation only 2 times faster than my laptop?!
Do you have any idea what the problem might be?
PS:
Both the laptop and the workstation have the same ANSYS version, 2019 R3. Both simulation setups are exactly the same except for the number of processors used for parallel solving.
Many Thanks
I am looking for an application and methods of collecting data on smartphone usage for psychological research.
I am interested in how long students use their smartphones and laptops and what they do on them. I have looked at the iOS Screen Time function and the Android "Digital Wellbeing" function, but they do not allow for easy export of the data. Additionally, I am only interested in their usage during school time, which poses a problem since Screen Time only reports weekly or daily averages.
If you have any pointers or experiences I would be very grateful for a reply :)