Questions related to Real-Time Systems
Does a PhD degree in Management really help an employee move towards a CEO position in a corporation? What is your real-life experience and opinion?
Hi, I am currently working on a project that requires me to implement the ICA algorithm in real time; specifically, I am trying to separate noise from audio. When I run the algorithm offline, it works fine, although the amplitude of the separated audio is a bit low. However, when I implement it in real time, the separated audio becomes very soft. Is there any source code I could refer to in order to solve this kind of problem? Thanks.
I would be grateful if anybody could point me to published research discussing the role of public administrations in freight transport planning/policy and what data they collect and use to base their policy upon, especially if such data is from real-time systems (perhaps once summarised). Please note that my interest is for intercity freight transport particularly.
Very many thanks!
My colleagues and I intend to implement a real-time controller, so that a simple real first-order system is controlled by a controller designed in the MATLAB/Simulink environment (rapid control prototyping). We would be grateful if you could help with the following:
Is this possible with the help of dSPACE devices? And in that case, with what tools? For example, is the DS1202 MicroLab suitable by itself?
In general, what configuration is recommended for such a system in real-time modeling (in the academic field)?
We would also be grateful if you could suggest an example (project, article, book, schematics, etc.) in this regard. Unfortunately, despite much searching, most of what I found was just PWM generation for power electronic converters.
We are also interested in any advice and experience you have had in this field.
Thanks & Regards.
The app doesn't start correctly (0xc000007b); the dialog says to click Accept to close it.
The app was made in VC10 (Visual C++ 2010).
I am working on an edge framework which aims to execute tasks/processes in real time and to offload tasks to neighboring edge servers in case the current server does not have enough resources or is likely to miss the deadline.
There are several architectures available, e.g. from the OpenFog Consortium or EdgeX Foundry. Additionally, simulators such as iFogSim and EdgeCloudSim (which are based on CloudSim), EmuFog, or YAFS exist. However, as far as I understand, they rely on best effort rather than real-time reaction.
Are there any existing solutions in the same direction, i.e. real-time execution in a distributed network?
I have read some papers about Model Predictive Control. As I understand it, MPC mainly updates the optimal solution based on the updated initial condition, i.e. repeated optimal control. Since real-time optimal control also performs repeated optimal control, what is the difference between MPC and real-time optimal control?
Please help me if you know something. Thanks.
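To make the "repeated optimal control" structure concrete, here is a minimal receding-horizon sketch: at every step the finite-horizon problem is re-solved from the measured state and only the first input is applied. The scalar plant, horizon, and brute-force input grid are all illustrative assumptions; a real MPC would solve a constrained QP with a proper solver.

```python
# Minimal receding-horizon (MPC) sketch on a scalar toy plant x_{k+1} = a*x + b*u.
# At every step we re-solve a finite-horizon problem from the *measured* state
# and apply only the first input; this repetition is the core of MPC.
import itertools

A, B = 1.0, 1.0          # scalar plant parameters (assumed)
HORIZON = 3
U_GRID = [-1.0, -0.5, 0.0, 0.5, 1.0]  # coarse input candidates (assumed)

def horizon_cost(x0, u_seq):
    """Quadratic cost sum(x^2 + u^2) over the prediction horizon."""
    x, cost = x0, 0.0
    for u in u_seq:
        cost += x * x + u * u
        x = A * x + B * u
    return cost + x * x  # terminal cost

def mpc_step(x):
    """Solve the finite-horizon problem by enumeration; return the first input."""
    best = min(itertools.product(U_GRID, repeat=HORIZON),
               key=lambda seq: horizon_cost(x, seq))
    return best[0]

# closed loop: the state is re-measured each step and the optimisation repeated
x = 2.0
for _ in range(10):
    u = mpc_step(x)
    x = A * x + B * u
# the state is regulated toward 0
```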
Looking for a case study where severe influx and loss events were experienced even with an MPD system installed and active. I am interested in the sequence of events and the mitigation steps.
A real well example, please, not theoretical discussions. A big shout-out to the petroleum engineers, drilling engineers, MPD experts, and wellsite folks.
Thanks in advance.
I'm trying to implement an EKF in my real-time hand motion capture algorithm. I have surveyed some papers and source code, but I can't understand how to define a correct state vector x. Some define [velocity(X); position(X); velocity(Y); position(Y)], others [acceleration(X); acceleration(Y)], as the state vector. However, I can't confirm whether these are suitable for my system.
Could anyone tell me how to choose a correct state vector x?
Input data: 3D acceleration (x,y,z).
x_k = A x_{k-1} + w_{k-1}
z_k = H x_k + v_k
where w_{k-1} and v_k are Gaussian white noise.
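For what it's worth, a common per-axis choice with accelerometer input is the state x = [position; velocity], with the measured acceleration folded in as a control input u rather than put inside the state. The sketch below is a plain linear Kalman filter under that assumption (an EKF is only needed if your model is nonlinear); the sample period, the noise levels, and the pretence that position is measured (e.g. by a camera) are all assumptions for illustration.

```python
# One-axis sketch of the model x_k = A x_{k-1} + w_{k-1}, z_k = H x_k + v_k,
# with state x = [position, velocity] and acceleration as control input u.
DT = 0.01  # assumed 100 Hz sample rate

def mul(M, N):  # tiny dense matrix product for lists of lists
    return [[sum(M[i][k] * N[k][j] for k in range(len(N)))
             for j in range(len(N[0]))] for i in range(len(M))]

def add(M, N):
    return [[M[i][j] + N[i][j] for j in range(len(M[0]))] for i in range(len(M))]

def T(M):
    return [list(r) for r in zip(*M)]

A = [[1.0, DT], [0.0, 1.0]]          # constant-velocity transition
B = [[0.5 * DT * DT], [DT]]          # how acceleration enters pos/vel
H = [[1.0, 0.0]]                     # position is the measurement (assumed)
Q = [[1e-6, 0.0], [0.0, 1e-4]]       # process noise (assumed)
R = [[1e-3]]                         # measurement noise (assumed)

def kf_step(x, P, u, z):
    # predict: x <- A x + B u ; P <- A P A^T + Q
    x = add(mul(A, x), [[B[0][0] * u], [B[1][0] * u]])
    P = add(mul(mul(A, P), T(A)), Q)
    # update with measurement z
    S = add(mul(mul(H, P), T(H)), R)                 # innovation covariance
    K = [[row[0] / S[0][0]] for row in mul(P, T(H))]  # Kalman gain (2x1)
    y = z - mul(H, x)[0][0]                           # innovation
    x = add(x, [[K[0][0] * y], [K[1][0] * y]])
    I_KH = add([[1.0, 0.0], [0.0, 1.0]],
               [[-K[0][0] * H[0][0], -K[0][0] * H[0][1]],
                [-K[1][0] * H[0][0], -K[1][0] * H[0][1]]])
    P = mul(I_KH, P)
    return x, P

# demo: the hand is held still at 0.3 m; the filter should settle there
x, P = [[0.0], [0.0]], [[1.0, 0.0], [0.0, 1.0]]
for _ in range(50):
    x, P = kf_step(x, P, u=0.0, z=0.3)
```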
Frameworks such as Apache Storm, Flink, Heron and Spark were developed to run on clusters or in the cloud. Such infrastructures have practically no memory, CPU, or bandwidth limitations. In contrast, computing resources at the network edge are constrained in their capabilities. I am aware of the Apache Edgent and NiFi frameworks; however, they were conceived to run locally on a single computing resource. If you want to run them on a distributed infrastructure, you have to create your own stack of components (broker + framework).
Data stream engines are characterized by receiving and processing an unbounded data stream using the available resources (i.e. memory, processors, etc.). Suppose that each data stream comes with a certain tag, which allows you to associate a given measure with its concept. The point is how to recognize patterns or situations, drawing on previous pieces of knowledge or experience from some repository (e.g. an organizational memory), in order to avoid risky situations or catch a given event.
In a data-stream context, what do you think is the best way to represent knowledge so as to allow dynamic reasoning on the fly, at the moment each measure is read and matched with its concept?
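One lightweight representation that supports matching on the fly is a set of condition-action rules keyed by the concept tag, evaluated as each (tag, value) pair is read; ontology or CEP engines are richer, but the core loop looks like the sketch below. The tags, thresholds, and event names are made-up placeholders, not from any real organizational memory.

```python
# Condition-action rules keyed by the concept tag that accompanies each
# measure, evaluated on the fly as every (tag, value) pair arrives.
RULES = {
    "temperature": [
        (lambda v, history: v > 80.0, "OVERHEAT_RISK"),
        (lambda v, history: bool(history) and v - history[-1] > 10.0, "SPIKE"),
    ],
    "pressure": [
        (lambda v, history: v < 0.5, "LEAK_SUSPECTED"),
    ],
}

def process_stream(stream):
    """Match each tagged measure against the rules for its concept."""
    history = {}          # per-concept recent values ("experience")
    events = []
    for tag, value in stream:
        past = history.setdefault(tag, [])
        for predicate, event in RULES.get(tag, []):
            if predicate(value, past):
                events.append((tag, value, event))
        past.append(value)
    return events

stream = [("temperature", 70.0), ("pressure", 1.2),
          ("temperature", 85.0), ("pressure", 0.4)]
events = process_stream(stream)
```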
I used two samples, one infected with HBV and one not. When I checked the real-time amplification data, it showed amplification with the HBV-specific primers in both samples, infected and non-infected, while amplification should occur only in the infected sample.
Reaction volume = 6 µL
(SYBR Green = 2.5 µL, primers = 1 µL (10 µM), template = 1 µL, ddH2O = 1.5 µL)
Reaction conditions:
Initial denaturation at 95 °C for 10 minutes
Denaturation at 95 °C for 15 sec
Annealing at 56 °C for 1 minute
Extension at 72 °C for 30 sec
Melting curve: 95, 60, 95 °C for 15, 60, 15 sec respectively
Primer Tm between 59 and 62 °C and product length from 100 to 150 bp
We are performing an experiment to study the agglomeration of small particles in zero gravity. For this, we need to monitor the particle size distribution in real time (the camera frame rate is 50 fps, 8-bit grey images). Can you suggest a sufficiently fast algorithm for localisation, counting, and measurement of particle size? See the attached image; as they say, it is better to see once than to hear a hundred times.
Thank you all in advance
Brussels Free University
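Since the frames are 8-bit grey images at 50 fps, even a simple threshold plus connected-component pass can usually keep up. Below is a pure-Python sketch of that idea (the threshold value is an assumption; in practice OpenCV's `connectedComponentsWithStats` on the thresholded frame is the fast route).

```python
# Threshold the grey image, then label connected components with a flood
# fill and report per-particle pixel counts (a proxy for particle size).
THRESHOLD = 128  # assumed grey-level cut between particles and background

def particle_sizes(image):
    """image: 2D list of 0..255 grey values -> list of particle pixel areas."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    sizes = []
    for y in range(h):
        for x in range(w):
            if image[y][x] >= THRESHOLD and not seen[y][x]:
                # flood-fill one particle (4-connectivity)
                stack, area = [(y, x)], 0
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    area += 1
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and image[ny][nx] >= THRESHOLD
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                sizes.append(area)
    return sizes

frame = [[0,   0, 200, 200, 0],
         [0,   0, 200,   0, 0],
         [0,   0,   0,   0, 0],
         [255, 0,   0, 255, 255]]
print(sorted(particle_sizes(frame)))  # → [1, 2, 3]
```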
For the temporal guarantees of some applications, many authors recommend a busload limit, but with new protocols such as CAN FD many things change. References mention that it is 30%, but I believe this has changed. Please provide some recent references about it.
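For reference, the 30% rule of thumb comes from classical CAN worst-case analysis, where busload is computed as below. CAN FD needs a different frame-length model (two bit rates, different stuffing and CRC), so treat this only as the baseline the rule of thumb refers to; the message set and bit rate in the example are made up.

```python
# Busload for classical CAN as used in the schedulability literature:
# worst-case frame length (standard 11-bit ID, n data bytes), including
# stuff bits, is 8n + 47 + floor((34 + 8n - 1) / 4) bits, and busload is
# the sum of frame_time / period over all messages.

def frame_bits(n_bytes):
    """Worst-case classical CAN frame length in bits (standard 11-bit ID)."""
    return 8 * n_bytes + 47 + (34 + 8 * n_bytes - 1) // 4

def busload(messages, bitrate):
    """messages: list of (data_bytes, period_seconds); returns utilisation."""
    return sum(frame_bits(n) / bitrate / t for n, t in messages)

# illustrative message set at 500 kbit/s (values are made up)
msgs = [(8, 0.010), (8, 0.020), (4, 0.050), (2, 0.100)]
u = busload(msgs, 500_000)   # fraction of the bus occupied in the worst case
```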
I have always used SYBR® Green *10,000X DMSO Solution* in my qPCR (real-time) master mix. Has anyone used something like this instead, which appears not to contain DMSO and to be designed for PCR? It's a lot more expensive: https://www.aatbio.com/products/cyber-green-sybr-green-10-000x-aqueous-pcr-solution?unit=17592, particularly compared to their version of SYBR Green in DMSO: https://www.aatbio.com/products/cyber-green-nucleic-acid-gel-stain-sybr-green-10-000x-dmso-solution
If a very high % of higher-ed students think "open my brain and just pour the knowledge in", we may be doomed to discovering things through bad science (with no one willing to look at whole systems of understanding, even though the current ones do not work).
There has always been a disturbing % of students (including some who have become professors) with this basic attitude and approach. Now, in this iPhone, etc., age, it seems the % may have reached a "critical mass" of hopelessness.
The good news: one or a few people could process a whole new system and investigate it (these students being among some very rare subset). These students (several) could make entire good careers out of such work. They may well occupy some seats on a plane to Oslo some day too. AND:
Frankly: analytic professors OWE THE WHOLE WORLD SUCH ANALYSIS for penance for their false persuading assertions that have messed up behavioral science FOR 100 YEARS !!
Hope you all are fine.
I am working on an Intrusion Detection System (IDS), and to implement it in real time on my system I need some help with the following:
how can my system decide whether the packets on the network, in a real-time scenario, are malicious or normal?
Does anyone here have an idea of real-time feature selection/extraction/detection (at run time, not in an offline/virtual test-bed setup)? Any good method, algorithm, or journal reference would help me.
Your little help would be appreciated.
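On the run-time feature extraction part: a usual pattern is to keep per-flow counters that are updated incrementally on every packet, so a verdict is available at any instant rather than after an offline pass. A toy sketch follows; the feature set and the threshold rule are purely illustrative, not a validated detector.

```python
# On-line (run-time) per-flow feature extraction: counters are updated as
# each packet arrives, and a live verdict can be requested at any moment.
from collections import defaultdict

class FlowStats:
    def __init__(self):
        self.packets = 0
        self.bytes = 0
        self.syn_count = 0

flows = defaultdict(FlowStats)

def on_packet(src, dst, length, is_syn):
    """Update features for the (src, dst) flow; return a live verdict."""
    f = flows[(src, dst)]
    f.packets += 1
    f.bytes += length
    f.syn_count += int(is_syn)
    # toy decision rule: many SYNs with little payload looks like a SYN flood
    if f.syn_count > 100 and f.bytes / f.packets < 60:
        return "malicious"
    return "normal"

# normal traffic
verdict = on_packet("10.0.0.1", "10.0.0.2", 1500, is_syn=False)
# simulated SYN flood from one source
for _ in range(150):
    verdict = on_packet("10.0.0.9", "10.0.0.2", 54, is_syn=True)
```

In a real deployment the `on_packet` callback would be fed by a capture library, and the toy rule replaced by a trained classifier over the same incrementally maintained features.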
I am doing a study of gene expression in HCC patients compared to healthy persons using real-time PCR. I have analyzed the Ct for the gene of interest and the housekeeping gene in each sample, whether from HCC patients or healthy persons. I have calculated ΔCt for each sample by subtracting the Ct of the housekeeping gene from the Ct of the gene of interest. Now I want to calculate ΔΔCt, but I did not include a calibrator in my run. Some colleagues suggested considering the sample (whether from patients or healthy persons) with the highest Ct as my calibrator, to be subtracted from all the samples to calculate ΔΔCt. I have added my results in an Excel sheet.
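The arithmetic behind your colleagues' suggestion looks like the sketch below, with a caveat in code form: Livak's 2^-ΔΔCt method assumes a proper untreated calibrator, so with the highest-ΔCt sample as calibrator the fold changes are only relative to that chosen sample. All Ct values below are invented for illustration.

```python
# ddCt fold-change arithmetic with the "highest-dCt sample as calibrator"
# workaround: dCt = Ct(GOI) - Ct(housekeeping); ddCt = dCt - dCt(calibrator);
# fold change = 2 ** (-ddCt).

def delta_ct(ct_goi, ct_hk):
    return ct_goi - ct_hk

def fold_changes(dcts):
    """dcts: dict sample -> dCt; calibrator = sample with the highest dCt."""
    calibrator = max(dcts.values())
    return {s: 2 ** -(d - calibrator) for s, d in dcts.items()}

samples = {  # illustrative Ct values, not real data
    "HCC_1":     delta_ct(ct_goi=24.0, ct_hk=18.0),   # dCt = 6
    "HCC_2":     delta_ct(ct_goi=25.0, ct_hk=18.0),   # dCt = 7
    "healthy_1": delta_ct(ct_goi=28.0, ct_hk=18.0),   # dCt = 10
}
fc = fold_changes(samples)   # healthy_1 is the calibrator (fold change 1.0)
```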
There is a set of data from an emission measurement sensor, sorted by the time of measurement.
Now I need to know how I can find the temporal pattern of when a sensor goes out of calibration.
I want to use a real-time model for the calibration of emission measurement sensors in automobiles.
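If a reference value (or a co-located calibrated sensor) is available, a CUSUM test over the time-sorted readings is one simple way to localise when drift begins: small random errors cancel, while a sustained bias accumulates until it crosses a threshold. The slack and threshold values below are placeholders that would need tuning to the sensor's noise level.

```python
# CUSUM change detection: returns the index (time position) at which a
# sustained calibration drift away from the expected value is first flagged.

def cusum_drift_index(readings, target, slack=0.5, threshold=5.0):
    """readings: sensor values sorted by time; target: expected true value.
    Returns the first index where cumulative drift exceeds threshold, else None."""
    pos = neg = 0.0
    for i, r in enumerate(readings):
        e = r - target
        pos = max(0.0, pos + e - slack)   # accumulates sustained high bias
        neg = max(0.0, neg - e - slack)   # accumulates sustained low bias
        if pos > threshold or neg > threshold:
            return i
    return None

# in-calibration readings hover near 10, then a +2 bias creeps in
data = [10.1, 9.8, 10.0, 9.9, 10.2] + [12.0] * 10
idx = cusum_drift_index(data, target=10.0)   # flags a few samples after onset
```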
Does anyone have any experience with real-time GIS applications using open source tools and software? The aim is visualization of sensor data and moving objects. What kind of solutions are available? ESRI has ArcGIS GeoEvent Server but is there something similar available in open source field?
A frequent application is rapid prototyping of power electronics, but many others are possible. Actual applications in industry, however, seem to be rare on closer inspection. This discussion could be highly useful for tackling practical applications.
I am facing a problem constructing a standard curve for my gene of interest (GOI). The GOI is a low-copy-number gene. Therefore, I am going to increase the starting amount of cDNA to 500 ng per reaction tube (which means that during DNase treatment, the total starting amount of RNA will be 5 µg).
I am doing qPCR using a two-step kit. I used the iScript Reverse Transcription kit by Bio-Rad to transcribe to cDNA, but I am wondering whether I need to increase the reverse transcription volume, as the total amount for cDNA synthesis will be increased to 5 µg? I am still new to this real-time world.
Attached herewith is the protocol sheet for your reference. Thank you.
Our lab has been looking for this answer for a few years now. Currently, we can collect heart rate data but it cannot be accessed until post-test. We are looking for suggestions on a method to obtain this data and be able to view it in real-time.
I have some questions regarding real-time communication with the KUKA KRC 4 controller:
Is it correct that, if I use the RSI-XML interface, I need a time equal to the interpolation cycle rate x the number of frames in order to send all frames within one IPO cycle?
Is it correct that within one IPO cycle I can both send data and get data back?
I want data to expire in the database (that works with expireAfterSeconds), but I want to use it with collection.update: data are constantly sent to the database in real time, so the TTL must work on updated data. Can I do that?
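MongoDB's TTL monitor deletes a document once the value in its TTL-indexed date field is older than expireAfterSeconds, so "TTL on update" is obtained by refreshing that date field in every update, e.g. `createIndex({updatedAt: 1}, {expireAfterSeconds: 60})` combined with `{$currentDate: {updatedAt: true}}` (or `$set` on `updatedAt`) in each update. The pure-Python model below mimics that behaviour to show the logic; it is not MongoDB code.

```python
# Model of MongoDB TTL semantics: a document survives as long as its
# TTL-indexed date field keeps being refreshed by updates.
EXPIRE_AFTER_SECONDS = 60

db = {}  # _id -> {"value": ..., "updatedAt": ...}

def upsert(_id, value, now):
    """Insert or update; refreshing updatedAt is what extends the TTL."""
    db[_id] = {"value": value, "updatedAt": now}

def ttl_sweep(now):
    """What the TTL monitor does periodically: drop expired documents."""
    for _id in [k for k, d in db.items()
                if now - d["updatedAt"] > EXPIRE_AFTER_SECONDS]:
        del db[_id]

upsert("sensor1", 42, now=0)
upsert("sensor2", 17, now=0)
upsert("sensor1", 43, now=50)   # update refreshes sensor1's TTL clock
ttl_sweep(now=100)              # sensor2 (age 100 s) expires, sensor1 (age 50 s) stays
```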
I have treated a breast cancer cell line (MCF7) with a miRNA mimic. I want to calculate the fold change of miRNA expression between untreated and treated cells. However, the Ct value of the miRNA in the untreated sample is "undetermined" in real-time qPCR, perhaps because the miRNA level is too low in MCF7. I only have the Ct value of the miRNA after treatment with the mimic.
How do I calculate the fold change between untreated and treated?
There are only a few references about Bluetooth High Speed (HS) 3.0. My question is: why has this technology not attracted interest from researchers?
For example, task periods and computation times can be generated using Stafford's Randfixedsum algorithm, especially for tasks that have implicit deadlines. Can the same algorithm be used to generate arbitrary deadlines? Or are there other accepted methods of doing so?
Thanks for your reply.
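Randfixedsum itself only fixes the per-task utilisations; periods, execution times, and deadlines are drawn separately, and a common recipe for arbitrary deadlines is simply D_i ~ uniform(C_i, k·T_i) with k > 1 (k > 1 allows deadlines beyond the period). In the sketch below the utilisation split is a naive stand-in for Stafford's algorithm, which, unlike this stand-in, samples uniformly on the simplex; the deadline recipe is the point.

```python
# Task-set generation: utilisations first (stand-in for Randfixedsum),
# then log-uniform periods, C_i = u_i * T_i, and arbitrary deadlines
# drawn uniformly in [C_i, k * T_i].
import random

def taskset(n, total_util, k=2.0, rng=random):
    # naive utilisation split (biased; Randfixedsum would be uniform)
    w = [rng.random() for _ in range(n)]
    utils = [total_util * x / sum(w) for x in w]
    tasks = []
    for u in utils:
        period = 10 ** rng.uniform(1, 3)          # log-uniform in [10, 1000]
        wcet = u * period
        deadline = rng.uniform(wcet, k * period)  # arbitrary deadline
        tasks.append({"T": period, "C": wcet, "D": deadline})
    return tasks

rng = random.Random(42)   # seeded for reproducibility
ts = taskset(5, total_util=0.8, rng=rng)
```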
I am trying to optimize my simulator by leveraging runtime compilation. My code is quite long and complex, but I have identified a specific __device__ function whose performance can be strongly improved by removing all global memory accesses.
Does CUDA allow the dynamic compilation and linking of a single __device__ function (not __global__), in order to "override" an existing function?
- The function is a normal __device__ function.
- It is not part of a class nor structure.
- The difference is not the data type, so I cannot rely on templates.
- I actually must change the calculations performed in the function (i.e., propensity calculations) according to the model that I am simulating.
Thank you very much indeed for your answers
Applications in fault diagnosis and abnormality detection when used with multivariate statistical process monitoring for manufacturing, either in real time or by post-processing historical data. A detailed review of typical application cases is sought.
I am performing some real-time screening in 96-well plates using bacterial lysates as the enzyme source.
Some wells show an amplification curve that after a while goes down (see picture).
What could be the reason? Could a low expression yield (and thus a low amount of the enzyme) be the reason, or could it be something regarding the screening system?
Actually, I am planning to use the concept of priority inheritance in distributed real-time database systems, but I am not sure which of these two options (the Priority Ceiling Protocol or the basic Priority Inheritance Protocol) will be best, and on what basis.
Need a decent simulator for the following tasks:
1. to schedule real time tasks
2. access to processor power model
3. No VM is required
Your help is highly appreciated...
The system contains multiple processors of different types (CPU, GPU, and possibly soft cores).
I'm looking for a simulator for real-time multiprocessor systems to build and test my own schedulers. The simulator should support resource sharing and provide a GUI.
As far as I know, it is highly unrecommended to set a process priority on MS Windows to Real-Time; however, in which cases would it be recommended?
Recently there have been some releases of an ARM version of Windows (Windows RT).
Is Windows RT mature enough to start developing real-time applications on it?
Where does Windows stand w.r.t. Linux, Android, or any other RT operating system?
What are the pros and cons of Windows RT?
I would like to deploy Simulink PLC Coder-generated Structured Text code on a Siemens PLC. So far, I have tried the 'plcdemo_simple_subsystem' example and deployed the generated Structured Text code on a Siemens S7-318-2-DP PLC. Now, I want to know how we can assign real/process I/Os to the I/Os in the code. Besides, the code also contains a transfer function, so how is it possible to use the code with a real-time system? Your help will be very much appreciated. Thank you.
I am looking for an efficient way to interface a TelosB mote with a real-time Simulink simulation using the USB port. Although Simulink includes several blocks for this purpose, all of them require knowing beforehand the number of bytes that are expected to be read from or sent to the serial port.
Unfortunately, I am using a packet protocol of variable length, so I have programmed a simple function that takes care of the job. The drawback is that this function must be called on each tick of the real-time simulation clock, as opposed to being called when the operating system (Windows) receives new data through the serial port.
What I am looking for is for a way to call my function every time Windows receives new serial data.
I would appreciate any suggestions on this matter.
My doubt is the following: the AC tachogenerator produces a triangular voltage wave, and I am feeding that voltage signal to the ADC port of the dSPACE board; for closed-loop operation, I need to convert that signal into speed form. I am having trouble finding the frequency of the signal. If anyone has an idea about this, please help me.
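Since the tacho signal is periodic, one cheap way to estimate its frequency (and hence the speed) from the sampled ADC data is to remove the DC offset and count rising zero crossings per unit time. A sketch with an assumed sample rate and a synthetic triangular wave standing in for the tacho signal:

```python
# Frequency of a periodic (e.g. triangular) signal from ADC samples via
# rising zero-crossing counting after DC-offset removal.

def signal_frequency(samples, sample_rate):
    """Estimate the fundamental frequency (Hz) from rising zero crossings."""
    mean = sum(samples) / len(samples)        # remove DC offset
    centered = [s - mean for s in samples]
    crossings = sum(
        1 for a, b in zip(centered, centered[1:]) if a < 0.0 <= b
    )
    duration = (len(samples) - 1) / sample_rate
    return crossings / duration

# synthetic 50 Hz triangular wave sampled at 5 kHz for 1 second (assumed rates)
SAMPLE_RATE = 5000

def triangle(t, f=50.0):
    x = (t * f) % 1.0
    return 4 * x - 1 if x < 0.5 else 3 - 4 * x   # ranges over [-1, 1]

samples = [triangle(i / SAMPLE_RATE) for i in range(SAMPLE_RATE)]
freq = signal_frequency(samples, SAMPLE_RATE)    # close to 50 Hz
```

On real, noisy ADC data a small hysteresis band around zero (instead of a single threshold) avoids counting noise-induced crossings.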
I have some tasks, which are further divided into runnables. Runnables execute as task instances. Runnables have dependencies within their own task and also on other tasks' runnables. I have the information on the deadlines and periods of tasks and the execution order of tasks and runnables, i.e. I can extract the data flow. The only point where I am stuck is how I can find out whether the task instances are executing within the period, i.e. obeying their deadlines, and, if an instance does not finish within its deadline, whether it will execute in the next cycle or the next period.
Any ideas ? Suggestions ?
P.S. I don't have timing information for the execution of runnables.
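If you can log just the finish time of each task instance, deadline compliance can be checked without per-runnable timing: instance j of a task with period T and relative deadline D must finish by j·T + D (assuming, as a simplification, that instance j is activated at time j·T). A minimal sketch of that check:

```python
# Per-instance deadline check from a finish-time trace: instance j of a
# task (activation assumed at j*T) must finish by its absolute deadline
# j*T + D; an instance that misses it spills into the next period.

def check_instances(period, deadline, finish_times):
    """Return a list of (job_index, finished_on_time) for each instance."""
    results = []
    for j, finish in enumerate(finish_times):
        abs_deadline = j * period + deadline
        results.append((j, finish <= abs_deadline))
    return results

# task with T = 10 ms, D = 8 ms; the third instance overruns into the next period
report = check_instances(period=10.0, deadline=8.0,
                         finish_times=[6.0, 17.0, 31.0])
```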
Can someone suggest real-time PCR primer sequences for the detection of the expression of SGT1, RAR1, MEK2 and MAPKKKa and of pathogenesis-related proteins?
While the traditional approach is to have a single numerical solver that employs uniform step sizing in all parts of a model, the alternative, using particular solvers with particular step sizes in different segments of the model with different dynamic behaviors, is getting more attention. I would like to ask what methods and tools are frequently employed in the research community, particularly for cases in which MATLAB/Simulink, Modelica, or Scilab/Xcos is used and real-time execution is a requirement.
I am interested in finding out whether any of your institutions has implemented Data Distribution Service (DDS) technology, whether open source or proprietary, for real-time systems.
Really would appreciate your response.
Usually when we write code for microcontrollers, the code is compiled into a .hex file which is then dumped into the hardware using some other software like Flash Magic.
How can we do this for VxWorks programs? How can I dump a downloadable kernel module onto hardware?
In my experiment, I extracted HCV RNA, used conventional PCR, and obtained an obvious band, so I took the same sample for real-time PCR and the results were negative; this happened twice, with two different samples with the same RNA concentration.
What could be the reasons, given that I have to quantify the RNA in the sample and the only available tool is real-time PCR?
What would be the advantages of employing a cloud system as a platform to run a hard real-time system, where timing predictability is as important as the correctness of the system? In this case, what are the main challenges? As an example, guaranteeing timing requirements over the internet is one of the prominent challenges discussed in several papers, but what about other challenges?
I've been trying to implement LQR with a state observer in real time. Since I couldn't manage to implement it using MATLAB Real-Time Workshop, I had to write the C code for the LQR and state observer myself. However, it seems the C code isn't working at all and I can't get my head around it. My first question is: when I design my state-feedback matrix and observer matrix in MATLAB before using them in my C code, do I have to design them (K and L) in discrete time? I would appreciate it if someone could give me some advice on how to implement LQR + observer in real time using either MATLAB or C, as I don't really know why I keep getting implementation issues.
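On the discrete-time question: yes, if the loop runs at a fixed sample time, both K and L should be designed for the discretised plant (x[k+1] = a·x[k] + b·u[k]), not the continuous one. Below is a scalar sketch of the loop structure the C code needs; the plant and the gains are hand-picked for illustration (not produced by dlqr/place), chosen so that both the closed loop (a - b·K = 0.5) and the estimator (a - L·c = 0.4) are stable.

```python
# Discrete-time state feedback from an observer estimate:
#   observer: xhat[k+1] = a*xhat[k] + b*u[k] + L*(y[k] - c*xhat[k])
#   control:  u[k] = -K * xhat[k]
a, b, c = 0.9, 0.1, 1.0   # discretised plant (assumed)
K = 4.0                   # state-feedback gain  -> closed loop a - b*K = 0.5
L = 0.5                   # observer gain        -> estimator  a - L*c = 0.4

x, xhat = 1.0, 0.0        # true state and observer state
for _ in range(60):
    y = c * x                              # measurement
    u = -K * xhat                          # control from the *estimate*
    xhat = a * xhat + b * u + L * (y - c * xhat)   # observer update
    x = a * x + b * u                      # plant update (no noise here)
# both the state and the estimate settle to zero
```

The same structure carries over to the matrix case; the usual C-code bugs are mixing continuous-time gains with a discrete loop, or updating the observer with the new measurement but the old control input inconsistently.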
Inmon, Kimball, Hefesto or another? I'm currently building a data warehouse to pave the way for data mining, the goal of this work is to improve the process of decision-making in education policy. This requires knowing what the best architecture is.
What are the principal architectural similarities and differences between real-time and multimedia operating systems? What distinctive requirements does each have? Which operating systems are typically used with each?
Could somebody please provide me with some practical examples of real-time systems and also some resources that study the control of real-time systems? Thank you.
Please help me find a real-time operating system (preferably open source) to start off with. Searching through Google provides a lot of options and I am very confused by the different platforms.
My requirement is only that it has to run code in a real-time operating system that spawns 4 threads. All these threads have different code and are completely independent; they don't make any I/O requests. For example, thread 1 generates a number series, thread 2 multiplies a large matrix, and so on. If each of these threads takes 4, 5, 7, and 9 seconds respectively to complete when executed serially, can the same results be achieved and guaranteed if all the threads run simultaneously?
I would like to know which project type (Downloadable Kernel Module, BSP image, etc.) should be selected in the Wind River platform for VxWorks device driver development. And how?
Ballistic impact events occur in microsecond time frame. Flash x-rays are usually not 3D and have a substantial reset time between images. Ideally, it would be most beneficial to have microsecond temporal resolution as well as 3D micron spatial resolution for a continuous interrogation of a millisecond or more. Tough requirement but something to dream for.
We are in the process of setting up a lab experiment in which an inverted pendulum mounted on a cart is to be swung up by a feedback controller. Measurements should be taken every 7ms, and the control signal is to be transmitted every 70ms.
We chose to implement the controller (in C) on Phidgets' single board computer, which runs Debian Linux (SBC3, see http://www.phidgets.com). The measurements are received by interface cards (also bought from Phidgets) which are connected to the SBC via USB.
A) We are currently unable to schedule measurements so that they are taken every 7 ms. In fact, using C's sleep command, our sampling times vary wildly between 3 ms and 15 ms. That's a problem, since our observer (for the angular velocity of the rod and the cart's velocity) is designed for a fixed sampling rate. We have verified that the computational resources required by the controller and the observer are not responsible for these variations.
B) We are also unable to transmit the control signal on time, as it happens that the controller is unexpectedly inactive for at least 200ms. Increasing the priority of the process that runs the controller relaxes the problem somewhat, but the "dead times" are still too long and too frequent to successfully run the experiment.
1) What are the chances that we can solve the problems described above by installing a suitable real time linux system on our SBC? Is there a specific OS you can recommend?
2) Is it a problem that the measurements are transmitted via USB to the SBC? If so, is it possible to tune the USB interface so as to better comply with real time requirements?
3) If the setup we are currently using (SBC plus interfaces via USB) turns out to be inappropriate for the above experimental setting, what alternatives could you recommend?
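Regarding A): jitter of that magnitude is typical of relative sleeps on a stock kernel. Whatever OS you end up with, sleeping until an absolute deadline rather than for a relative duration at least stops drift from accumulating across cycles. A sketch of that loop structure in Python (in C, `clock_nanosleep` with `TIMER_ABSTIME` on `CLOCK_MONOTONIC` is the more precise equivalent; a PREEMPT_RT kernel then bounds the remaining latency):

```python
# Periodic loop driven by absolute deadlines start + k*period: a late wake-up
# in one cycle does not shift all subsequent cycles, unlike sleep(period).
import time

PERIOD = 0.007  # 7 ms, as in the experiment

def deadlines(start, period, n):
    """Absolute wake-up times start + k*period, immune to per-cycle drift."""
    return [start + k * period for k in range(1, n + 1)]

def run_periodic(task, n_cycles):
    start = time.monotonic()
    for deadline in deadlines(start, PERIOD, n_cycles):
        task()
        # sleep only for the *remaining* time to the absolute deadline
        remaining = deadline - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)

run_periodic(lambda: None, n_cycles=10)

ds = deadlines(0.0, PERIOD, 5)   # the schedule itself is exactly periodic
```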
I am searching for a simulation environment for power consumption at the sensor node level, or a simulation environment which is configurable and suitable for different sensor node modules (processors, sensors, communication device etc.).
Has anybody used a cycle-accurate simulator for multicore architectures suitable for real-time systems other than gem5? Basic requirements are the ability to output timestamps for executed instructions as well as statistics on cache hits, misses, and other hardware-accelerator events of interest to timing analysis.