Science topic
LTE - Science topic
Explore the latest questions and answers in LTE, and find LTE experts.
Questions related to LTE
All base station antennas have two polarizations, -45° and +45°. I assume that is because mobile device antennas can be in any orientation, but are the base station signals identical and synchronous on both of those outputs, or are they somehow sequenced?
I am looking to simulate an algorithm for mobile load balancing in an LTE network, but I don't really know which tools could be used to do that. Please help me.
What is a suitable path loss model for LTE V2V communication?
Any leads with proper explanation would be helpful.
What do EC-GSM-IoT, LTE-M (LTE for Machines), and NB-IoT (Narrowband IoT) mean within CIoT (the Cellular Internet of Things)?
Understanding the properties of base stations, such as their location, antenna configuration, transmission parameters, and network topology, is crucial for various research endeavors in the field of wireless communications. Accessing this information can provide valuable insights into network performance, interference patterns, and propagation characteristics, enabling researchers to optimize network deployment.
Hello Everyone,
We have the LTE Toolbox in MATLAB and can generate an LTE signal for visualization. Is it possible to obtain an LTE signal in C or HDL form? What are the options for getting an LTE signal in these forms, given that we want to develop hardware that uses LTE as a reference signal?
Thanks in advance, and I welcome all researchers, faculty, and industrialists to participate.
Best Wishes,
Dr. Akhilesh Verma
Nowadays I see many DVB-T antennas on the market that are advertised as having LTE or 5G filters. Are there techniques to build filtering into the Yagi design itself (e.g., a parasitic element that rejects specific frequencies), or is it only possible in the balun? I have ordered a few antennas; they contain no discrete LTE filter, yet they seem to show some filtering characteristics. From my experience a discrete LTE or 5G filter would be better, but what are the techniques for doing this without any components or stubs?
Thank you in advance
I have done considerable research and understand the general workings of an LTE network, but my question is: how does a cellular network such as Verizon, Sprint, etc. function? Examining my local area using AntennaSearch, I discovered that there are 57 towers. If I have a Verizon UE/handset (no Femtocell), I should only be able to connect to the EPC via an eNodeB/tower controlled by Verizon, right? Or could I use any of the 57 towers in my area? From the result on AntennaSearch, none of the listed towers were explicitly controlled by Verizon so I'm assuming that they control towers/eNodeBs under other names.
I would appreciate it if anyone can point me in the right direction with this question. I am moving on to simulating an actual network. Once I know what towers/eNodeBs make up a particular network, it becomes an easy task to generate the topology in ns3. Thanks.
Should it be measured from the moment the ringtone is initiated at the receiving end, or from the moment the beep sound is initiated at the transmitting end?
For instance, I have seen a large difference (in milliseconds) between these two cases. It is confusing, as the beep sound and the ringtone are supposed to be initiated at exactly the same time, as far as I understand.
From the XCAL DRIVE TEST, the captured signalling messages scenarios are pointed below:
in VoLTE case,
the time gap between INVITE and RINGING is not the same for these two signalling message pairs:
SIP Tx INVITE and RINGING
and
SIP Rx INVITE and RINGING
(Why?)
Similarly, in case of LTE,
the time gap between these two signalling messages is not the same from UE to eNodeB as from eNodeB to UE:
extendedServiceRequest- ALERTING
(why?)
Thanks!
These legacy private networks were suitable for connecting laptops to the Internet and for other limited industrial IoT (IIoT) use cases. However, the coverage and security limitations of these networks, their incompatibility with public cellular networks, as well as their high costs of ongoing management have made it difficult for organizations to use these networks for many IIoT applications. How to reconcile the two? Any related papers? Thank you in advance! To see more: https://www.sierrawireless.com/iot-blog/what-are-private-lte-networks/
5G technology has a theoretical peak speed of 20 Gbps, while the peak speed of 4G is only 1 Gbps. 5G also promises lower latency, which can improve the performance of business applications as well as other digital experiences (such as online gaming, videoconferencing, and self-driving cars).
While earlier generations of cellular technology (such as 4G LTE) focused on ensuring connectivity, 5G takes connectivity to the next level by delivering connected experiences from the cloud to clients. 5G networks are virtualized and software-driven, and they exploit cloud technologies.
In looking at charging records, for each MMS message there appears to be a matching open data session; however, in some cases I see the data session stay open for an hour after the message was sent, and I often see multiple MMS messages in one data session. Is there a dedicated data channel, like the QCI 1 and QCI 5 bearers used when a voice call is made?
Hi there,
I am studying the impact of numerology and mini-slots on several proposed algorithms for achieving fairness between 5G NR-U and Wi-Fi networks over unlicensed spectrum bands. The coexistence of NR-U and IEEE 802.11ad Wireless Gigabit (WiGig) at 60 GHz has been implemented within ns-3 in a 3GPP indoor scenario, but there is no working example for this coexistence: the authors have shared the code, but users cannot compile it. Can anyone recommend code for such scenarios that I can actually compile?
Regards,
Moawiah
I'm trying to compare a numerical setup against LTE simulation results, but I need to know the capacity of the BSs in my experiment so that the comparison is meaningful. From my simulation using FTP traffic I have average user and cell throughputs, spectral efficiency, and resource block utilization, but from what I have found in the literature I can't quite estimate the capacity of the BS. Help is greatly appreciated in advance.
Hi,
I want to know more about 5G and LTE channel parameters such as path loss, carrier frequency, and maximum transmission power.
Thanks
Hello,
I have been reading the paper
'A Matching Game Approach for Resource Allocation in Wireless Network Virtualization'
which proposes a scenario in which an infrastructure provider's BS has 10 subchannels of 180 kHz each, and the BS serves 3 virtual wireless networks of 5 users each using FDMA. So I wonder: how are the subchannels distributed among the virtual wireless networks?
Hello experts,
I hope you're good.
I am trying to generate a non-binary (optimized) LFSR sequence on the fly in C.
But I am missing something, and I can't tell what ...
The code is attached ...
Do simulation results have a good impact on real-time problem solving?
Dear researchers;
I am looking for a simulator that allows me to generate a heterogeneous wireless network environment composed of several radio access technologies, such as 3G, 4G, 5G, Wi-Fi, etc.
As is known, the electron temperature Te is a vital parameter for describing plasma properties, and it is higher than the neutral gas temperature Tg in cold plasma, so that LTE (local thermodynamic equilibrium) is no longer valid. Hence, diagnosing Te using the LTE approximation method is not credible.
Kindly provide help on this matter.
Dear All.
I am working on an LAA / Wi-Fi coexistence scenario. For my work, I need to calculate the latency/delay in a different way.
From my understanding, the current solution calculates the delay by monitoring the time between when a packet was first transmitted and when it was last seen, from the receiver's perspective. This is done in the FlowMonitor class, where all the delays are accumulated into delaySum and the average delay is computed.
What I want to do instead is, for every received packet, set a timestamp when the packet is generated, add up the individual delay values, and compute the average delay. Since I am talking about LAA-Wi-Fi coexistence, I cannot make a solution that only works for Wi-Fi, so I have to solve this at the base, i.e., in the Packet class.
Has anyone worked on this problem before? I need some expert suggestions. Thank you in advance for your kind assistance.
I am in search of an advanced simulator for testing LTE and Wi-Fi coexistence scenarios in different cases.
I am trying to find resources where I can study a comparison of the various features of the random access channel in LTE using simulated and real-world measurement data. It would be great if you could help me find any resource on that.
I am searching for a research tool for studying LTE and Wi-Fi coexistence with good speed and accuracy.
I have gone through some of the simulators, such as the Vienna simulator, LTE Pro, and ns-3, but I am not comfortable proceeding with my work in them. Can you suggest the best simulator?
Apart from the experiments available for Wi-Fi, IoT, LTE, etc., we also want our students to work on 5G experiments. Does someone have a set of pre-built NetSim experiments that we could quickly adopt for our course? If not, any suggestions on what kind of experiments could be done would be valuable. Thank you.
I need to know the appropriate tool for modeling, simulation, and analysis of wireless communication systems
I am trying to understand what is included in Charging Data Records and when they are produced. There should be a charging principles document and Stage 2 and Stage 3 documents in 3GPP, but I'm not finding them.
Also, any good text on Charging in LTE would be appreciated. Thank you.
In one LTE data session I see that the UE is changing RAN: over the one-hour session, the UE shows 12 cell towers used. A second session shows only 2 cell towers used over the same period of time.
Is data session #2 idle, and therefore not performing updates, while the first session is very active while driving?
What triggers the update?
How does increasing the MIMO order in an eNB cell help increase the capacity of the LTE cell, i.e., when we go from 2x2 MIMO to 4x4 to 8x8?
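Not an authoritative answer, but an idealized sketch of why higher MIMO orders raise capacity: with spatial multiplexing over an i.i.d. Rayleigh channel and equal power per antenna (both simplifying assumptions, not an LTE-specific model), ergodic capacity grows roughly with min(Nt, Nr) spatial streams. The SNR value and trial count below are arbitrary.

```python
import numpy as np

def mimo_ergodic_capacity(nt, nr, snr_db, trials=2000, seed=0):
    """Ergodic capacity (bit/s/Hz) of an i.i.d. Rayleigh MIMO channel with
    equal power per transmit antenna and no CSI at the transmitter."""
    rng = np.random.default_rng(seed)
    snr = 10 ** (snr_db / 10)
    cap = 0.0
    for _ in range(trials):
        h = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
        # C = log2 det(I + (SNR / nt) * H * H^H)
        cap += np.log2(np.linalg.det(np.eye(nr) + (snr / nt) * h @ h.conj().T).real)
    return cap / trials

for n in (2, 4, 8):
    print(f"{n}x{n} MIMO: ~{mimo_ergodic_capacity(n, n, snr_db=20):.1f} bit/s/Hz")
```

In practice the gain from 2x2 to 8x8 is usually smaller than this ideal scaling because of antenna correlation, channel estimation errors, and the rank the UE's channel actually supports.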
I want to estimate the TOA and CFO of an LTE signal. The TOA is estimated on the oversampled signal; after TOA estimation, the CFO is estimated. Please let me know why the estimated CFO is wrong.
Hi
Hope you are well.
Can you please share your code for a D2D implementation in MATLAB?
I want to implement D2D in the MATLAB-based Vienna simulator and am struggling to deploy D2D.
Thanks
Hello, I am a beginner in ns3.
I would like to add an energy module to the UE node to check the battery changes of the UEs.
By the way, it seems that ns-3 only provides an energy module for Wi-Fi.
Does ns-3 have no energy module for LTE?
If so, is there a way to use the existing energy module with LTE?
Best Regards,
Data such as the L7 protocols used by the UEs and the resources used (forward bytes and backward bytes).
Data should preferably look like this:
Timestamp(in milliseconds) | Type of application| Protocol used | forward and backward resource blocks used or no of bytes for the whole transmission.
I tried kaggle and other open resources available but I was not successful in getting data for this use case.
I'm reviewing charging data records (CDRs) from a mobile carrier for one user's cell phone, and I notice that for many of the voice calls there is an associated data record that starts about 1 second before the voice call and is torn down at the end of the voice call.
Is this a feature of an LTE or 4G voice call? What is the purpose of this data channel? It appears to use approximately 40 K of data in the uplink and downlink directions for each call.
Thank you.
I would like to run some experiments while applying packet loss. However, I want to follow well-referenced packet loss settings that reflect, to some extent, different network conditions: for instance, 1% packet loss in a mobile network in a driving use case (where the end user is videoconferencing or watching a video, etc.), 3% in a congested network while the user is playing an online game, etc.
We are interested in packet losses caused at layers 3-4 (i.e., to implement erasure channels), but we are looking for typical packet loss percentages for a variety of use cases.
My question: is there any paper or article that justifies typical packet loss settings aligned with use cases?
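I am not aware of a single canonical reference for those percentages, but once values are chosen, a layer 3-4 erasure channel is easy to emulate. A minimal sketch assuming independent (Bernoulli) losses; the 1% and 3% figures are just the placeholders from the question, and bursty losses would need something like a Gilbert-Elliott model instead.

```python
import random

def bernoulli_erasure(packets, loss_rate, seed=42):
    """Drop each packet independently with probability loss_rate."""
    rng = random.Random(seed)
    return [p for p in packets if rng.random() >= loss_rate]

packets = list(range(10_000))
for rate in (0.01, 0.03):   # placeholder rates from the question (driving / congested)
    delivered = bernoulli_erasure(packets, rate)
    print(f"loss = {rate:.0%}: delivered {len(delivered)} of {len(packets)} packets")
```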
I am going to design a security protocol (for authentication), and I found that AVISPA is a good choice for verifying it. But I also want to see the overhead my protocol imposes on an LTE network when it is used. With MATLAB, how can I check my protocol's overhead in an LTE network?
Many thanks
Hello there!
I'm currently doing my BSc project, which requires code for an LTE link, whether it is a simulation or a real implementation.
But I really can't find any good resources on this, neither a book nor a paper nor anything else.
I would really appreciate your help, because I am in urgent need of this.
Outdoor obstacles, such as buildings, buses, and trees, interfere with radio signal propagation by contributing fading, shadowing, and path loss effects. I want to evaluate the impact of these obstacles on the LTE communication channel, so I need help modelling the effects of the obstacles using MATLAB. Thank you.
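A minimal sketch of one common way to model those obstacle effects: log-distance path loss plus log-normal shadowing (slow fading from buildings, buses, and trees) plus Rayleigh small-scale fading. It is written in Python but translates line by line to MATLAB; the path loss exponent, shadowing standard deviation, carrier frequency, and transmit power below are illustrative assumptions, not calibrated values.

```python
import numpy as np

rng = np.random.default_rng(1)

def received_power_dbm(pt_dbm, d_m, f_hz=2.6e9, path_loss_exp=3.5, shadow_sigma_db=8.0):
    """Received power = Tx power - log-distance path loss + log-normal shadowing
    + Rayleigh fading. All parameter values are placeholders."""
    c, d0 = 3e8, 1.0                                     # reference distance 1 m
    pl_d0 = 20 * np.log10(4 * np.pi * d0 * f_hz / c)     # free-space loss at d0
    path_loss_db = pl_d0 + 10 * path_loss_exp * np.log10(d_m / d0)
    shadowing_db = rng.normal(0.0, shadow_sigma_db)      # slow fading from obstacles
    h = (rng.standard_normal() + 1j * rng.standard_normal()) / np.sqrt(2)
    fading_db = 20 * np.log10(abs(h))                    # Rayleigh small-scale fading
    return pt_dbm - path_loss_db + shadowing_db + fading_db

print(f"Pr ~= {received_power_dbm(pt_dbm=46, d_m=500):.1f} dBm")
```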
I need recent reference papers on the various LTE traffic types and their usage distribution. My aim is to obtain a reference like this: VoIP = xy%, FTP = j%, Video = z%, ...
The data should contain QCI and QoS-related parameters, and I am using it for designing network slices. Where can I find data like this?
Given the well-known fact that the success of 5G technology vastly depends on how well it fares in indoor venues, with analysts claiming that more than 80% of mobile use and traffic originates indoors, what specialized solutions are available to make the technology succeed indoors, while keeping both its QoS and its possible health hazards in check?
In this case, CFO and TOA are not independent variables. The transmitter sends several packets. On the receiver side, first the TOA of each packet is estimated, and then the CFO of the received signal is estimated. So an error in the TOA estimate results in an error in the CFO estimate.
Hi all,
I have two questions about random access procedures of NB-IoT
In NB-IoT, Nprach-Periodicity is the amount of time between two consecutive random access transmission attempts.
Random Access procedures include preamble (Msg1) and its repetitions, random access response RAR (Msg2), Msg3, and contention resolution (Msg4). Following that, actual data transmission starts.
My questions are:
1. Do all of these steps happen within one NPRACH periodicity (such that the UE can start another transmission immediately in the next period)?
The 3GPP standards were not clear enough about this point; they do not show exactly the time elapsed in Msg3, Msg4, and the actual data transmission.
2. When a collision happens in the Msg1 transmission (two UEs choose the same subcarrier), the collided UEs have to choose a backoff time in the range [0, Backoff]. There are several Backoff values, and the eNB can inform the UE which Backoff value it will use. How does the eNB identify the suitable Backoff value in each collision case? Is it affected by the number of failed attempts?
Are there differences between 5G and other ICT infrastructures in terms of smart grid applications? For example, is there any difference between 5G and LTE?
In LTE, is it possible to apply different modulation schemes to different resource elements within one resource block assigned to a user?
It would be appreciated if you could point me to helpful articles or books.
thanks in advance
Hello everyone,
I'm trying to compute LTE RSRP, RSSI, RSRQ, and SINR equations and values using MATLAB code; however, the values don't make sense to me, since the RSRQ value comes out around 75 while the standard range is -19.5 dB (bad) to -3 dB (good).
Your kind support and guidance would be appreciated.
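A positive value like +75 often means the quantities were combined in inconsistent units (for example, multiplying dBm numbers instead of linear powers). A minimal sketch of the RSRQ definition, RSRQ = N * RSRP / RSSI, with RSRP and RSSI converted to linear power first; the -95 dBm / -68 dBm inputs and the 50-RB bandwidth are illustrative assumptions.

```python
import math

def rsrq_db(rsrp_dbm, rssi_dbm, n_rb):
    """RSRQ = N * RSRP / RSSI, evaluated in linear power and returned in dB."""
    rsrp_mw = 10 ** (rsrp_dbm / 10)
    rssi_mw = 10 ** (rssi_dbm / 10)
    return 10 * math.log10(n_rb * rsrp_mw / rssi_mw)

# Illustrative inputs: 10 MHz carrier (50 RBs), RSRP = -95 dBm, RSSI = -68 dBm
print(f"RSRQ ~= {rsrq_db(-95, -68, n_rb=50):.1f} dB")   # about -10 dB, inside [-19.5, -3]
```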
The power amplifier is a three-stage device, and the output fluctuation is probably due to the final stage only, as we have tested the stages individually. But how to solve the problem is the basic question, which can only be answered if we know what causes it. I suspect a grounding problem and have tried a few things, but was unsuccessful.
I am in need of an LTE and Wi-Fi coexistence algorithm. If anyone has one, or can give any reference, please share it with me.
Thanks.
For higher accuracy, should not the LTE Path Loss Models take into account every single resource block frequency? The path loss will be different for each resource block (RB). For example, if I am operating at 1900 MHz with a 20 MHz channel, there would be 100 RBs (each is a different sub-carrier). Surely the path loss cannot be the same over all the subcarriers.
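A quick free-space calculation suggests why most models ignore the per-RB frequency: across a 20 MHz channel at 1900 MHz, the edge-to-edge difference in mean path loss is only about 0.1 dB. This sketch assumes pure free-space loss and an arbitrary 1 km link; what really differs between RBs is arguably frequency-selective fading, not the mean path loss.

```python
import numpy as np

c, d = 3e8, 1000.0                 # free-space model, 1 km link (illustrative)
fc, bw = 1.9e9, 20e6               # 1900 MHz carrier, 20 MHz channel -> 100 RBs of 180 kHz
rb_centres = fc - bw / 2 + 90e3 + 180e3 * np.arange(100)   # centre frequency of each RB

fspl_db = 20 * np.log10(4 * np.pi * d * rb_centres / c)    # free-space path loss per RB
print(f"RB 0: {fspl_db[0]:.2f} dB, RB 99: {fspl_db[-1]:.2f} dB, "
      f"spread: {fspl_db[-1] - fspl_db[0]:.3f} dB")
```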
There are multiple modulation options available for LTE resource block allocation (BPSK, QPSK, 16-QAM, 64-QAM, 256-QAM) depending on channel conditions. This modulation scheme is applied to all the resource blocks allocated to the user; in other words, each subcarrier in each of the allocated resource blocks is modulated using one of the above-mentioned schemes.
This means, if 16-QAM was chosen, then subcarrier 1 in resource block 1 will be modulated with 16-QAM, subcarrier 2 of resource block 1 also with 16-QAM, and so on for all the subcarriers within that resource block and all the other resource blocks assigned to the user.
Furthermore, two users can be allocated the same resource block over two different time slots (on the frequency-time grid) with distinct modulation schemes for each user.
This may seem to be a rather simple question, but my adviser does not seem convinced that the above understanding is completely valid.
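To make the description above concrete, here is a toy sketch (assuming Gray-mapped 16-QAM and 12 subcarriers per RB) that maps one user's bit stream onto every subcarrier of the allocated RBs with the same constellation. It only illustrates the "one modulation scheme per user per allocation" idea, not the full LTE PDSCH processing chain.

```python
import numpy as np

# Gray mapping for one 16-QAM axis: 2 bits -> amplitude level
GRAY = {(0, 0): -3, (0, 1): -1, (1, 1): 1, (1, 0): 3}

def qam16_map(bits):
    """Map a bit array (length divisible by 4) to unit-energy 16-QAM symbols."""
    groups = np.asarray(bits).reshape(-1, 4)
    i = np.array([GRAY[(int(b[0]), int(b[1]))] for b in groups])
    q = np.array([GRAY[(int(b[2]), int(b[3]))] for b in groups])
    return (i + 1j * q) / np.sqrt(10)

rng = np.random.default_rng(0)
n_rb, n_sc = 4, 12                                   # 4 allocated RBs, 12 subcarriers each
bits = rng.integers(0, 2, size=n_rb * n_sc * 4)      # 4 bits per subcarrier for 16-QAM
grid = qam16_map(bits).reshape(n_rb, n_sc)           # same constellation on every subcarrier
print(grid.shape)                                    # (4, 12): one 16-QAM symbol per subcarrier
```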
While building private LTE networks for industries/mines, how will they connect to the Internet? Is it obligatory to go via a mobile operator, or what options do we have for Internet breakout?
According to 5G NR Frame Structure, the frame is divided into sub-frames and each sub-frame is divided into slots and each slot has 14 OFDM symbols (for a normal cyclic prefix). Number of slots per sub-frame varies according to numerology. There are multiple OFDM numerologies based on the bandwidth available.
Now my question is: with this flexibility in deciding the slots per sub-frame, how does the numerology change? Does it change per OFDM symbol, per slot, per sub-frame, or in some other way?
Thanks in Advance
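For reference, the numerology relations stated in the question can be tabulated directly (SCS = 15 * 2^mu kHz, 2^mu slots per subframe, 14 symbols per slot for a normal cyclic prefix). This sketch only reproduces those relations; it does not answer at which granularity a network switches numerology.

```python
# Numerology table implied by the 5G NR frame structure described above
for mu in range(5):
    scs_khz = 15 * 2 ** mu            # subcarrier spacing
    slots_per_subframe = 2 ** mu      # slots in each 1 ms subframe
    slot_ms = 1.0 / slots_per_subframe
    print(f"mu={mu}: SCS = {scs_khz:>3} kHz, {slots_per_subframe:>2} slots/subframe, "
          f"slot duration = {slot_ms:.4f} ms, 14 symbols/slot (normal CP)")
```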
The definition of RSSI is 'total received wide-band power by the UE'.
1. I am confused about what 'wide-band' means here.
My understanding is as follows:
- If the carrier bandwidth of the LTE channel is 10 MHz, the total bandwidth is 10 MHz, and hence RSSI is calculated over all the resource blocks, i.e., 50 RBs.
- Each RB has 12 subcarriers. Hence, for a 10 MHz channel with 50 RBs, there are 12 x 50 = 600 subcarriers.
- Finally (assuming the same Pt for every subcarrier), Pr = Pt * (c / (4*pi*d*f))^2, where Pr is the received power per subcarrier at the UE. What value of f should I use, and is this the right way to calculate Pr? In my opinion, f = 15 kHz.
- Finally, RSSI = Pr * 12 * 50 [or, in dBm, 10*log10(Pr) + 10*log10(12*50)].
2. Or is RSSI calculated differently in simulation? How can I build a simulation model to calculate RSSI? Do I have to model a complete RB with all resource elements (REs)? If yes, do I need to do the scheduling as well?
3. How can I calculate SINR from RSSI?
Thanks
Shan
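A rough numeric sketch of the recipe described in the question above, with two corrections: 12 x 50 = 600 subcarriers (not 60), and the frequency f in the Friis formula should be the RF carrier frequency (around 2 GHz here), not the 15 kHz subcarrier spacing. The transmit power, distance, and carrier frequency are illustrative assumptions, and antenna gains, shadowing, and fading are ignored.

```python
import numpy as np

c = 3e8
fc = 2.0e9                       # RF carrier frequency [Hz], not the 15 kHz subcarrier spacing
d = 500.0                        # link distance [m], illustrative
n_rb, sc_per_rb = 50, 12         # 10 MHz LTE carrier: 50 RBs -> 600 subcarriers

pt_total_dbm = 46.0                                                # illustrative eNB Tx power
pt_sc_w = 10 ** ((pt_total_dbm - 30) / 10) / (n_rb * sc_per_rb)    # equal power per subcarrier

friis_gain = (c / (4 * np.pi * d * fc)) ** 2                 # Pr = Pt * (c / (4*pi*d*f))^2
pr_sc_w = pt_sc_w * friis_gain                               # received power per subcarrier
rssi_dbm = 10 * np.log10(pr_sc_w * n_rb * sc_per_rb) + 30    # sum over all 600 subcarriers
print(f"RSSI ~= {rssi_dbm:.1f} dBm")
```

Note that per 3GPP TS 36.214, carrier RSSI also includes co-channel interference and thermal noise, so a real measurement would be higher than this signal-only estimate.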
I am new to ns-3. I am working on LTE networks, and I am looking at calculating the load information of each cell and exchanging it between eNBs (over the X2 interface, e.g., for load-based handover). Could anyone please help me with pointers to code, or share a code snippet if you already have one?
I did search through the docs, but I still need some additional help. Could you please help me?
Is it possible to calculate the number of RBs allocated to a UE in ns-3?
Thank you.
I wonder which simulator is more suitable for implementing routing protocols for D2D communication in LTE/LTE-A networks.
Most of the simulators are focused on the link level (e.g., Vienna and other MATLAB-based simulators).
I need a system-level simulator for my research, and it seems the top alternatives are SimuLTE for OMNeT++ and LENA for ns-3.
Thanks for sharing your experiences
Kindest wishes
Why did 3GPP combine the S-GW and P-GW of LTE into one component (SGW-PGW) in the 5G mobile network?
The figure shows the Serving Gateway and PDN Gateway of LTE. These two components were combined in 5G for a reason related to IP impairments and signalling issues.
Any idea or link that clarifies this point?
I was reading the paper "Ultra-Reliable Low Latency Cellular Networks: Use Cases, Challenges and Approaches", but I couldn't find the exact answer. Which one causes the delay: the Serving GW or the PDN GW, and why?
When Carrier Aggregation and cross-carrier scheduling are applied in an LTE-Advanced system, a UE may support multiple Component Carriers (CCs), and control information on one CC can allocate radio resources on another CC. The search spaces of all CCs and the control information are transmitted only on a chosen CC. In this case, if the search spaces of the different CCs are not properly defined, a high blocking probability of control information will be very harmful to system performance.
My question is: what is the cause of this blocking? Is it a shortage of control channel elements to serve the scheduled UEs, or something else?
My guess is that it is not, but I have no proof of this. Can any expert help?
For now, I assume either self-overlapping or high mutual overlapping of the UEs' search spaces is the likely cause of blocking.
I have gone through the downlink parts of 3GPP TS 38.211, 38.212, and 38.213, but I was not able to find out how the number of NR-PDCCH blind decodes is calculated. If anyone has understood this, kindly explain the calculation.
I read somewhere that 23 dBm of transmit power is used for the uplink in LTE. I want to understand whether this is the power used to transmit over the whole channel or only over part of it. I would also like to know how much power is required to transmit a Sounding Reference Signal in the uplink direction.
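A partial sketch of the first part: 23 dBm is the maximum total output power of a Power Class 3 LTE UE, shared across whatever uplink bandwidth the UE is actually granted, so the per-RB power falls as the allocation widens (this ignores uplink power control, which usually keeps the UE below the cap). The SRS transmit power follows its own power control formula and is not simply 23 dBm.

```python
import math

p_max_dbm = 23.0                        # max total Tx power of an LTE Power Class 3 UE
for n_rb in (1, 4, 20, 50, 100):        # width of the uplink grant, in resource blocks
    p_per_rb_dbm = p_max_dbm - 10 * math.log10(n_rb)   # equal split across the granted RBs
    print(f"{n_rb:>3} RB grant -> at most {p_per_rb_dbm:5.1f} dBm per RB")
```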
According to the 3GPP standard (3GPP TS 36.211 version 13.2.0 Release 13), section 10.1.6:
"The preamble consisting of 4 symbol groups transmitted without gaps shall be transmitted N_rep^NPRACH times."
Does the mentioned "without gaps" mean that only the single preamble itself should be transmitted without gaps (all 4 symbol groups one after another without time gaps)?
Or does it mean that all repetitions should be transmitted one after another without any gaps in between?
This question arises because many papers assume the second case (no time gaps between any repetitions). However, this appears to contradict the standard, which states clearly in the same section:
"The transmission of a random access preamble, if triggered by the MAC layer, is restricted to certain time and frequency resources.
NPRACH transmission can start only N_start^NPRACH * 30720 * T_s time units after the start of a radio frame fulfilling n_f mod (N_period^NPRACH / 10) = 0."
This means that NPRACH occurs only in certain frames that fulfil the stated condition, so the preamble repetitions can be transmitted only in those frames. In other words, they cannot be transmitted continuously without gaps, because two consecutive frames can never both fulfil the condition.
The attached figure shows which frames in the time domain carry NPRACH.
If my reading is wrong and the repetitions should be transmitted continuously without gaps, can you explain how that complies with the standard, especially with the NPRACH periodicity (N_period^NPRACH)?
I'm working on a project where I've decreased the transmission of the SRS (Sounding Reference Signal), which would vacate 6 REs in a subframe. How much will this increase our throughput?
Regards,
Ruzat Ullah
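A back-of-the-envelope bound for the question above rather than an answer: the throughput gain is at most the fraction of resource elements freed, and only if the scheduler actually fills them with data at the same spectral efficiency. The question does not say whether the 6 REs are freed per RB pair or in the whole subframe, so the sketch below shows both interpretations as labelled assumptions (10 MHz / 50 RBs assumed for the second case).

```python
re_per_rb_pair = 12 * 14   # 12 subcarriers x 14 symbols per RB pair per subframe (normal CP)

# Assumption A: 6 REs freed in every RB pair of the subframe
print(f"per RB pair: {6 / re_per_rb_pair:.2%} of REs freed")            # ~3.6 %

# Assumption B: 6 REs freed in the whole 10 MHz subframe (50 RBs)
print(f"whole subframe: {6 / (50 * re_per_rb_pair):.3%} of REs freed")  # ~0.07 %
```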
I have used an LTE simulator to obtain the occupancy state of a selected resource block, and I have used the output as training and test data to predict the idle or busy state of the selected RB using an MLP. However, in order to check my results, I need to simulate primary user traffic on a channel following a Poisson process, where the ON/OFF times of the channel are drawn from a geometric distribution. Does anyone have MATLAB code simulating this traffic model?
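I don't have a ready MATLAB script, but here is a minimal sketch of the traffic model in Python that translates almost line for line: the busy/idle sojourn times are geometric, and packet arrivals within busy slots are Poisson. The parameter values are placeholders.

```python
import numpy as np

def primary_user_traffic(n_slots, p_end_on=0.2, p_end_off=0.3, lam=2.0, seed=0):
    """Slot-level ON/OFF channel occupancy with geometric sojourn times and
    Poisson packet arrivals during ON (busy) slots. Parameters are placeholders."""
    rng = np.random.default_rng(seed)
    state = np.zeros(n_slots, dtype=int)       # 1 = busy (PU active), 0 = idle
    arrivals = np.zeros(n_slots, dtype=int)
    t, busy = 0, False
    while t < n_slots:
        dur = rng.geometric(p_end_on if busy else p_end_off)   # geometric sojourn time
        end = min(t + dur, n_slots)
        if busy:
            state[t:end] = 1
            arrivals[t:end] = rng.poisson(lam, end - t)
        t, busy = end, not busy
    return state, arrivals

state, arrivals = primary_user_traffic(10_000)
print("busy fraction:", round(state.mean(), 3),
      "| mean arrivals per busy slot:", round(arrivals[state == 1].mean(), 2))
```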
I need details on how RSRP, velocity, and TTT are calculated in LTE handover.
Physical layer (PHY) authentication based on unique channel properties has proven effective thanks to its simple, fast, and distributed procedures, especially with the recent explosion of Internet of Things (IoT) devices. Authentication is the process of verifying identity claims. If an intruder collects confidential information on wireless links and sends a forged signal, it can cause serious problems. Through an authentication protocol, which verifies the identities of both parties over the wireless link and then establishes a common secret key between them, these threats can be reduced or even completely eliminated. Conventionally, cryptographic security mechanisms have been used to prevent intelligent attacks (e.g., impersonation attacks). Although the effectiveness of these cryptography-based authentication mechanisms has been proven, they require proper key management and distribution, which may cause large signalling overhead in a large-scale network (e.g., the Internet of Things).
Papers:
J. Choi, "A coding approach with key-channel randomization for physical-layer authentication," IEEE Trans. Information Forensics and Security, vol. 14, pp. 175-185, Jan. 2019.
X. Wu, Z. Yang, C. Ling, and X.-G. Xia, "Artificial-noise-aided physical layer phase challenge-response authentication for practical OFDM transmission," IEEE Transactions on Wireless Communications, vol. 15, no. 10, pp. 6611-6625, Oct. 2016.
X. Wu and Z. Yang, "Physical-layer authentication for multi-carrier transmission," IEEE Communications Letters, vol. 19, no. 1, pp. 74-77, Jan. 2015.
X. Du, D. Shan, K. Zeng, and L. Huie, "Physical layer challenge-response authentication in wireless networks with relay," in Proceedings IEEE INFOCOM, Toronto, ON, Canada, Apr. 2014, pp. 1276-1284.
J. Cao, M. Ma, H. Li, Y. Zhang, and Z. Luo, "A survey on security aspects for LTE and LTE-A networks," IEEE Communications Surveys & Tutorials, vol. 16, no. 1, pp. 283-302, 1st Quart., 2014.
Is there any eNB code (for LTE) that could be run on OpenWrt Linux to emulate an eNB in a test-bed environment?
Please help.
I am trying to simulate a C-RAN system by interfacing cloudsim with an LTE simulator. Any guidance on the matter would be appreciated ! Thanks.
Is it OK to dilute a DNA extract with UltraPure sterile water?
I was trying to measure sample concentration with a Qubit 3.0 Fluorometer, but the readings were too high and I needed to dilute my sample. So I diluted my DNA extract with UP water and then repeated the Qubit assay. I am wondering if I should have used an LTE buffer instead of the UP water to dilute my samples.
Thank you!
I am analyzing the complexity of resource allocation problems in NOMA systems. The network model can be summarized as follows: there is a base station (BS) and multiple users (UEs); the signals for all UEs are superimposed at the BS; and the objective is to allocate resources at the BS optimally. In addition, I am using network sum-rate maximization as a baseline, which is NP-hard according to my own finding. However, I am not sure about this (the NP-hardness of the sum-rate maximization problem).
Does anyone have any suggestions for me?
Many thanks in advance.
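Not a proof of NP-hardness, but it may help to make the baseline objective concrete. A minimal sketch of the two-user downlink NOMA sum rate with SIC at the stronger user; the channel gains, total power, and power-split factor alpha are arbitrary placeholders, and the noise power is normalized.

```python
import numpy as np

def noma_two_user_sum_rate(g_strong, g_weak, p_total, alpha, noise=1.0):
    """Downlink 2-user NOMA: a fraction alpha of the power goes to the weak (far) user.
    The weak user treats the strong user's signal as interference;
    the strong user removes the weak user's signal via SIC before decoding."""
    p_weak, p_strong = alpha * p_total, (1 - alpha) * p_total
    r_weak = np.log2(1 + p_weak * g_weak / (p_strong * g_weak + noise))
    r_strong = np.log2(1 + p_strong * g_strong / noise)
    return r_weak + r_strong

# Placeholder values: a sum-rate maximization baseline would search over alpha (and user pairing)
for alpha in (0.6, 0.7, 0.8):
    print(alpha, round(noma_two_user_sum_rate(g_strong=10.0, g_weak=1.0,
                                              p_total=10.0, alpha=alpha), 2), "bit/s/Hz")
```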