Science topic

Computer Security and Reliability - Science topic

Explore the latest questions and answers in Computer Security and Reliability, and find Computer Security and Reliability experts.
Questions related to Computer Security and Reliability
  • asked a question related to Computer Security and Reliability
Question
4 answers
In quantum cryptographic protocols, participants typically share both a quantum channel and a classical authenticated channel. Authenticated channels ensure that messages come from legitimate senders and have not been tampered with. However, these channels do not inherently protect against the interception or blocking of messages by an adversary. Blocking or delaying messages in the classical channel is considered an active attack.
Many sources, including the first article in quantum key distribution by Bennett and Brassard, mention that the public channel between participants is only susceptible to passive attacks, not active attacks.
My question is: In quantum cryptographic protocols (such as QKD, QSS, and QPC), can an attacker block or delay messages in the public channel without being detected? If so, wouldn't that compromise the security of many well-established protocols such as the BB84?
  • asked a question related to Computer Security and Reliability
Question
13 answers
Dear Researchers, I ran the J48 algorithm on the KDD test dataset in WEKA, and it classifies instances as either anomaly or normal. If I want to break the anomaly class down into attack categories such as DoS, R2L and U2R, how can I do that with the same dataset?
If someone helps, I will be very thankful.
Relevant answer
Answer
I'm also looking for a solution, if you please.
Is there any solution for that?
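A minimal sketch of how the fine-grained KDD attack labels can be mapped back to the four coarse categories (DoS, Probe, R2L, U2R) before or after classification, assuming the records still carry the original attack labels (as in the NSL-KDD files); the label list and file names here are illustrative, not exhaustive:

```python
# Map fine-grained KDD attack labels to coarse categories (DoS, Probe, R2L, U2R).
# Assumes a CSV whose last column holds the original attack label (hypothetical file names).
import csv

CATEGORY = {
    "neptune": "DoS", "smurf": "DoS", "back": "DoS", "teardrop": "DoS", "pod": "DoS",
    "ipsweep": "Probe", "portsweep": "Probe", "nmap": "Probe", "satan": "Probe",
    "guess_passwd": "R2L", "ftp_write": "R2L", "imap": "R2L", "warezmaster": "R2L",
    "buffer_overflow": "U2R", "rootkit": "U2R", "loadmodule": "U2R", "perl": "U2R",
    "normal": "normal",
}

with open("KDDTest.csv") as src, open("KDDTest_categories.csv", "w", newline="") as dst:
    reader, writer = csv.reader(src), csv.writer(dst)
    for row in reader:
        label = row[-1].strip().rstrip(".")          # labels sometimes end with a dot
        row[-1] = CATEGORY.get(label, "unknown")     # fall back for labels not listed here
        writer.writerow(row)
```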
  • asked a question related to Computer Security and Reliability
Question
8 answers
This is related to homomorphic encryption. These three algorithms are used for additive and multiplicative homomorphism: RSA and ElGamal are multiplicative, and Paillier is additive. Now I want to know the time complexity of these algorithms.
Relevant answer
Answer
I want the encryption and decryption time complexity of the Paillier cryptosystem.
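Asymptotically, all three schemes are dominated by modular exponentiation, roughly O(k^3) bit operations for a k-bit modulus with schoolbook arithmetic; Paillier works modulo n^2, so its exponentiations are the most expensive. A rough, hedged timing sketch of just that dominant operation (random odd moduli stand in for real RSA/Paillier moduli):

```python
# Time the modular exponentiations that dominate RSA, ElGamal and Paillier.
import secrets, time

def time_modexp(bits, trials=20):
    n = secrets.randbits(bits) | 1            # stand-in modulus (not a real RSA modulus)
    n_sq = n * n                              # Paillier operates mod n^2
    total_n = total_n_sq = 0.0
    for _ in range(trials):
        m = secrets.randbelow(n)
        r = secrets.randbelow(n)
        t0 = time.perf_counter(); pow(m, n, n);    total_n    += time.perf_counter() - t0
        t0 = time.perf_counter(); pow(r, n, n_sq); total_n_sq += time.perf_counter() - t0
    return total_n / trials, total_n_sq / trials

for bits in (1024, 2048):
    t_n, t_n_sq = time_modexp(bits)
    print(f"{bits}-bit modulus: exp mod n ~ {t_n*1e3:.1f} ms, exp mod n^2 ~ {t_n_sq*1e3:.1f} ms")
```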
  • asked a question related to Computer Security and Reliability
Question
3 answers
How can I measure the "average received CNR (carrier-to-noise ratio)" used by Mohamed-Slim Alouini and Andrea Goldsmith in "Capacity of Nakagami multipath fading channels"? We know that the received CNR can be obtained as the transmitted CNR minus the channel loss. Can I calculate the average received CNR in the same way, i.e. average received CNR = average transmitted CNR - loss? In other words, do the averages at the transmitter and receiver obey the same relation as the instantaneous CNR values? I'd appreciate your answers.
Relevant answer
Answer
C/N (dB) = 20 log10(Vc / Vn)
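As a sanity check on the averaging question, a small numpy sketch (all parameters are illustrative): in a Nakagami-m channel the received CNR is the transmitted CNR times a Gamma-distributed power gain, and "average received = average transmitted - loss" in dB holds only if the averaging is done in the linear domain, not on the dB values themselves.

```python
# Average received CNR over Nakagami-m fading: linear-domain vs dB-domain averaging.
import numpy as np

rng = np.random.default_rng(0)
m, omega = 2.0, 1.0                         # Nakagami parameter and mean power gain (illustrative)
tx_cnr_db, path_loss_db = 30.0, 10.0        # transmitted CNR and fixed loss, in dB (illustrative)

gain = rng.gamma(shape=m, scale=omega / m, size=200_000)   # power gain of Nakagami-m fading
rx_cnr_lin = 10 ** ((tx_cnr_db - path_loss_db) / 10) * gain

avg_lin_db = 10 * np.log10(rx_cnr_lin.mean())      # average in linear domain, then convert
avg_db     = (10 * np.log10(rx_cnr_lin)).mean()    # average of the dB values (biased low)

print(f"tx CNR - loss      : {tx_cnr_db - path_loss_db:.2f} dB")
print(f"mean(linear) in dB : {avg_lin_db:.2f} dB")
print(f"mean of dB values  : {avg_db:.2f} dB")
```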
  • asked a question related to Computer Security and Reliability
Question
4 answers
I am currently working on an application-level IDPS (Intrusion Detection and Prevention System) for cloud computing which involves machine learning, so I am looking for a large amount of data. Please let me know if there is a dataset that you know of that I could use in my project. Thanks in advance!
  • asked a question related to Computer Security and Reliability
Question
5 answers
Can anyone please suggest a Java API, if one is available? I also want to ask whether the API available for CP-ABE (i.e. DET-ABE) can be used to implement KP-ABE.
Relevant answer
Answer
You can see this example of a KP-ABE toolkit in Java using jPBC, in
  • asked a question related to Computer Security and Reliability
Question
4 answers
What are the upcoming challenges for cloud forensics, what are the latest issues related to cloud forensics, and where can one find the material needed to address these issues?
Relevant answer
Answer
I think cloud forensics, and cloud security in general, have issues rooted to a large extent in the weaknesses of current human-user authentication credentials, which allow for numerous attack vectors.
This problem existed well before the cloud initiative was adopted, and it has been exacerbated by the proliferation of clouds.
Truly secure human authentication credentials are seemingly easy to design, but they usually fail to satisfy the criteria of being undetectable by attack vectors, non-reusable by intruders, cheap, simple to use, and easy to remember. Credential technology has to be subordinated to, or at least commensurate with, economic and usability requirements, which makes the problem extremely difficult to resolve, as those criteria tend to work against each other.
User authentication technology is not the clouds' only security problem, but it is a fundamental one in the security of clouds and many other domains.
  • asked a question related to Computer Security and Reliability
Question
4 answers
I'm looking for timing benchmarks for asymmetric pairings such as the Ate and optimal Ate pairings on BN, KSS and BLS curves. There are several pairing library implementations, e.g. PBC, TEPLA, MIRACL and RELIC, but I am looking for the fastest pairing implementation achieved to date, in any environment, for the standard parameters.
Parameters suggested before the new exTNFS algorithm proposed by Kim et al. at CRYPTO 2016.
Relevant answer
Answer
I believe the fastest software implementation ever published was our "Realm of the pairings" paper on SAC 2013, source code included on RELIC. The fastest hardware implementation appears to be Fan et al. using the RNS representation for field arithmetic.
  • asked a question related to Computer Security and Reliability
Question
3 answers
Do image formats like JPEG, GIF, TIFF, etc. have any inherent encryption algorithm? JPEG, for instance, is itself a compression technique.
Relevant answer
Answer
No, the image and video standards do not have built-in encryption, and compression is not encryption (it is easy to decompress and view the content).
Encryption of files/videos typically relies on a private shared key that prevents viewing the file, unlike compression, which is a public transformation.
See:
See some commercial sites:
  • asked a question related to Computer Security and Reliability
Question
5 answers
Dear all,
Please give me information about the performance analysis of ECC encryption and decryption for a given text data.
Relevant answer
Answer
Maybe the comparative performance analysis of ECC technology in the paper at https://www.researchgate.net/publication/310049571_Secure_Whitelisting_of_Instant_Messages might help you.
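For a quick hands-on measurement, a hedged sketch using the pyca/cryptography package: ECC is normally used for hybrid encryption (ECDH key agreement plus a symmetric cipher, ECIES-style), so the sketch times ephemeral ECDH + HKDF + AES-GCM over a text message. The curve, message size and iteration count are arbitrary choices, not prescriptions.

```python
# Rough timing of ECIES-style encryption/decryption of text data (pip install cryptography).
import os, time
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def derive_key(shared_secret: bytes) -> bytes:
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None, info=b"ecies-demo").derive(shared_secret)

recipient = ec.generate_private_key(ec.SECP256R1())
message = b"some text data " * 64
N = 200

t0 = time.perf_counter()
for _ in range(N):
    eph = ec.generate_private_key(ec.SECP256R1())            # fresh ephemeral key per message
    key = derive_key(eph.exchange(ec.ECDH(), recipient.public_key()))
    nonce = os.urandom(12)
    ct = AESGCM(key).encrypt(nonce, message, None)
enc_ms = (time.perf_counter() - t0) / N * 1e3

t0 = time.perf_counter()
for _ in range(N):
    key = derive_key(recipient.exchange(ec.ECDH(), eph.public_key()))
    AESGCM(key).decrypt(nonce, ct, None)
dec_ms = (time.perf_counter() - t0) / N * 1e3

print(f"encrypt ~ {enc_ms:.2f} ms/msg, decrypt ~ {dec_ms:.2f} ms/msg on this machine")
```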
  • asked a question related to Computer Security and Reliability
Question
3 answers
I am implementing a QSS-based signal, considering 10 sensors and 5 frames, where every frame consists of 512 QSS signals. After processing, I get a covariance matrix R of dimension 10x10 (as shown in the attachment), which becomes 100x1 after applying the vec function; Y is the final matrix. Applying SVD to Y gives me a 100x100 noise-space matrix. I want to apply MUSIC, but I have not been able to generate AB (shown at the end of the attachment) of dimension 100x1081 (where 1081 corresponds to range = -90:0.1:90). How can I generate AB using a vec- or Khatri-Rao-based method?
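A hedged numpy/scipy sketch of how the 100x1081 matrix can be built for a uniform linear array (half-wavelength spacing assumed): form the 10x1081 steering matrix A over the angle grid, then take the column-wise Khatri-Rao product of conj(A) with A, which is the array manifold that multiplies the source-power vector in vec(R)-based MUSIC.

```python
# Build AB = khatri_rao(conj(A), A) of size (M^2) x 1081 for vec(R)-based MUSIC.
import numpy as np
from scipy.linalg import khatri_rao

M = 10                                        # number of sensors
d_over_lambda = 0.5                           # element spacing in wavelengths (assumed)
angles_deg = np.arange(-90, 90.1, 0.1)        # 1081 grid points
angles_rad = np.deg2rad(angles_deg)

# Steering matrix A: each column is a(theta) for one grid angle, shape (10, 1081).
m_idx = np.arange(M).reshape(-1, 1)
A = np.exp(-2j * np.pi * d_over_lambda * m_idx * np.sin(angles_rad))

AB = khatri_rao(A.conj(), A)                  # shape (100, 1081)
print(A.shape, AB.shape)                      # (10, 1081) (100, 1081)

# MUSIC pseudo-spectrum over the grid, given a noise-subspace matrix En of shape (100, K):
# P(theta_i) = 1 / || En.conj().T @ AB[:, i] ||^2 for each column i.
```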
  • asked a question related to Computer Security and Reliability
Question
5 answers
How can symmetric encryption be applied to cloud security and big data security?
Relevant answer
Answer
The cloud is mainly used to store large amounts of data and to run analyses on the stored data. Securing data and information in transit over an insecure medium like the Internet is critical for every application. Due to the volume of data, it is too time-consuming to use asymmetric encryption for the communication itself; symmetric encryption is much faster than asymmetric. But both parties in the communication must know the symmetric key, so we use asymmetric encryption to securely transfer the symmetric key, which is then used to encrypt the subsequent communication.
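A minimal sketch of that hybrid pattern using the pyca/cryptography package (RSA-OAEP only wraps the random AES key; AES-GCM does the bulk encryption). The key sizes and data are illustrative:

```python
# Hybrid encryption: asymmetric key transport + symmetric bulk encryption.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Receiver's long-term key pair (in practice the public key comes from a certificate).
receiver_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# Sender: encrypt the bulk data with a fresh AES key, then wrap that key with RSA.
data = b"big data blob destined for cloud storage" * 1000
aes_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(aes_key).encrypt(nonce, data, None)
wrapped_key = receiver_key.public_key().encrypt(aes_key, oaep)

# Receiver: unwrap the AES key with RSA, then decrypt the bulk data with AES.
recovered_key = receiver_key.decrypt(wrapped_key, oaep)
assert AESGCM(recovered_key).decrypt(nonce, ciphertext, None) == data
```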
  • asked a question related to Computer Security and Reliability
Question
3 answers
Are there open datasets available for covert channel experiments? Or, otherwise, are there tools to simulate and capture the data?
Relevant answer
Answer
Rajamenakhsi, Rabat, Muneer, yes, we can collect packet traces using tools such as Wireshark, GNS3 and many other simulators, but is it meaningful to collect such traces for traffic management? If we collect traffic traces, they will only reflect that environment and that time period; e.g. traces collected at a university during regular class time, vacation and exam time will differ significantly. Please suggest some ideas for real-time traffic management.
  • asked a question related to Computer Security and Reliability
Question
4 answers
I need a cloud security lab to do experiments using the XACML framework and an extended XACML framework, and to carry out a case study.
Relevant answer
Answer
Dear Sara,
I am not an expert in security, but you can contact Dr Rajkumar Buyya from the University of Melbourne; this university has a cloud lab. If you have any doubt about how to reach him, kindly follow http://www.buyya.com/. He will definitely guide you.
  • asked a question related to Computer Security and Reliability
Question
4 answers
What policies can we use to stop one or more service providers from assembling the user attributes sent to them to provision resources for the users, and thereby prevent them from causing harm?
Relevant answer
Answer
Another approach is to obscure key user data when it is sent to the cloud provider, so that even if the provider holds the data, it cannot link it back to the user. The Bluecoat Cloud Protection Gateway (link to data sheet below) substitutes randomly generated tokens for key pieces of user data when users interact with cloud SaaS service providers. For example, the name Jeff might be replaced by abc123 when stored at a service provider, but when abc123 comes back (usually through a web page), it gets translated back to Jeff. This is done transparently to the end user.
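A toy sketch of that tokenization idea: the token vault stays on-premises and the provider only ever sees the tokens. The in-memory dictionary here stands in for whatever secure store a real gateway would use.

```python
# Tokenization gateway sketch: replace sensitive attributes with random tokens before
# they reach the service provider, and translate them back on the way out.
import secrets

class TokenVault:
    def __init__(self):
        self._to_token, self._to_value = {}, {}

    def tokenize(self, value: str) -> str:
        if value not in self._to_token:
            token = "tok_" + secrets.token_hex(8)
            self._to_token[value] = token
            self._to_value[token] = value
        return self._to_token[value]

    def detokenize(self, token: str) -> str:
        return self._to_value.get(token, token)   # pass through anything we did not issue

vault = TokenVault()
outbound = {"name": vault.tokenize("Jeff"), "email": vault.tokenize("jeff@example.com")}
print(outbound)                                   # what the SaaS provider stores
inbound = {k: vault.detokenize(v) for k, v in outbound.items()}
print(inbound)                                    # what the end user sees again
```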
  • asked a question related to Computer Security and Reliability
Question
5 answers
I aim to study the behaviour of attacks against an SSH honeypot server. Is there any simulation setup or tool to carry out this experiment?
Relevant answer
Answer
Hi, it depends on the honeypot you are using. In general, if you are on a Linux platform, you can analyse the server logs (secure.log, message.log) and the output of simple tools such as last and lastb. The analysis is not difficult with some basic sh/bash or Python scripting skills.
There is also the possibility of capturing all SSH traffic and storing it in PCAP format (tcpdump on Linux) for further analysis (Wireshark).
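A small, hedged illustration of that log analysis, counting failed SSH login attempts per source IP; the standard OpenSSH/syslog "Failed password" lines and path are assumed, so adjust both to whatever your honeypot actually writes:

```python
# Count failed SSH logins per source IP from an auth log (e.g. /var/log/auth.log or /var/log/secure).
import re
from collections import Counter

FAILED = re.compile(r"Failed password for (?:invalid user )?(\S+) from (\d{1,3}(?:\.\d{1,3}){3})")

attempts = Counter()
with open("/var/log/auth.log", errors="ignore") as log:
    for line in log:
        m = FAILED.search(line)
        if m:
            user, ip = m.groups()
            attempts[ip] += 1

for ip, count in attempts.most_common(10):
    print(f"{ip:15s} {count} failed attempts")
```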
  • asked a question related to Computer Security and Reliability
Question
5 answers
Consider the scenario where we have a lookup table (LUT) that takes a desired value as input and outputs the corresponding matched value from the LUT. We don't want anyone to look into the LUT, so it is embedded into an executable file [.exe] (from a MATLAB standalone application, a Visual Studio executable, or any other development kit), which can be regarded as a black box.
How can I prove its security in terms of an adversary querying this .exe file with inputs and observing the responses? The .exe file outputs a result if the input exists in the LUT and the corresponding entry is matched.
Actually, I am not good at defining security proofs or games. I have searched many papers on this to get an idea, but all in vain. Moreover, I even doubt whether I can proceed with the concept of encrypting/hiding the LUT by turning it into an .exe file.
Your sincere comments will be appreciated.
Relevant answer
Answer
@ Rivalino, Nils Ulltveit-Moe, Peter Breuer
Sir, let me make things clearer.
Suppose a person writes C/C++ code and does not want people to see its contents. He proceeds by building an .exe (executable) file and gives it to others. Clients can run it only by querying it and getting output, without learning anything else about the underlying code.
When querying, a client can only input a digital signature (e.g. RSA) on his identity, signed by some trusted authority's private key. The signature is verified with the authority's public key embedded in the .exe file. If the signature verifies, the output is returned.
So the restriction is that a client can only input a digital signature (DS) on his identity to the .exe file, and he will get the output only if the DS verifies. For inputs other than the DS, no output is returned. Querying even 100 times with the same DS will always return the same output.
My question is: are .exe files (standalone files) secure enough for this purpose or not? What about query-based attacks, or any other sort of attack on this .exe, that would make it unsuitable in this scenario?
Moreover, if I am going in the wrong direction, what are your suggestions?
Obviously we can't claim anything is secure forever; even if it is proven secure today, it may not be secure tomorrow.
  • asked a question related to Computer Security and Reliability
Question
7 answers
I want to classify data based on their importance and secure them based on that classification. My question: is there a difference between data importance and security level? To me, the more important data become, the higher the security level they need, so the relationship between data importance and security level is a direct one. Is there a way that data could be important but not need to be secured? In other words, can data importance and security level be separated from each other in the security context, e.g. very important but insecure? Please note that I am talking about the concept, not the technical side. If the answer is yes, can you please give an example?
Thank you for your cooperation.
Relevant answer
Answer
Hi Sultan,
I think in this case you need to differentiate between different security features, for example encryption, integrity and authentication. Some data might be very important, but that does not necessarily mean it should be encrypted; instead, it may need another security feature: authentication.
Imagine your WSN is measuring temperature and sending this information to another entity. The temperature information is very important, but in most cases it does not need to be encrypted (there is no problem if someone reads the temperature data from your network). However, it is very important that the data is not modified on its way, i.e. the integrity of the data matters. It is also very important to know where the temperature measurement comes from, i.e. the data needs to be authenticated.
To summarise, if you consider 'secrecy' as the measure of security of your data, the temperature will be important but can be insecure.
Hope it helped.
Jhordany
  • asked a question related to Computer Security and Reliability
Question
5 answers
Salem,
In a wrapper feature selection algorithm, we use the accuracy rate to evaluate candidate feature subsets during the search process.
How can we show that a wrapper feature selection algorithm is not overfitting?
Can we use the final selected feature subset to calculate the accuracy on the training set and on the validation set, and then compare the two accuracies, deciding about overfitting according to how close they are?
Thanks.
Relevant answer
Answer
Thanks for your help.
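A hedged scikit-learn sketch of the check described in the question: run the wrapper selection on the training part only, then compare accuracy on the training folds against a held-out validation set; a large gap is a symptom of overfitting. The estimator, synthetic dataset and feature count are placeholders.

```python
# Compare training vs held-out accuracy of a wrapper-selected feature subset.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=800, n_features=40, n_informative=8, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

est = DecisionTreeClassifier(random_state=0)
selector = SequentialFeatureSelector(est, n_features_to_select=8, direction="forward",
                                     scoring="accuracy", cv=5).fit(X_tr, y_tr)

X_tr_sel, X_val_sel = selector.transform(X_tr), selector.transform(X_val)
train_acc = cross_val_score(est, X_tr_sel, y_tr, cv=5, scoring="accuracy").mean()
val_acc = est.fit(X_tr_sel, y_tr).score(X_val_sel, y_val)

print(f"CV accuracy on training data : {train_acc:.3f}")
print(f"Accuracy on held-out data    : {val_acc:.3f}")   # a big gap suggests overfitting
```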
  • asked a question related to Computer Security and Reliability
Question
6 answers
I want to write a module for fault injection in CloudSim, generating failures according to a statistical distribution. My question is: how can I make a resource fail? By resources I mean CPU, RAM, network, VM, VMM, etc.
I want to do this in order to measure the failure rate and other reliability metrics of each resource.
Thank you in advance, 
Relevant answer
Answer
It is very simple to make a cloud or virtual machine faulty. You need to go through the CloudSim tool and read the FaultTolerantExample.java file in depth, and you can easily find your answer. I am myself still working on an algorithm for fault tolerance.
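Independently of CloudSim's own APIs, the "generate failures according to a statistical distribution" part can be sketched as below: draw failure and repair times for each resource from exponential (or Weibull) distributions and feed the resulting events into whatever injection hook the simulator offers. The MTBF/MTTR numbers are illustrative only.

```python
# Generate failure/repair event times per resource from statistical distributions.
import numpy as np

rng = np.random.default_rng(42)
resources = {"cpu": 500.0, "ram": 800.0, "network": 300.0, "vm": 200.0}   # MTBF in hours (illustrative)
mttr_hours = 2.0
horizon = 10_000.0

events = []
for name, mtbf in resources.items():
    t = 0.0
    while True:
        t += rng.exponential(mtbf)          # time to next failure; rng.weibull(k) * scale also works
        if t > horizon:
            break
        repair = rng.exponential(mttr_hours)
        events.append((t, name, repair))
        t += repair

events.sort()
failures_per_resource = {name: sum(1 for _, n, _ in events if n == name) for name in resources}
print(failures_per_resource)                 # empirical failure counts -> failure rate per resource
```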
  • asked a question related to Computer Security and Reliability
Question
3 answers
I need to know about any relevant papers or research carried out in the field of cryptography/security on an encrypted, non-tamperable lookup table. The table can be considered a black box: given an input, it returns the output corresponding to the mapped entry. In the security game, the black box is accessible to an adversary who can make queries of its choice.
I have searched and worked out many possible solutions, but all in vain; none of them provides both security and functionality.
In MATLAB and Visual C++ we can build a standalone executable that, given an input, returns an output without anyone knowing what is embedded in the black box. Anyone can query this black box without knowing the inside functionality. But I have read on the developer forums that executables created this way are not secure: anyone can break them, and moreover we cannot prove the security of such standalone .exe files.
I have read papers on predicate/searchable encryption for building an encrypted lookup table and then querying it with tokens, but this concept does not work under all conditions and the computations involved are too heavy.
Has anyone come across papers addressing this issue, or a generic approach to tackle it, in case I have missed something?
Relevant answer
Answer
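As a hedged illustration of the simplest "encrypted dictionary" flavour of the idea in the question: index the table by a keyed PRF (HMAC) of the lookup key and encrypt each value with AES-GCM, so whoever stores the table learns neither keys nor values, while anyone holding the secret key can still query it. Like the searchable-encryption schemes mentioned above, this hides content but still leaks access patterns; it is a sketch, not a proven construction.

```python
# Keyed, encrypted lookup table: HMAC-indexed keys, AES-GCM-encrypted values.
import hmac, hashlib, os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

class EncryptedLUT:
    def __init__(self, secret: bytes):
        self._mac_key = hmac.new(secret, b"index", hashlib.sha256).digest()
        self._enc_key = hmac.new(secret, b"values", hashlib.sha256).digest()
        self._table = {}

    def _index(self, key: str) -> bytes:
        return hmac.new(self._mac_key, key.encode(), hashlib.sha256).digest()

    def put(self, key: str, value: str) -> None:
        nonce = os.urandom(12)
        ct = AESGCM(self._enc_key).encrypt(nonce, value.encode(), None)
        self._table[self._index(key)] = nonce + ct

    def get(self, key: str):
        blob = self._table.get(self._index(key))
        if blob is None:
            return None
        return AESGCM(self._enc_key).decrypt(blob[:12], blob[12:], None).decode()

lut = EncryptedLUT(os.urandom(32))
lut.put("serial-001", "mapped value A")
print(lut.get("serial-001"), lut.get("serial-999"))   # 'mapped value A' None
```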
  • asked a question related to Computer Security and Reliability
Question
3 answers
I followed this link:
I can print the x, y positions to the file as in the link, but when I added the code to print energy to the file I got a segmentation fault error.
How should I overcome this problem?
with best regards
Relevant answer
Answer
Thanks so much, but I got the same error. There is a difference: I also use the CRCN patch and need to print energy in this scenario.
Best regards.
  • asked a question related to Computer Security and Reliability
Question
9 answers
I'm looking for big data generated by an IoT system, to analyse with Hadoop. I tried to find sample IoT big data online, or an API that would allow me to retrieve IoT data, but I could not find any. If you know of any, please let me know. Otherwise, I think I am left with only one option, which is to simulate an IoT system and let it run to create dummy data.
Thank you anyway for your help
Relevant answer
Answer
Sensor Cloud
Dear, check this link:
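If simulation turns out to be the only option, a minimal generator like the sketch below can produce arbitrarily large JSON-lines files of synthetic sensor readings that Hadoop/Hive/Spark can ingest directly; the device count, fields and rates are all made up.

```python
# Generate synthetic IoT sensor readings as JSON lines for Hadoop-style processing.
import json, random, time

def generate(path: str, n_devices: int = 100, n_records: int = 1_000_000) -> None:
    rng = random.Random(7)
    start = time.time()
    with open(path, "w") as out:
        for i in range(n_records):
            record = {
                "device_id": f"sensor-{rng.randrange(n_devices):04d}",
                "timestamp": round(start + i * 0.01, 3),
                "temperature_c": round(rng.gauss(22.0, 3.0), 2),
                "humidity_pct": round(rng.uniform(20.0, 90.0), 1),
                "battery_v": round(rng.uniform(3.0, 4.2), 2),
            }
            out.write(json.dumps(record) + "\n")

generate("iot_dummy.jsonl", n_records=100_000)   # scale n_records up for 'big' data
```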
  • asked a question related to Computer Security and Reliability
Question
5 answers
What kind of features do I need to consider in Windows API calls to help detect malware behaviour using data mining techniques?
And how can that data be collected?
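Commonly used features are the API call names themselves, their n-gram sequences, call frequencies, and the DLL or functional category each call belongs to, typically collected by running samples in a sandbox or with an API-hooking tool and logging the call trace. A hedged sketch of turning logged call sequences into n-gram features for a classifier (the traces and labels below are invented toy data):

```python
# Turn Windows API call traces into n-gram count features for malware classification.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

traces = [  # each trace: space-separated API call sequence captured in a sandbox (made up here)
    "CreateFileW WriteFile RegSetValueExW CreateRemoteThread",
    "CreateFileW ReadFile CloseHandle",
    "VirtualAllocEx WriteProcessMemory CreateRemoteThread",
    "RegQueryValueExW ReadFile CloseHandle",
]
labels = [1, 0, 1, 0]   # 1 = malicious, 0 = benign (toy labels)

vectorizer = CountVectorizer(ngram_range=(1, 2), token_pattern=r"\S+", lowercase=False)
X = vectorizer.fit_transform(traces)

clf = LogisticRegression(max_iter=1000).fit(X, labels)
new_trace = ["VirtualAllocEx WriteProcessMemory CreateRemoteThread CloseHandle"]
print(clf.predict(vectorizer.transform(new_trace)))   # predicted label for the new trace
```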
  • asked a question related to Computer Security and Reliability
Question
3 answers
The experimental realization of quantum teleportation technologies has attracted a lot of interest from different fields in applied science. However, there are different physical implementations of these teleportation technologies, i.e. solid state, photonic qubits, trapped ions, etc., each with its own mechanisms, advantages and disadvantages.
Now the question arises: are there any generic frameworks that can be used as the basis for implementing quantum teleportation infrastructures?
One of the greatest attractions of teleportation is the prospect of secure communication, improved transmission speed, etc.
How would someone go about developing such a generic framework, and on what basis?
Relevant answer
Answer
Dear Themba.
Quantum teleportation allows a sender to share information with a receiver by encoding it into a quantum state that both parties can access.
This link may be useful for your work :
  • asked a question related to Computer Security and Reliability
Question
8 answers
I am working on a cloud intrusion detection system. In the beginning I used the KDD Cup DARPA dataset, but the journals I am reading nowadays use the CIDD dataset, and I am not able to find that dataset anywhere. Kindly help me to get the CIDD dataset.
Relevant answer
Answer
KDD99 should not be used for any real research; it is heavily skewed data. Try more realistic sources of data like www.secrepo.com and https://stratosphereips.org/category/dataset.html
  • asked a question related to Computer Security and Reliability
Question
3 answers
Hello to everyone and thank you for your time,
How should I count the complexity of a (soft) real-time steganography algorithm? I need a small complexity analysis. What should I include? No tables necessary.
Thank you again in advance
Relevant answer
Answer
Dear Nikolaos,
There are many books on computational complexity; I'm not sure exactly what you want.
As far as I remember, you have to traverse the M x N bytes of the image, doing some bit manipulation. Assuming bit manipulation can be done in constant time, encoding is then O(M·N), i.e. quadratic in the side length of a square image.
A similar analysis can be done for decoding (extracting) the message.
Is this what you want?
David
  • asked a question related to Computer Security and Reliability
Question
3 answers
Medical records security
Relevant answer
Answer
Electronic medical records are sensitive pieces of information that require extensive and strong protection. An example of a bankrupt hospital may shed some light on the matter. A cancer hospital treating patients at all stages filed for bankruptcy and was taken over by a lending bank. After the takeover was complete, the bank officials scrutinized the details of all patients who were in the last stages of cancer, had extremely low chances of survival, and had pending loans with the same bank. The loan accounts of these patients were closed and the loan amounts forfeited. As a solution, one could use steganographic protocols to hide patient medical records (which are primarily text) within medical report images. In that case, only an authorized doctor or medical staff member has access to the records; to everyone else, the very existence of the records remains concealed.
Hope this helps a bit
  • asked a question related to Computer Security and Reliability
Question
8 answers
I am working on searching encrypted data in cloud computing and I want to combine attribute-based encryption with searchable encryption. I am looking for a Java library implementing ciphertext-policy attribute-based encryption (CP-ABE).
Relevant answer
  • asked a question related to Computer Security and Reliability
Question
10 answers
Is the DES algorithm still used in banking/financial transaction systems (in spite of the weak-key issue)? Or has the ECC algorithm replaced DES? I believed ECC might be in use, but I came to know that it is still DES. Can someone clarify which algorithms are actually in use, and why?
Relevant answer
Answer
Don't mix ECC and DES they are two completely different things. DES, as well as AES (the new standard) are symmetric algorithms this means that there is only one key which is used for both encryption and decryption (like your door). On the other hand, you have public key encryption where you have two keys, one for encryption (public) and a different one, for decryption. Such algorithms are RSA and ECC. 
The most common set for transactions (not only financial) is AES, RSA and SHA (for hashing).
  • asked a question related to Computer Security and Reliability
Question
5 answers
Given a set of twenty-three (23) attributes:
userID, email, common name, surname, givenName, Global username, displayName, Organization Name, Organizational Unit, home organization, institution type, affiliation type, affliationTypeScope, PrimaryAffiliation, Entitlement, PersonAssurance, TargetId, Legalname, sharedToken, AffiliationType, Postal Address, TelephoneNumber, MobileNumber.
How many of these attributes are sufficient to identify an individual in a cloud environment?
Relevant answer
Answer
@Maria Abur
If there is a userID, then it is supposed to be unique and should be sufficient to identify any user, shouldn't it?
Do you want to know what other alternate keys / key combinations exist?
  • asked a question related to Computer Security and Reliability
Question
6 answers
I am wondering whether asymmetric encryption such as elliptic curve cryptography adds extra bits to the packets during transmission. I know it adds many bits, perhaps at least 1024 bits, to deliver the keys, but what about after that? I mean, what about the rest of the packets in normal communication after the keys are delivered; do they also have to carry extra bits?
Thanks guys for your participation.
Relevant answer
Answer
Good question. I refer you to this article, which describes the asymmetric key approach, and what computations are necessary:
You will note, in 15.1.3, that the encryption step involves exponentiation, which is relatively more CPU intensive than the XOR commonly used in symmetric-key encryption.
So, yes, asymmetric key is more computationally intensive than symmetric key. And furthermore, symmetric key block ciphers are "less efficient" than symmetric key stream ciphers. If for no other reason, you need to create blocks, which often involves padding the plaintext. With stream ciphers, the plaintext and the ciphertext can be identical in length (for instance, if you use XOR as one would typically do).
You will read many times in the literature something that makes a lot of sense intuitively. If you want to send encrypted data in bulk, and as fast as possible, your best bet is to use symmetric key encryption (and even better, a stream cipher). In order to distribute the necessary SECRET keys, for the symmetric key protocol, you would use an asymmetric key protocol. But this is needed infrequently, as a key exchange protocol. Let's say, once a day. Something of this nature. The frequency will depend on the update rate of those encrypted messages and the encryption algorithm.
In both methods, there will be some overhead involved in distributing the public key or the private key, depending whether you're using asymmetric or symmetric key. But if we are talking about bulk data encryption, that overhead would be minimal, in the greater scheme of things.
  • asked a question related to Computer Security and Reliability
Question
14 answers
How much power does an encrypted packet consume compared to an unencrypted packet?
I know it depends on the encryption algorithm, but you may mention the type of encryption alongside the power consumption. What I would like to know is both the power consumed by an unencrypted packet and by an encrypted packet, so I can make a comparison.
I went through many papers to figure this out, but they just mention the power consumed by encryption without mentioning how much an unencrypted packet consumes.
I need to know whether encryption, either symmetric or asymmetric, adds significant overhead to the overall consumption, and perhaps the only way is to compare the power consumed with and without encryption in numbers.
Thank you so much guys for your cooperation in advance.
Relevant answer
Answer
I think this varies greatly, depending on the type of crypto algorithm and whether it's done in hardware or software. This is an interesting piece:
Note the part about how the crypto key was reconstructed for RSA by measuring power requirements (a so-called "side-channel attack").
Symmetric key stream ciphers should be the least power hungry of all, and they also do not increase the length of the message. My bet is that one can devise a symmetric key stream cipher that would require a negligible amount of extra power. I haven't actually tried measuring this, but it should be straightforward enough to run your own experiment.
So, one technique you can use to save computing time and energy, use an asymmetric key cipher to transmit the secret key, then use a symmetric key cipher for the messaging itself.
A long post, considering I didn't answer the question.
  • asked a question related to Computer Security and Reliability
Question
3 answers
  1. nabeel
Relevant answer
Answer
Provable data possession (PDP) is an algorithm for proving data integrity in cloud computing.
  • asked a question related to Computer Security and Reliability
Question
4 answers
Is PCA a suitable mechanism for fingerprint recognition?
Is a minutiae-based mechanism preferred over PCA for fingerprint recognition?
Relevant answer
Answer
Did you mean 2D DWT along with PCA is more efficient??
  • asked a question related to Computer Security and Reliability
Question
5 answers
I am working in NS2 and I want to do encryption and decryption in NS2. How can this be implemented?
Relevant answer
Answer
Other question: ns2 is a simulator, why would you like to encrypt traffic in the simulation? Perhaps a clever simulation setup can avoid the extra overhead of encryption and still keep the simulation results valid.
  • asked a question related to Computer Security and Reliability
Question
6 answers
Relevant answer
Answer
It usually takes at least two pieces of PII (excluding a social security number, drivers license number, etc) to identify someone.
  • asked a question related to Computer Security and Reliability
Question
3 answers
A symmetric-key cipher (block or stream cipher) providing partial decryption of data at certain points: although the entire data is not decrypted, the data decrypted up to a specific point should be viewable or accessible.
The partial decryption keys can be derived from the main symmetric key so that partial decryptions succeed.
Can anyone refer me to good material or papers in this area?
Relevant answer
Answer
Symmetric-key decryption needs the key; once you have the key, you can decrypt.
Now, you are asking about local (partial) decryption. For this you need to derive a "chunk" key and reveal only that key. You can start from a master key and derive a binary tree of keys, E(k,0) and E(k,1) for the left and right child keys, and keep doing so to build a tree of keys; at the leaves are the chunk keys with which you encrypt/decrypt the data. You can always derive a specific chunk key and use it to decrypt that chunk (and nothing else); the other keys are pseudorandomly independent.
I am not sure if you are after something like this, but it is one possible solution to partial decryption with a partial key only. Note that chunk-key derivation is logarithmic in the number of chunks. I assume that you can delegate the chunk keys or even a complete subtree key (for a sequence of consecutive chunks).
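A hedged sketch of the binary key tree described above, with HMAC-SHA256 as the pseudorandom function: the chunk key for index i is derived by walking the bits of i down from the master key, so handing out one leaf key (or an inner-node key for a contiguous subtree) enables decryption of only that part of the data.

```python
# GGM-style binary key tree: derive per-chunk keys from a master key with HMAC as the PRF.
import hmac, hashlib, os

def prf(key: bytes, bit: int) -> bytes:
    return hmac.new(key, bytes([bit]), hashlib.sha256).digest()

def chunk_key(master: bytes, index: int, depth: int) -> bytes:
    """Derive the leaf key for `index` in a tree of 2**depth chunks (O(depth) PRF calls)."""
    key = master
    for level in reversed(range(depth)):
        key = prf(key, (index >> level) & 1)
    return key

master = os.urandom(32)
depth = 10                                   # 1024 chunks
k_5 = chunk_key(master, 5, depth)            # hand this out to allow reading chunk 5 only
k_6 = chunk_key(master, 6, depth)
assert k_5 != k_6                            # sibling keys look independent to the holder of one

# An inner-node key covers a run of consecutive chunks: from it, only the leaves
# below that node can be derived, never the master key or the other subtrees.
```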
  • asked a question related to Computer Security and Reliability
Question
4 answers
Is there any good paper or research where data is encrypted with a single key by one owner (user) and there are two or more decryption keys? One key can decrypt the entire data, while the other keys decrypt only a portion of the data. A person with no key cannot decrypt the data, but those holding keys can fully or partially decrypt it depending on their keys.
Relevant answer
Answer
Good question. Dr. Peter has shown a good method for your problem, and Dr. Martin has done the same. But if I am not wrong, you need a three-key system. We have private-key and public-key systems; however, you need another system that goes beyond a public-key system: if the same encrypted data is decrypted with one key you get one answer, and if you decrypt it with a third key the result is different, yet both results are valid, one complete and the other partial. Such a system does not exist to the best of my knowledge; you can approximate it with the methods given above. To construct such a system, start with RSA and modify its steps; you may find interesting results.
Good luck
  • asked a question related to Computer Security and Reliability
Question
3 answers
Hello,
I am a research student at the University of Kent, studying usability and performance assessment in Automated Border Control systems. As part of my research, I am hoping to investigate passport interactions and specifically how a user interacts with a passport reader. I am hoping to record the interactions with a program that utilises Microsoft's Kinect for skeletal tracking, analysing how a user positions them self for the presentation.
I am outreaching to institutions that may be able to lend us a passport reader for a period while I collect some data. Ideally, the reader would be able to read RFID chips and would be able to secure the entire document (see http://passportscanner.co.uk/product/brands/suprema-realpass-v/ for example) Please get in touch if you know anyone who can assist.
Many thanks, 
Josh Robertson
Relevant answer
Answer
Yes, with the RFID chip embedded in newer passports, I would expect that a passport reader of the immediate future should use the same technology as a credit card reader, no? And there's more.
When I recently had to renew my passport, I had the option, for a few extra $, of also receiving something called a "passport card." So, why not? Send me one!
It's only usable in a handful of border crossings, so far at least, but the card looks much like a credit card, with the chip, although the graphic design is much like the old fashioned passport picture page.
Which leads to wonder, why are we bothering with paper passports anymore? Whatever information is needed, including your individual visas and your past border crossings, is far better stored "in the cloud," than in a paper booklet, I would think.
A verbose response to say, in my opinion, rather than spending a lot of effort to scan and read/decode paper passports, it might be more fruitful to migrate to an electronic system.
  • asked a question related to Computer Security and Reliability
Question
16 answers
I want to work on cloud computing as part of my research. I am seeking advice on research areas in searchable encryption in cloud computing.
Relevant answer
Answer
I have written an open-source library, but the scheme is pretty old; feel free to check it out.
For a better understanding of the current schemes, Seny Kamara writes excellent blogs on this topic; they are very good tutorials.
  • asked a question related to Computer Security and Reliability
Question
3 answers
Is there any scope of research in this field?
Relevant answer
Answer
Hi Navjyotsinh,
This research field is active, and has been for many years. It was originally proposed by Goldreich and Ostrovsky many years before the cloud came about. With modern hardware in HPC settings, it offers some interesting performance gains and good obfuscation of what is going on in the system. I have added a list of a few papers from my research collection to get you started.
Regards
Bob
enc
  • asked a question related to Computer Security and Reliability
Question
3 answers
if possible, please point me to any published research papers on the subject
Relevant answer
Answer
Accidental configuration errors (non-malicious insider attacks) are easier to protect against, although if you have a mechanism to protect against the malicious attack, you've solved both problems.
The best example I can come up with on the spur of the moment would be to protect routing protocol messages, in an IP network.
Take a look at RFC 2453, for Routing Information Protocol (RIP) Version 2.
Section 4.1. describes a simple authentication scheme. In the packet overhead, one includes a plaintext, 16-byte password. This is a simple defense against misconfiguration. A router would check the password of any routing message, to verify the message comes from the router identified in the layer 3 header.
Next, take a look at RFC 4822.
Now you have a variable-length cryptographic authentication header, where the length depends on the algorithm used. A secret key is needed to create and then to verify the authentication header. So this defends not only against simple configuration errors, but also against a deliberate routing-protocol spoofing attack designed to disrupt network operation.
  • asked a question related to Computer Security and Reliability
Question
8 answers
I have a new authentication method that asks the user to log in based on some features.
How can I calculate the FP and FN rates, and is there any tool that can help?
Relevant answer
Answer
I have some experience testing biometric authentication products. The risk is always that one jumps too early into ad hoc testing before an experiment design is made. So I agree with the previous answers. make sure the use case is clear and that there are acceptance/success criteria *before* you start testing.
Next to that: the terms to use for authentication are False Accept and False Reject. Those give the False accept rate (FAR) and the False reject rate (FRR). But they are of course the same as FP and FN.
Normally, authentication methods that need to be tested, like biometric systems, produce a score (a trust score) for every authentication attempt. One aspect of a biometric system is that you set a threshold depending on the functional requirements, but that is the actual application. To measure the performance of an authenticator regardless of the threshold setting, you need to create the FAR and FRR curves, or better, the detection error trade-off (DET) curve.
There is no dedicated tool, except for R, SPSS and Python :-)
About significance (and how many tests you need) is discussed a bit in "Common Criteria - Common Methodology for Information Technology Security Evaluation - Biometric Evaluation Methodology Supplement [BEM]" from British government ( http://www.cesg.gov.uk, site has maintenance issues at the moment of writing)
good luck
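A small numpy sketch of computing FAR/FRR across thresholds from two score arrays (genuine attempts and impostor attempts), which is all that is needed to plot a DET curve or pick an operating point; the scores below are synthetic stand-ins for your system's trust scores.

```python
# FAR/FRR versus decision threshold from genuine and impostor score samples.
import numpy as np

rng = np.random.default_rng(1)
genuine = rng.normal(0.75, 0.10, 2_000)     # scores for legitimate users (synthetic)
impostor = rng.normal(0.45, 0.12, 2_000)    # scores for impostors (synthetic)

thresholds = np.linspace(0.0, 1.0, 201)
far = np.array([(impostor >= t).mean() for t in thresholds])   # false accepts / impostor attempts
frr = np.array([(genuine < t).mean() for t in thresholds])     # false rejects / genuine attempts

eer_idx = np.argmin(np.abs(far - frr))       # rough equal-error-rate estimate
print(f"EER ~ {((far[eer_idx] + frr[eer_idx]) / 2):.3f} at threshold {thresholds[eer_idx]:.2f}")
# Plotting FAR against FRR (probit/log axes in the strict DET definition) gives the trade-off curve.
```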
  • asked a question related to Computer Security and Reliability
Question
11 answers
Dear all,
I proposed a security framework, but I want to determine exactly which model is best for assessing this framework.
Relevant answer
Answer
I suggest that you act as an ethical hacker and conduct penetration tests against all instantiated objects and applications of your framework.
  • asked a question related to Computer Security and Reliability
Question
4 answers
I am trying to implement video watermarking based on visual cryptography. I have created two shares of an image in '.bmp' format. How can these two shares be embedded in the video in the frequency domain?
I have divided the video into parts (suppose the video consists of 300 frames; each part then has 150 frames) in order to embed the shares in two different parts. I have extracted the Y component of each part of the video, applied the DWT, extracted the HH band, applied SVD, and obtained the singular-value matrix.
How should I process the two shares (transparencies) in order to embed them into the singular values obtained from these two parts of the video, as indicated above? Could you also please describe the extraction process of the two transparencies from the watermarked video?
You are most welcome to tell me about any other method to implement visual-cryptography-based video watermarking.
Relevant answer
Answer
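A hedged single-frame sketch of the usual DWT + SVD embedding and extraction steps with PyWavelets and numpy: the share's singular values are added into the HH band's singular values with a strength factor, and extraction inverts the same step using side information kept by the embedder (a non-blind scheme). Frame selection, Y-channel extraction and the visual-cryptography share handling from the question are left out; the strength factor and sizes are arbitrary.

```python
# Non-blind DWT + SVD watermarking of one frame: embed a share's singular values
# into the HH band's singular values, keeping (S, U_s, V_s) as extraction side information.
import numpy as np
import pywt

ALPHA = 0.05                                  # embedding strength (arbitrary choice)

def embed(frame_y, share):
    LL, (LH, HL, HH) = pywt.dwt2(frame_y.astype(float), "haar")
    U, S, Vt = np.linalg.svd(HH, full_matrices=False)
    Us, Ss, Vts = np.linalg.svd(share.astype(float), full_matrices=False)
    k = min(len(S), len(Ss))
    S_marked = S.copy()
    S_marked[:k] += ALPHA * Ss[:k]            # additive embedding into singular values
    HH_marked = U @ np.diag(S_marked) @ Vt
    marked = pywt.idwt2((LL, (LH, HL, HH_marked)), "haar")
    return marked, (S, Us, Vts)               # side info needed by the extractor

def extract(marked_y, side_info):
    S, Us, Vts = side_info
    _, (_, _, HH) = pywt.dwt2(marked_y.astype(float), "haar")
    S_marked = np.linalg.svd(HH, compute_uv=False)
    k = min(len(S), Us.shape[1])
    Ss_rec = (S_marked[:k] - S[:k]) / ALPHA
    return Us[:, :k] @ np.diag(Ss_rec) @ Vts[:k, :]

frame = np.random.randint(0, 256, (256, 256))            # stand-in for one frame's Y channel
share = (np.random.rand(128, 128) > 0.5).astype(float)   # one visual-cryptography share
marked, side = embed(frame, share)
recovered_share = extract(marked, side)                  # approximately reconstructs the share
```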
  • asked a question related to Computer Security and Reliability
Question
5 answers
I want to know how to prevent a peer from changing its identity from time to time?
Relevant answer
Answer
   
By combining cryptographic techniques with an analysis of network characteristics like peak, burst and throughput, we can develop a novel policy to prevent whitewashing in P2P networks.
  • asked a question related to Computer Security and Reliability
Question
2 answers
Does it involve any scripts?
Relevant answer
Thank you Amit Kumar for your suggestions.
  • asked a question related to Computer Security and Reliability
Question
10 answers
Can anyone tell me which SAN parameters are important for characterising performance?
Relevant answer
Answer
I think the best storage area network design for an institution will take into consideration a number of critical issues: uptime needs, scalability, security and disaster recovery.
For more information please refer to this link:
  • asked a question related to Computer Security and Reliability
Question
9 answers
I am looking for a tool to simulate the authentication in MCC environment.
I just need to know how to test the security level of communication channels during the session in the mobile cloud environment.
Relevant answer
Answer
Sure, you can take a look at the SONORA, MobiCloud and CloudSim projects. Also, if you have some of those resources, please let me know.
  • asked a question related to Computer Security and Reliability
Question
14 answers
For example, there is a linked chain of hashes of a document and some consecutive hashes are deleted. How can the chain be recovered or reconstructed?
Relevant answer
Answer
@ Janos Follat
Thanks for pointing at Thomas Page's PhD thesis, "The application of hash chains and hash structures to cryptography" (2009). Interesting reading. Obviously there are indeed useful applications of combining hashes in various ways.
My comment about hash chains was directed at the fact that hash chaining reduces the input length of h2...hx from 'unlimited' to the length of h1's output.
If you chain, for example, FNV32 hash functions (please note that the original question did not mention cryptographically secure algorithms), the input length from h2..hx is only 32 bits. Thus, when chaining hash functions with an output length smaller than the first original input, the distribution width within a chain will decrease relative to the original variability.
Furthermore, the collision probability of the total chain h1->h2->...->hx might increase; Thomas Page refers to this on pp. 43-44 (3.1.4.3 ff.). If one simply chains hash functions like h1->h2->...->hx, security might therefore decrease relative to their one-time usage.
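On the original question itself, a small sketch may help: if the chain is defined as h_i = H(h_{i-1} || block_i) and the document blocks are still available, any deleted links can simply be recomputed from the last surviving hash; without the blocks, the missing values cannot be reconstructed, only re-issued.

```python
# Rebuild missing links of a document hash chain h_i = SHA-256(h_{i-1} || block_i).
import hashlib

def build_chain(blocks, seed=b"\x00" * 32):
    chain, h = [], seed
    for block in blocks:
        h = hashlib.sha256(h + block).digest()
        chain.append(h)
    return chain

blocks = [f"document part {i}".encode() for i in range(8)]
chain = build_chain(blocks)

# Suppose links 3..5 were deleted; recompute them from link 2 and the corresponding blocks.
damaged = chain[:3] + [None, None, None] + chain[6:]
h = damaged[2]
for i in range(3, 6):
    h = hashlib.sha256(h + blocks[i]).digest()
    damaged[i] = h

assert damaged == chain          # reconstructed chain matches, and link 6 still verifies
```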
  • asked a question related to Computer Security and Reliability
Question
10 answers
I am interested in knowing how cloud providers come to trust their new cloud users and what factors they consider when trusting a new user.
Relevant answer
Answer
You can use user behaviour profiling and decoy techniques.
  • asked a question related to Computer Security and Reliability
Question
5 answers
I am interested in finding techniques to protect the privacy of sensitive data that is to be published, for example on social media.
Relevant answer
Answer
Privacy is an often misunderstood paradigm. If data is collected, it will be used, and abused. Therefore, while collecting data, it is good practice to make sure it is collected in a way that it cannot be correlated without considerable effort, and only by those who can legitimately understand what it is. Also, any extra directly correlating parameters should strictly be left uncollected; otherwise the data carries an inherent privacy risk.
  • asked a question related to Computer Security and Reliability
Question
3 answers
Since confused deputy problem is a type of privilege escalation, and least privilege principle only gives program the necessary privilege,  could the problem be solved?
If yes, how is least privilege used to solve it; if no, why least privilege cannot solve the problem?
Thanks for your answer in advance.
Relevant answer
Answer
Try using a capability list (C-list).
Since each subject has his or her own C-list, the subject should also send the list to the "deputy" when asking it to do something. This way the deputy will know whether or not the subject has the right to do so.
  • asked a question related to Computer Security and Reliability
Question
6 answers
(TEMPEST )
countermeasures against electromagnetic compatibility (EMC) problems
Relevant answer
Answer
  • asked a question related to Computer Security and Reliability
Question
4 answers
This is to understand possible mathematical approaches to an autonomic computing algorithm. In my own experience so far, a binomial heap works really well in this approach compared to others; Bayesian networks, linear regression and N-grams require extra parameters to separate false from true alarms. Any suggestion on how the algorithm can be aligned with K-MAPE would be greatly appreciated. Thanks in advance.
Relevant answer
Answer
Thanks guys for the great input. 
  • asked a question related to Computer Security and Reliability
Question
3 answers
I want to know the issues around distributed system security and how they can be remodelled to propose possible ways of overcoming them.
Relevant answer
Answer
I still see resilience as one of the most important security objectives. What good is a fully secure system if it does not work? In distributed computing, network outages easily cause split-brain situations, so recovering from them is obviously a crucial area of research, especially for real-time applications.
  • asked a question related to Computer Security and Reliability
Question
7 answers
I want to know some recent privacy issues arising from data publishing or data mining, with real examples that happened after 2013. I have some examples, such as:
Google bypassing the Safari browser's privacy settings;
Max Schrems against Facebook.
Please share if you know more examples.
Relevant answer
Answer
Could you shed more light on what you mean by "real-time issues of privacy"? I am assuming you are talking about privacy in temporal information retrieval. If that is the case, there is a good beginning resource that lists a number of research articles on the subject: http://en.wikipedia.org/wiki/Temporal_information_retrieval
The challenge would be incorporating various privacy algorithms in temporal information retrieval systems.
  • asked a question related to Computer Security and Reliability
Question
4 answers
I know WEKA and TETRAD are software packages used for association rule mining and Bayesian networks, but I don't know which one is better, or whether there is better software for this work.
Relevant answer
Answer
For this type of research I typically turn to R. For Bayesian networks, you can use either the 'gRain' library or the 'bnlearn' library. R also has a fairly impressive collection of other libraries, including implementations of neural networks, genetic algorithms and more.
  • asked a question related to Computer Security and Reliability
Question
5 answers
What is the Mechanism of Digital signature in mobile banking? Is it be possible on a sim card of a customer with exchange of keys?
Relevant answer
Answer
Reliance on digital signatures for other than short lived transactions is problematic. There are legal issues as well as technology escrow issues.  For example, if a signature was valid 3 years ago, but the keys used are no longer valid or the technology used to generate the signature is deprecated, how can one convince a court of the veracity of the signature?  
On the technical side, most SIM cards contain only a symmetric key which is used as a shared secret with carriers to generate and respond to challenges as a mechanism to prevent cloning, so the existing keys and algorithms available on existing SIM cards are likely lacking for this purpose if you are really contemplating a cryptographic digital signature. More promising is the study of mobile payment systems currently used in Africa.  Clearly, it seems to be working, though flaws may exist.
Finally, if one looks at the history of certificate issuing authorities, one comes away somewhat depressed since it seems anyone can become an issuing authority and if your trusted certificate list contains any dubious issuing authorities, then that is a problem.
 The only way I have seen this used effectively is when one uses one's own certificate authority to sign the keys of trusted devices and then uses this as part of the connection requirement for the device via a mutually authenticated TLS or SSL connection.  This effectively limits all traffic from other than a device having an appropriately signed X.509 certificate and is an effective deterrent in this respect.  However, one would need to be able to generate and store the key material securely and would need to set up a strategy for generating and installing the certificates.  Symantec and others have cloud services that could facilitate this to scale, and would be worth a look.
  • asked a question related to Computer Security and Reliability
Question
3 answers
Is it symmetric or asymmetric?
Relevant answer
Answer
I confirm the answer: asymmetric encryption.
Symmetric encryption does not scale well for large and changing network topologies: n * (n - 1) / 2 keys would be required.
  • asked a question related to Computer Security and Reliability
Question
12 answers
I have a terabyte of pcap files and would like to find intrusions in them. How can I eliminate or reduce unwanted fields or packets in those pcap files?
Relevant answer
Answer
PacketPig based on Apache Hadoop and Snort is probably your best starting point for now: http://hortonworks.com/blog/big-data-security-part-one-introducing-packetpig/
When you have this up and running, then you can experiment using anomaly based detection methods on top of it.
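For the reduction step itself (before feeding anything into Snort or PacketPig), a hedged Scapy sketch that streams through large captures and keeps only the packets of interest, so the terabyte never has to be loaded into memory at once; the filter predicate here (TCP to ports 80/443) is just an example to adapt.

```python
# Stream a large pcap and write out only the packets of interest (pip install scapy).
from scapy.all import PcapReader, PcapWriter, IP, TCP

INTERESTING_PORTS = {80, 443}

def reduce_pcap(src_path: str, dst_path: str) -> int:
    kept = 0
    writer = PcapWriter(dst_path, sync=True)
    with PcapReader(src_path) as reader:       # iterates packet by packet, low memory footprint
        for pkt in reader:
            if pkt.haslayer(IP) and pkt.haslayer(TCP) and pkt[TCP].dport in INTERESTING_PORTS:
                writer.write(pkt)
                kept += 1
    writer.close()
    return kept

print(reduce_pcap("capture.pcap", "capture_reduced.pcap"), "packets kept")
```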
  • asked a question related to Computer Security and Reliability
Question
7 answers
I have two pcap files, one with background (non-malicious) traffic and a second with attack traffic. To make the traffic more realistic, I want to mix both packet captures and send them from a tcpreplay machine to (another) IDS machine for IDS evaluation. I think there are two options to achieve this:
1. merge the two files into one file, but here I think a timestamp issue arises;
2. open two tcpreplay command prompts on the tcpreplay machine, one for the background file and one for the attack file, both sending from the same interface.
Which one is more accepted in the research community?
Relevant answer
Answer
It should be possible to do so; there shouldn't be any problem opening multiple concurrent tcpreplay instances. Alternatively, you could take multiple pcap files, merge them using Wireshark, and then use a single tcpreplay instance to replay the merged file. Please let me know if this helps.
  • asked a question related to Computer Security and Reliability
Question
9 answers
What tools are currently available for securing big data? Kindly suggest any you know of. I am researching tools that can be used to secure both unstructured and structured big data, and I am looking for suggestions and people's ideas on the different tools. I am currently aware of the Vormetric tool for protecting data at rest, but I need to understand what technology is used to achieve that and how it can be improved further. Also, I am not concerned only with data at rest, but also with data in its other states.
Relevant answer
Answer
Hi Ighoroje,
There are numerous big data security tools in the market - one of them I know better is HP ArcSight software which can help collect, enrich, search, store & analyze the big data which include:
1) detecting account abuse by insiders & advanced persistent threats (threats which follow a low & slow attack profile that is difficult to distinguish from normal legitimate activity)
2) pinpointing data exfiltration by advanced persistent threats
3) alerting big data user of new program execution
Some of the ArcSight features include: speed, real-time threat evaluation, pattern matching, statistical correlation, monitor and respond etc. Of course, there are other big data security tools as well from IBM, Symantec, BlueCoat etc.  You can do an internet search to find out more.  Following are some links you might want to try.  All the best:
Regards,
Fung
  • asked a question related to Computer Security and Reliability
Question
14 answers
Hi, can someone please share a link about the types of hardware and software vulnerabilities that lead to an attack?
Relevant answer
Answer
There is no straightforward answer. Check the OWASP Top 10 and the SANS top lists, and check the CWE and CAPEC details on MITRE.
  • asked a question related to Computer Security and Reliability
Question
4 answers
In relation to my research work, I would like to know about recent work on the TCP SYN flood attack.
Relevant answer
Answer
The best way is to use Google Scholar or the Scopus search engine, filtered by year.
  • asked a question related to Computer Security and Reliability
Question
4 answers
.
Relevant answer
Answer
To be honest, I can hardly imagine any connection between your research goal of 'testing the trust of people' and your request to 'download a container, include a secret and pick a strong password'.
What do you want to evaluate?
Whether users put in a 'real secret' or just some random data?
How do you want to determine whether the data is a secret or not?
  • asked a question related to Computer Security and Reliability
Question
7 answers
Distributed DDOS defense.
Relevant answer
Answer
One of the many ways companies go about this is with SIEM systems that log and track everything that happens on the network. Products like NetWitness and LogLogic log and correlate data coming in from various points in the network to report DoS or DDoS attacks. This is of course with respect to regular networks. If you have an SDN deployment, then approaches such as cSamp (which does collaborative network monitoring), OpenSketch, etc. will help you identify and combat a DDoS attack.
Any further explanation would of course need to be substantiated with the use case you are looking at, what your architecture is, etc.
  • asked a question related to Computer Security and Reliability
Question
10 answers
I want to know the time complexity and computational complexity of the various cryptographic algorithms used for encryption and decryption. Can anybody provide a good link or suggest a good textbook or IEEE paper on this?
Relevant answer
Answer
Handbook of Applied Cryptography
Alfred J. Menezes, Paul C. van Oorschot and Scott A. Vanstone
  • asked a question related to Computer Security and Reliability
Question
10 answers
Is there a list of attributes or checkpoints to evaluate?
Relevant answer
Answer
Please consider the concepts of iptables in Linux to get a better understanding. Hope it helps.
  • asked a question related to Computer Security and Reliability
Question
1 answer
IBE was first introduced by Adi Shamir in 1984 to provide an easier method of public-key encryption. Ever since, it has been an active area of research. What are its vulnerabilities? What are the design problems in establishing such a system? Why has it remained an open problem for the last 30 years?
Relevant answer
Answer
IBE is affected by the key escrow problem, and it is based on relatively expensive cryptographic primitives, making it less performant than other encryption schemes.
  • asked a question related to Computer Security and Reliability
Question
4 answers
I am trying to find a trusted, Linux-compatible tool for evaluating the performance of cryptographic algorithms. I tried CIAT but found some compatibility issues. Coding a tool to the NIST specification is too hard and time-consuming, so can anyone suggest any tools? Thanks in advance.
Relevant answer
Answer
The NIST Statistical Test Suite (STS) is one of the gold standards for PRNG testing, though you do have to compile the binary. You don't, however, have to build your generator into their program: you can feed STS a set of files and have it analyze them and give the same report as with its built-in generators. I don't remember having any problems getting STS to build.
I believe I also have a python script to parse the STS output and summarize it or make a csv or something, but it was fairly specific to the experiments I was running.
  • asked a question related to Computer Security and Reliability
Question
4 answers
From the Gilbert/Lynch proof of CAP, I did not find any difference between local-area and wide-area networks when applying CAP. Some say that transient network problems would be common in a wide-area setting but can be eliminated by proper failure detectors within a single data center. However, as far as I know, failure detectors always need time to discover failures and may have a probability of false suspicion, so transient failures in local-area networks can only be reduced by failure detectors, never fully eliminated. Does anyone know about this?
Relevant answer
Answer
I think I see what you mean. In a single data center we can build enough redundant paths between two nodes that it can be assumed there is always a live path, but in wide-area network settings there are no such redundant paths. Thanks.
  • asked a question related to Computer Security and Reliability
Question
4 answers
Memory randomization is a way to protect memory from security attacks, and also a way to avoid false sharing on cache lines. But the common way to change the layout is by adding padding, at the expense of losing the benefits of cache locality. Are there other ways to do memory randomization while still being able to exploit cache locality?
Relevant answer
Answer
Allocate all data structures in units of a multiple of the cache block size and you will get no false sharing, provided you separate data that is not genuinely shared.
Rather than randomize memory, I would consider encrypting it (RW's post goes that way, even if the main drift is off track). Decrypting, while computationally costly, can to some extent be hidden behind the time taken for page faults if done right.
  • asked a question related to Computer Security and Reliability
Question
5 answers
Is any research going on in this topic?
Relevant answer
Answer
Actually, the Digital Library of India already provides this facility for ancient books on philosophy, literature, etc. But what should we do to develop our own cloud computing platform for this?
  • asked a question related to Computer Security and Reliability
Question
6 answers
I have heard that it has been proven (theoretically or practically)?
Relevant answer
Answer
Yes, it is possible to remove the trusted third party completely. For that purpose we use a public-key cryptography system: the session key generated by the sender is shared securely with the receiver using a public/private key pair.
  • asked a question related to Computer Security and Reliability
Question
1 answer
In a non-linear combination stream cipher.
Relevant answer
Answer
There is a simple criterion based on the Walsh-Hadamard transform. Look up the Xiao-Massey criterion on Google.
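A short numpy sketch of that test: compute the Walsh-Hadamard spectrum W_f(w) = sum_x (-1)^(f(x) XOR w·x) of the combining function; the function is m-th order correlation immune exactly when the spectrum vanishes for every nonzero w of Hamming weight at most m. The example function is arbitrary, chosen only to exercise the check.

```python
# Check correlation immunity of a Boolean combining function via its Walsh-Hadamard spectrum.
import numpy as np
from itertools import product

def walsh_spectrum(f_table, n):
    xs = np.array(list(product([0, 1], repeat=n)))             # all inputs, one row per x
    spectrum = {}
    for w in product([0, 1], repeat=n):
        dot = xs @ np.array(w) & 1                              # w.x mod 2 for every x
        spectrum[w] = int(np.sum((-1) ** (f_table ^ dot)))
    return spectrum

def correlation_immunity_order(f_table, n):
    spec = walsh_spectrum(f_table, n)
    order = 0
    for m in range(1, n + 1):
        if all(spec[w] == 0 for w in spec if 0 < sum(w) <= m):
            order = m
        else:
            break
    return order

# Example: f(x1,x2,x3) = x1 XOR x2 XOR x1*x3 (truth table in lexicographic input order).
n = 3
f = np.array([x1 ^ x2 ^ (x1 & x3) for x1, x2, x3 in product([0, 1], repeat=n)])
print(correlation_immunity_order(f, n))
```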
  • asked a question related to Computer Security and Reliability
Question
4 answers
Surveys of IP-based WSNs recommend that IPsec replace IKE with another lightweight KMS, and that the IEEE 802.15.4 MAC also needs a KMS for its AES algorithm.
Can we use the same KMS for both IPsec and the IEEE 802.15.4 MAC? How?
Thank you.
Relevant answer