Computer Science Education - Science topic
Explore the latest questions and answers in Computer Science Education, and find Computer Science Education experts.
Questions related to Computer Science Education
The concept of a formal system, and its properties, appears frequently in many practical and theoretical components of computer science: methods, tools, theories, etc.
However, non-rigorous interpretations of "formal" are also common. For example, in several definitions of ontology, "formal" is understood as something that "a computer can understand".
Does a computer science specialist at the BSc level need to know this concept? Are it and its properties useful to them?
We are probably all aware by now that artificial intelligence will disrupt virtually every single industry globally. It will eliminate millions of old jobs and create millions of new ones. Memorization, multiple-choice testing, and rote work will no longer be relevant to these new jobs. Even jobs in the service sector that traditionally need the "human touch", like nursing and teaching, will be affected in one way or another.
Unfortunately, the developing world will probably make attempts to adhere to "tradition" and resist change. How can we democratize technology so that students gain the data and computer literacy needed to succeed in the job market of the future despite this resistance?
Isn't it time to bury the master class in computer engineering training, without mourning, and instead adopt new pedagogical models based on new technologies, especially artificial intelligence?
What kinds of scientific research dominate the field of the computerization of economic and financial analysis of enterprises?
Please provide your suggestions for a question, problem, or research thesis on this topic: the computerization of economic and financial analysis of enterprises.
I invite you to join the discussion.
I am wondering if you know of texts, guides, or any didactic advice that can help us teach discrete wavelet theory to undergraduate computer scientists.
Hi, as part of my bachelor thesis on the design of programming languages for teaching mathematics in the 21st century, I have planned to discuss the evolution of the major programming languages built around the idea that computer programming could play an integral role in STEM education.
In order to analyze different programming languages as a framework for teaching (primarily) mathematical concepts, I am currently searching for (citable) research projects providing insights into the historical development of educational programming languages. – Are you familiar with any research on the evolution of educational programming languages?
Many thanks in advance for your contributions,
Computer programming languages have evolved over time. Teaching students to program involves building essential thinking skills: the ability to analyse a problem, to develop an algorithm, and to map that algorithm onto the constructs of the chosen programming language in the right order and form. An engineer in the field might welcome the capabilities of newer languages such as Python, whose many ready-made libraries relieve them from thinking about the details of many parts of the overall problem; to them, Python looks like a turn-key solution. A university student, on the other hand, especially in the early stages of learning to program, has not yet developed those thinking skills, and using a bare language without such large libraries, such as C, C++, or Java, can be more educational. A further option is to teach basic programming using a stripped-down Python, but in my opinion this is like teaching someone to drive using an autonomous car while asking them not to use the auto-parking and auto-driving functions.
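The contrast above can be sketched with a small illustrative example: the same task solved by a single "turn-key" library call versus the explicit algorithm a student would have to reason through in a barer language (the choice of sorting as the task is mine, purely for illustration):

```python
data = [5, 2, 9, 1]

# "Turn-key" style: one library call, no algorithmic thinking required.
sorted_builtin = sorted(data)

# Educational style: the student must work out the algorithm itself
# (here, a simple insertion sort, as one would write it in C or Java).
def insertion_sort(items):
    result = list(items)
    for i in range(1, len(result)):
        key = result[i]
        j = i - 1
        while j >= 0 and result[j] > key:   # shift larger elements right
            result[j + 1] = result[j]
            j -= 1
        result[j + 1] = key                 # drop the key into its slot
    return result

assert insertion_sort(data) == sorted_builtin == [1, 2, 5, 9]
```

Both lines of code produce the same answer; only the second one exercises the thinking skills the paragraph describes.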
I raise this discussion because there is a tendency here and there to teach Python to computer science students early in their university studies, in contrast to a few years ago, when C/C++ was the focus in the early years. Do you agree or not? Are there other options? What is the current norm in your institution? Your contributions are welcome.
Robots are a limited resource for teaching due to cost, but they are useful: they engage students and make principles concrete. However, it is not possible to have one robot per student, for both cost and space reasons. How can I get the same benefits of robots for teaching AI through other methods?
Currently there is a trend toward applying virtual, ICT-based methods to teaching in higher education. Professors frequently face the situation of having taught in the face-to-face modality and wanting, or needing, to do the same in distance education, in the virtual modality. Does anybody have practical experience with, or know of, a specific methodology for Computer Science courses?
I wish to direct my research toward the Computer Science Education area; now I would like more support in focusing on a specific target.
Among the female student enrollments in engineering and science at my school over the last 5 years, the highest numbers in most of those years happened to be in computer science. I am interested in investigating the reasons for such interest in CS. I would appreciate contributions from researchers with any information on this.
In my experience, logic is not taught at the appropriate age, if at all; at best it is taught in an ad hoc manner. An elementary classical education (the Trivium) comprises grammar, logic, and rhetoric. Grammar and rhetoric (writing and, to a lesser degree, verbal presentation) are taught almost as soon as a child enters school and continue throughout schooling. In my experience, logic as a subject discipline is not widely taught from an early age. I feel this hinders mental development, because students are taught reading, writing, and even argumentation without the benefit of logic, which helps one sort through the noise like a mental antivirus. I'd like to get others' thoughts on this. -- Dr. Sikorski
This question might seem personal and of course, it is, but many of my friends are facing this problem.
I am a sophomore student and I want to do research in the field of electrical and electronic engineering. I am not quite sure about the specific field, but I am finding interest in power electronics, embedded circuits, and integrated circuits. But I want to know why I should pursue higher studies and research in these fields. Can somebody please tell me about the future prospects of these branches? It would be very helpful for us.
Thank you in advance.
If possible, please include the conceptual framework and questionnaires.
Is there any validated instrument? I'm especially interested in introductory programming, but so far I have only found instruments for advanced programming.
I developed a "learning object" (basically an interactive applet with additional pedagogic components) for the Little Man Computer. In the ten years since, it has probably been used by more than a million people (largely due to its specification as a recommended resource for an A-level course in the UK). As such, it's probably the single most impactful contribution I've made and yet it counts zero for my academic career.
We are trying to reduce the amount of assessment to make everyone's life easier. In my module "G54SIM: Simulation for Decision Support" which I teach in the School of Computer Science we currently have a shared assessment model: 40% coursework and 60% exam. You can find examples of these assessments on the module website.
I am now in the process of planning the assessment for the next semester. But I am not sure how I should design the assessment. I need to be sure that I am assessing each student's own abilities, not those of other students ;-). I need to make sure that I reduce the overall amount of assessment and don't increase the amount of coursework assessment too much. I also motivate the students during labs (we have ten two-hour labs) to work in small groups, as conceptual problems can be solved much quicker this way and students learn to interact with clients and express their thoughts.
I am now thinking of having a group coursework (25%) early in the semester and an individual coursework (75%) late in the semester. Part of the individual coursework assessment would be a 10 minute oral exam to ensure students did the work themselves. I was told that telling the students in advance that they will be orally examined would prevent plagiarism.
The big question is: Am I reducing the assessment (amount of work for students and lecturers) in this way or am I just changing the format of assessment? What do you think?
Can you please share your experience of coursework only assessment in computer science or related fields?
Can you point out literature that discusses this topic?
I want to organize an "Hour of Code" event next week at my school, lasting one hour. I want to determine the effectiveness of this event. Do you have any ideas on how to measure the effectiveness of such a short event? Thanks for your response.
Possibilities include IBM Bluemix, Google Cloud Platform, Azure, Amazon etc. Have you any experience using these? In particular for A. spinning up VMs that students have been working on and B. Easily creating web service components for mobile apps (Android). Have you participated in Academic Initiatives for Cloud services?
(My cohort is studying online/by distance towards a BSc. in IT.)
Mathematics as conceived and taught, especially in the disciplines considered basic, relies on concepts, almost dogmas, that lead to solving problems deterministically. That is, the solutions are always exact and unrestricted, even when some form of prediction is involved via a mathematical model. Thus, in the teaching of mathematics, the goal is always to find the exact solution of the problem, which is reached by the methods of classical logic. This process of mathematics teaching and learning has been used in every program we have known throughout our lives. As a tool for analyzing ill-defined or imprecise situations, mathematics has been restricted to stochastic methods, which invariably require sample data, while the statistics taught in schools has aimed mostly at descriptive processes.
In real life, however, there are many situations in which we either have no sample database, or the data are quite imprecise or incomplete, and there is a need to make decisions based on subjective or poorly defined concepts, such as large, low, strong, beautiful, very, etc. In such cases, deterministic or stochastic mathematical processes do not yield a solution. However, ill-defined problems are often "resolved" intuitively.
Fuzzy logic works with subjectivity and quantifies uncertainty in order to solve real problems that have no exact solution. The mathematics of fuzzy logic is relatively simple and adapts well to real situations.
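A minimal sketch of the idea: where classical logic forces a yes/no answer to "is this person tall?", fuzzy logic assigns a degree of membership between 0 and 1. The piecewise-linear membership function and its thresholds below are illustrative assumptions, not standard values:

```python
def membership_tall(height_cm):
    """Degree (0..1) to which a height counts as 'tall'.
    Fully 'not tall' below 160 cm, fully 'tall' above 185 cm,
    increasing linearly in between (thresholds are illustrative)."""
    if height_cm <= 160:
        return 0.0
    if height_cm >= 185:
        return 1.0
    return (height_cm - 160) / (185 - 160)

# Classical logic gives only True/False; fuzzy logic gives a degree:
print(membership_tall(150))    # 0.0  (definitely not tall)
print(membership_tall(172.5))  # 0.5  (somewhat tall)
print(membership_tall(190))    # 1.0  (definitely tall)
```

This is exactly the kind of "quantified subjectivity" the paragraph refers to: the vague predicate "tall" becomes a simple, computable function.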
We are trying to decide descriptors to evaluate the learning of software programming in Primary School.
We are looking to detect the learner's motivational level in computer-based educational environments.
So do we first need a motivational model? Searching through some existing work, there is no explicit, complete model to implement, only pieces.
If I work with a quantitative method, should the output be a model? And with a mixed-methods approach, is a framework the suitable output? Is that right?
What motivates you to continue with scientific research? Is it money, reputation, competition, your institution's rules, your wish to search for facts, a desire to serve humanity, the fact that it's your job, or other reasons? And for universities in third-world countries, how can we motivate scientific research, in your opinion?
Micheal Staton said that "Stack Overflow is the new Computer Science department, where people go to learn together"
If we replace "Computer Science" with the more practical "Computer Programming", what is your opinion on this quote?
On one side, the Stack Overflow dataset has great potential for education: it is the largest dataset of solved exercises, with relatively good content and a strong problem-based approach.
However, Stack Overflow as it is has clear limitations from an educational point of view: lack of structure, overly fragmented content, potential reliability problems, and little support for a learner (indeed, there is far more support for a professional). Stack Overflow has often been criticized for encouraging only "cut & paste" habits and quick answers.
My question is: can we turn Stack Overflow (and similar communities of practice) into a tool for teaching Computer Programming?
Maybe yes; maybe yes, but only as supplemental material for quizzes or exercises; or maybe no.
I am developing a research project to investigate the potential of such online communities in a teaching & learning context.
I'll be grateful if you can help me with your opinion.
If you want to go the extra mile, I have a 5-minute survey on the topic:
If you are a lecturer use this link: http://eSurv.org/?u=COP_lecturers
If you are a student (any level): http://eSurv.org/?u=online_COP_students
English is an imprecise language to begin with (e.g., compared to Baltic languages with their noun declensions, or to French, which has more verb conjugation forms), and grammar is being taught less and less rigidly in many high schools now. Can this (growing) imprecision in spoken languages affect students' ability to handle the extreme precision required to program? Are programming languages such as Python better suited to introductory programming than languages such as Java because they require less precision?
Programming languages for beginners
We have already disclosed the basic idea of the historically first RTL gate
It was surprising but true that the RTL NOR gate was an analog device implemented by humble resistors while “true digital” logic gates were (are) implemented by electronic switching elements – diodes (DL, DTL and TTL gates), bipolar transistors (ECL gates) and MOS transistors (NMOS, PMOS and CMOS logic gates):
Although RTL stays away from all these logic families, it is still interesting to see if there is some connection between them... some common general idea... Let’s try to find it...
Remember that in RTL we summed voltages by converting them to currents. But this circuit was sensitive to the magnitudes of the voltages and resistances, and the number of inputs was limited. It seems that, besides voltages and currents, we can also sum resistances. Here is the implementation.
The input logic variables turn on (at logic "1") or turn off (at logic "0") equal reference resistances (conductances). These are again summed by an analog summer; their sum is converted to a voltage and compared by a threshold device (a voltage comparator) whose threshold is lower than one reference. Thus it is sufficient for only one reference to be turned on for the output to be set to logic state "1".
This idea is taken to the extreme in the classic DL, DTL, TTL, MOS and CMOS circuits where the reference resistances are increased up to infinity. In practice, they are implemented by diode or transistor switches operated by the logic input variables. They are connected in series to sum the switch resistances or in parallel to sum their conductances (DL, DTL and TTL use only a parallel connection).
Because the sum of the included resistances/conductances is infinite (even if only one element is connected), the comparator can have any threshold within the supply voltage range. In this situation it is sufficient for only one switch in the series/parallel chain to be open/closed for the total resistance/conductance to become infinitely large, making the threshold element switch. This element can even be absent if the thresholds of the next electrically operated switches are used (MOS and CMOS logic elements exploit this idea).
Depending on the way of connection (series/parallel), on the correlation between the values of the input logic variables and the states of the switches, and on the presence of an inverter at the output, OR (NOR) or AND (NAND) elements can be obtained. For example, in a DTL circuit the diode switches are connected in parallel and the correlation is "logic '0' - open switch" and "logic '1' - closed switch"; as a result, a NAND logic gate is obtained.
As a conclusion, my extravagant idea about the essence of electronic logic gates is:
Digital logic gates (all kinds) are implemented in electronics by cascading two devices – a “summer” (a pure analog device) and a “comparator” (a mixed analog-to-digital device); the discrete (binary) Boolean logic functions OR/AND are implemented by the arithmetical summation.
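The "summer + comparator" view above can be sketched in a few lines of illustrative Python, with input values 0/1 standing in for the analog quantities being summed and thresholds chosen for two-input gates (the function names and thresholds are mine, for illustration only):

```python
def gate(inputs, threshold, invert=False):
    total = sum(inputs)                    # the analog "summer"
    out = 1 if total >= threshold else 0   # the threshold "comparator"
    return 1 - out if invert else out      # optional output inverter

def OR(a, b):   return gate([a, b], threshold=1)
def AND(a, b):  return gate([a, b], threshold=2)
def NOR(a, b):  return gate([a, b], threshold=1, invert=True)  # the RTL gate
def NAND(a, b): return gate([a, b], threshold=2, invert=True)

# Verify against Boolean truth tables:
for a in (0, 1):
    for b in (0, 1):
        assert OR(a, b) == (a | b)
        assert AND(a, b) == (a & b)
```

Note how OR and AND differ only in the comparator threshold, and NOR/NAND only in the presence of the inverter, which mirrors the claim that the binary functions arise from arithmetic summation plus thresholding.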
If we teach CS to help students understand the world better, the connections between phenomena of everyday life and the Computer Science concepts that cause these phenomena are good starting points for designing lessons. A phenomenon is something that can be experienced with our senses in real life, or imaginatively.
I'm interested in the ones you use and your examples that are connected to them for further research on them.
I am currently developing a 14-week course for computer science undergraduates that focuses on the application of control systems and the design and implementation of simple controllers using microcontrollers. The ability to simulate such controllers in Matlab is thought to be essential as well. Can you recommend the core topics needed to develop such a course? The assumptions are that students have basic knowledge of 1) computer science and engineering mathematics, 2) signals and systems, and 3) microcontroller system design.
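As a sketch of the kind of "simple controller" such a course might build up to, here is a minimal discrete PID loop driving a toy first-order plant. The gains, time step, and plant are illustrative assumptions on my part, not items from any particular curriculum:

```python
def pid_step(error, state, kp=2.0, ki=0.5, kd=0.1, dt=0.01):
    """One discrete PID update; `state` carries (integral, previous_error)."""
    integral, prev_error = state
    integral += error * dt                   # accumulate the I term
    derivative = (error - prev_error) / dt   # finite-difference D term
    u = kp * error + ki * integral + kd * derivative
    return u, (integral, error)

# Drive a trivial first-order plant x' = u toward the setpoint 1.0.
x, state = 0.0, (0.0, 0.0)
for _ in range(3000):                # 30 s of simulated time at dt = 0.01
    u, state = pid_step(1.0 - x, state)
    x += u * 0.01                    # forward-Euler step of the plant
```

On a microcontroller the same `pid_step` logic would run inside a timer interrupt, with the plant replaced by real sensor readings and actuator outputs.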
Massive open online courses (MOOCs) are a big topic these days on university campuses. There appears to be a divide in support of MOOCs and against MOOCs. What are your thoughts about integrating MOOCs into your institution?
Scientific subjects like computer science are rich in math, which is a common language for learning and teaching. Even so, learning computer science in English can be harder for those whose second language is something else. Translation helps, but only to a degree: the direct translation of a term doesn't give the student the same clues that English native speakers get. After all, a term can carry much cultural background that helps in understanding it, especially for terms with the same or similar cultural and computer-science uses. For example, the first time I heard of "ad hoc networking", I had no clue what kind of networking it might be, because I didn't know the general use of "ad hoc" in English, while the translation ("ad hoc" = "for this purpose") was almost useless. I presume an English-speaking student gets all the clues, being used to phrases like "ad hoc relations", i.e. unplanned relations. Knowing this, they would immediately conclude that ad hoc networks are unplanned networks, which is the key idea of ad hoc networks.
We are amidst a research work on the language and cultural barriers to learning computer science. We appreciate your help.
We would like you to suggest computer science terms like the one above which you believe carry cultural background, such that direct translation is unhelpful.
I temporarily leave the "kingdom" of my favorite analog electronics and move to the neighboring field of digital electronics, because this term I will conduct a series of labs on digital circuits. I am doing this because I want to be helpful to my students (both useful and fun), secretly hoping they will join my web initiative in return. Of course, I will not waste their time, or yours, with banal, trite, and boring textbook explanations; instead I will "provoke" readers with unusual, extraordinary, and sometimes weird viewpoints, with the sole purpose of making them not only know but think about and understand digital circuits...
Quite a long time ago I discovered that there is a close connection between the input stages of transistor-transistor logic (TTL), diode-transistor logic (DTL), and diode logic (DL). TTL includes DTL... and DTL includes DL... or rather, DL evolved into DTL... and then DTL evolved into TTL... So it seems that to understand what TTL is, we first have to understand what DL is... to unveil its mystery. I did this yesterday together with my students, by reinventing the circuit and asking all these questions:
"Why were the diodes back to front? Why was the resistor connected to +V instead of to ground? Why was there no input current when the input voltage was high? And why was there input current when the input voltage was low? But why did the current flow out of the diodes and into the input source? Why was it impossible to make an inverting diode gate? Why was the AND gate supplied by an additional voltage source while the OR gate had no such source?"
In logic gates, logic functions are performed by parallel (OR function) or series (AND function) connected switches that are electrically controlled by the input logical variables. Diode-based logic gates (DL, DTL and TTL) are implemented by diode switches (when forward biased, a diode is “closed”; when backward biased, it is “open”). The paradox of the diode logic is that diode AND logic gates should be implemented by series connected diode switches (like an NMOS AND gate implemented by series connected transistor switches)... but still it is implemented by parallel connected switches. Why? Here is my explanation.
In contrast to transistors, diodes are odd two-terminal switching elements, in which the input and output are not separated; they are the same. So, series connected diode switches cannot be driven by grounded input voltage sources. To solve this problem, diode AND gates may be constructed in the same manner as OR diode gates - by parallel connected diode switches. But to obtain AND instead of OR function, according to De Morgan's laws, the input (X) and output (Y) logical variables should be inverted:
Y = NOT (NOT (X1) OR NOT (X2)) = NOT (NOT (X1 AND X2)) = X1 AND X2
So, the diode AND logic gate is a modified diode OR logic gate: the diode AND gate is actually a diode OR gate with inverted inputs and output. Let’s see how it is implemented in the ubiquitous circuit (see the attachment).
To realize De Morgan's clever idea, the diodes are reverse-connected and forward-biased by an additional voltage source +V (power supply 2) through the "pull-up" resistor R1. The input voltage sources are connected in opposition to the supply voltage source (traveling along the loop +V - R1 - D - Vin). To invert the output voltage and obtain a grounded output, the complementary voltage drop (+V - VR1) between the output and ground is taken as the output, instead of the floating voltage drop VR1 across the resistor.
Input logical “1”: When all the input voltages are high, they "neutralize" the biasing supply voltage +V. The voltage drops across the diodes are zero and these diode switches are “open”. The output voltage is high (output logical 1) since no current flows through the resistor and there is no voltage drop across it. So, the behavior of the diode switches is reversed - whereas in diode OR logic gates diodes act as normally open switches, in diode AND logic gates diodes act as normally closed switches.
Input logical “0”: If the voltage of some input voltage source is low, the power supply passes current through the resistor, the diode, and the input source. The diode is forward biased (the diode switch is “closed”) and the output voltage drop across the diode is low (output logical 0). The rest of the diodes, connected to high input voltages (input logical “1”s), are backward biased, and their input sources are disconnected from the output.
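The De Morgan construction described above is easy to verify with a quick truth-table check: model the diode AND gate as an OR function with inverted inputs and an inverted output (the function names below are mine, for illustration):

```python
def OR(a, b):
    return a | b

def AND_via_or(a, b):
    # Y = NOT( NOT(X1) OR NOT(X2) ) = X1 AND X2
    return 1 - OR(1 - a, 1 - b)

# Exhaustively compare against the plain AND truth table:
for a in (0, 1):
    for b in (0, 1):
        assert AND_via_or(a, b) == (a & b)
print("De Morgan construction verified")
```

This is the logical skeleton of the circuit: the "inversions" are realized physically by reversing the diodes and taking the complementary voltage drop as the output.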
Two years ago, I exposed my speculations on the Wikipedia page about diode logic (under the name Circuit dreamer):
I'm interested in cataloging all of them. I have a particular interest for ontologies in computer science, but any other topic is also relevant.
This is my list so far:
Travelling salesman problem
Eight queens puzzle
Two Generals' Problem
The Muddy Children Puzzle
Yale shooting problem
Tower of Hanoi
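To make one of the listed classics concrete, here is the standard recursive solution to the Tower of Hanoi (the function name and peg labels are mine), which moves n disks in 2**n - 1 moves:

```python
def hanoi(n, source="A", target="C", spare="B", moves=None):
    """Return the list of moves (source_peg, target_peg) for n disks."""
    if moves is None:
        moves = []
    if n > 0:
        hanoi(n - 1, source, spare, target, moves)   # clear the way
        moves.append((source, target))               # move the largest disk
        hanoi(n - 1, spare, target, source, moves)   # restack on top of it
    return moves

print(len(hanoi(3)))  # 7 moves for 3 disks (2**3 - 1)
```

Each problem in the list has a similarly compact canonical formulation, which is part of what makes them useful as ontology case studies.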
thanks for all contributions :D
Outside the academic sector, a great majority of consultants and programmers cherish the notion of "productivity" through higher levels of abstraction. Should "productivity" be measured by the time saved for the programmer, or by the optimized code underneath? While the two are not necessarily exclusive, they usually are.
Should this be the state of affairs?
Comments and criticism welcomed.
What are the different cost estimation techniques used for approximating the cost or effort of developing web applications?
I would like to know, from personal experience, which books are considered most suitable for teaching theorem proving to Computer Science students. It is usually a very complicated topic for students, specifically those in their first or second year.
From year to year, the elementary programming knowledge of first-year students enrolling in computer science and/or information technology is decreasing, and abandonment (drop-out) of these studies is growing.
I'm interested in your experience with university drop-out from CS/IT studies, if any.
When I was kid, I played a lot of video games, and was always curious on how they worked. Eventually, I started making my own video games in QBasic. Later, I learned C++ so that I could make more advanced games. I loved it so much, and over the years, I became hungry to learn more about mathematics, programming, and physics within the context of the games I was making.
I believe that game development can be a fun way to get kids interested in science and technology. And today, there seem to be a lot of options available for kids. But what are the best ways to get young kids involved in it all? Are there any good frameworks, schools, resources, etc that are appropriate for kids interested in exploring and experimenting with game development?
Back in 1982, Japan's Ministry of International Trade and Industry began the Fifth Generation Computer Systems project. The idea was to find a new, non-Von Neumann computer architecture; as part of it, the Sequential Inference Machine Programming Operating System (SIMPOS) was released. SIMPOS was programmed in Kernel Language 0 (KL0), a concurrent Prolog variant with object-oriented extensions. There were similar projects in the US, whose results were various Lisp machine companies and, of course, Thinking Machines.
I’m interested what happened with Japanese “Prolog machine”? Does anyone know something about that?
What are the best techniques for getting the balance right when teaching a first-year class that is mainly computing students but includes a minority of arts-based students, and for keeping all of them engaged (as far as possible)?
I teach several computer science courses that involve programming in C, Assembly, even Verilog and other languages. The courses cover programming, operating systems, digital design, wireless sensor networks, and so on. Some I grade by hand, some using Makefiles, shell scripts, or other tools. I was wondering what tools you use for automated grading, which might include running programs in a sandbox and checking the output with regular expressions, or even automated advice on coding practices, for a class of 10 to 200 or more.
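A minimal sketch of the run-and-check-with-regex approach described above (the timeout value, the example submission, and the expected pattern are all illustrative assumptions):

```python
import re
import subprocess
import sys

def grade(command, stdin_text, expected_pattern, timeout=5):
    """Run `command` with `stdin_text` on stdin; return (passed, output).
    A submission passes if its stdout matches `expected_pattern`."""
    try:
        result = subprocess.run(
            command, input=stdin_text, capture_output=True,
            text=True, timeout=timeout,
        )
    except subprocess.TimeoutExpired:
        return False, "timeout"   # infinite loops fail instead of hanging
    passed = re.search(expected_pattern, result.stdout) is not None
    return passed, result.stdout

# Example: check that a "submission" prints the expected sum.
ok, out = grade([sys.executable, "-c", "print(2 + 3)"], "", r"^5\s*$")
print(ok)  # True
```

A real autograder would add resource limits and an actual sandbox (e.g. containers or a chroot), but the run/capture/match loop is the core of most homegrown setups.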
The Wikipedia says:
"To convert from a base-10 integer numeral to its base-2 (binary) equivalent, the number is divided by two, and the remainder is the least-significant bit. The (integer) result is again divided by two, its remainder is the next least significant bit. This process repeats until the quotient becomes zero."
But my students ask, "Why? What do we actually do when repeatedly dividing by two? Why is the first bit LSB?"
I have an explanation but it is interesting for me to see your opinion.
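For concreteness, here is the quoted procedure in code, with the key observation as a comment: since n = 2*(n // 2) + (n % 2), the remainder is exactly the coefficient of 2**0, i.e. the least-significant bit, and each division by two shifts every remaining bit one place down.

```python
def to_binary(n):
    """Convert a non-negative base-10 integer to its binary string."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # remainder = current least-significant bit
        n //= 2                  # shift right: the next bit becomes the LSB
    return "".join(reversed(bits))

print(to_binary(13))  # 1101
```

Tracing 13: remainders 1, 0, 1, 1 are produced LSB-first, so reversing them gives 1101.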
If the project is successful, it will be another milestone for the uplift of the technological sciences. I'm thinking of doing the same project in college.
As a teaching assistant, I'm involved in creating exercises for a master-level security course. My goal is to teach practical aspects of security. As we are currently discussing software security, I think it is interesting to go somewhat into software verification. In my studies, I've personally had some interesting encounters with several static code verifiers (ESC/Java for Java, PREfast for C). However, I'm wondering if there are more actively developed tools available by now. I've found Microsoft's VCC, and a few others, like Mozilla's Pork, but neither seems particularly focused on security. Does anyone have interesting projects to share?
 http://vcc.codeplex.com/ (reference updated as per Ernie Cohen's answer)
I want to know about the status of cost estimation techniques used in software cost estimation at present.
What is the scope of research in this area?