Recent publications
Critical chain buffer management (CCBM) has been extensively studied in recent years. This paper investigates a new formulation of CCBM, the multimode chance-constrained CCBM problem. A flow-based mixed-integer linear programming model is described and the chance constraints are tackled using a scenario approach. A reinforcement learning (RL)-based algorithm is proposed to solve the problem. A factorial experiment is conducted and the results of this study indicate that solving the chance-constrained problem produces shorter project durations than the traditional approach that inserts time buffers into a baseline schedule generated by solving the deterministic problem. This paper also demonstrates that our RL method produces competitive schedules compared to established benchmarks. The importance of solving the chance-constrained problem and obtaining a project buffer tailored to the desired probability of completing the project on schedule directly from the solution is highlighted. Because of its potential for generating shorter schedules with the same on-time probabilities as the traditional approach, this research can be a useful aid for decision makers.
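The idea of deriving a project buffer directly from a desired on-time probability can be illustrated with a minimal scenario-based sketch. This is not the paper's MILP or RL method; the gamma-distributed makespans, the 100-day baseline, and the 0.9 target below are all hypothetical.

```python
import numpy as np

def project_buffer(scenario_makespans, baseline_makespan, on_time_prob=0.9):
    """Size a project buffer so that, over the sampled scenarios,
    P(completion <= baseline + buffer) >= on_time_prob."""
    target = np.quantile(scenario_makespans, on_time_prob)
    return max(0.0, target - baseline_makespan)

# Hypothetical data: 10,000 sampled project makespans around a 100-day baseline.
rng = np.random.default_rng(seed=1)
scenarios = 100 + rng.gamma(shape=2.0, scale=5.0, size=10_000)
buffer = project_buffer(scenarios, baseline_makespan=100, on_time_prob=0.9)
```

Solving the chance-constrained problem jointly, as the abstract advocates, can yield shorter schedules than appending such a buffer to a deterministic baseline; the sketch only shows how a buffer maps to an on-time probability over scenarios.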
The paper proposes a framework for thinking about digital technologies, including AI, in education. The framework combines Don Ihde's postphenomenology and Seymour Papert's constructionism; the former is rooted in the philosophy of technology, the latter in education and technology. The intersections between the two theories have been mentioned but not explored. There are biographical affinities between the two thinkers, and their groundbreaking works were published in adjacent years: Ihde's Technics and Praxis in 1979 and Papert's Mindstorms: Children, Computers, and Powerful Ideas in 1980. The intersection between the two theories is examined through a review of constructionism through the prism of Ihde's four postphenomenological relations, showing how each relation matches a major constructionist thread: for embodiment relations, personalization; for hermeneutic relations, computational thinking; for alterity relations, microworlds; and for background relations, democratization. The paper shows how the two theories contribute to each other and enrich the analysis.
To test silicon photonics component performance, a silicon (Si) grating coupler (GC) is used to couple light from a single-mode fiber (SMF) into the chip. However, silicon nitride (Si3N4) waveguides have recently become more popular for realizing photonic integrated circuits (PICs), owing to their exceptional characteristics, such as minimal absorption and low back reflection (BR) in the O-band spectrum. Thus, to test the photonic chip, a waveguide converter from Si3N4 to Si must be added to the photonic circuit, which can introduce additional power losses and BR. To avoid this conversion, we propose in this manuscript a GC configuration based on Si3N4 structures, which can be employed to minimize the footprint and obtain better performance. High efficiency was achieved by optimizing the structural properties of the waveguide and the coupling angle from the SMF. The results demonstrated high efficiency within the O-band spectrum at a wavelength of 1310 nm. Notably, at this specific wavelength, the findings indicated a coupling efficiency of −5.52 dB. The proposed GC design consists of a uniform grating that offers improvements in affordability and simplicity of manufacturing compared to other GC models; for instance, using a reflector or a GC with non-uniform grooved teeth introduces fabrication challenges and incurs higher costs. Thus, the proposed design can be useful for improving the testing capabilities of the Si3N4 photonic chips used in transceiver systems.
We will show that Reiter's default logic can be viewed as a particular instantiation of causal reasoning. This will be demonstrated by establishing back-and-forth translations between default theories and causal theories of the causal calculus under a particular nonmonotonic semantics of causal theories that will be called a default semantics. Moreover, it will be shown that Pearl's structural equation models can be viewed as default causal theories in this sense. We will also discuss some global consequences this representation could have for establishing a general role of causation in nonmonotonic reasoning.
We review spectroscopic methods developed for the determination of magnetic fields in high-energy-density (HED) plasmas. In such plasmas, the common Zeeman-splitting magnetic-field diagnostics are often impeded by various broadening mechanisms of the atomic transitions. The methods described, encompassing atomic transitions in the visible and ultraviolet spectral regions, are applied to the study of imploding plasmas (in a Z-pinch configuration) with and without pre-embedded magnetic fields, relativistic-electron focusing diodes, and plasma-opening switches. The measurements of the magnetic field in side-on observations of cylindrical-plasma configurations that are local in the radial direction despite the light integration along the chordal lines of sight are discussed. The evolution of the magnetic-field distributions obtained, together with the measurements of the plasma temperature and density, allows for studying the plasma dynamics, resistivity, and pressure and energy balance. In particular, for the Z-pinch, an intriguing question on the current flow in the imploding plasma was raised due to the observation that the current during stagnation mainly flows at relatively large radii, outside the stagnation region. For the premagnetized plasma implosions, all three components of the magnetic field (azimuthal, axial, and radial) were measured, yielding the evolution of the current flow and the efficiency of the axial field compression, as well as the relation between the geometry of the field and the plasma rotation, found to develop in this configuration. The measurements in the relativistic electron diode are used to quantify the shielding of the magnetic field by the plasmas in the diode. Also described are the experimental and theoretical investigations of a nondiffusive fast penetration of magnetic field into a low-density plasma (in the plasma-opening-switch configuration).
We study orthogonally additive operators on Riesz spaces. Our first result gives necessary and sufficient conditions on a pair of Riesz spaces (E, F) for which every orthogonally additive operator from E to F is laterally-to-order bounded. Our second result extends an analogue of Pitt's compactness theorem, obtained by the second and third named authors for narrow linear operators, to the setting of orthogonally additive operators. Our third result provides sufficient conditions on a pair of orthogonally additive operators S and T for the existence of S ∨ T, as well as of S ∧ T, without any assumption on the domain and range spaces. Finally, we prove an analogue of Meyer's theorem on the existence of the modulus of a disjointness preserving operator for orthogonally additive operators.
Industrial projects are plagued by uncertainties, often resulting in both time and cost overruns. This research introduces an innovative approach, employing Reinforcement Learning (RL), to address three distinct project management challenges within a setting of uncertain activity durations. The primary objective is to identify stable baseline schedules. The first challenge encompasses the multimode lean project management problem, wherein the goal is to maximize a project’s value function while adhering to both due date and budget chance constraints. The second challenge involves the chance-constrained critical chain buffer management problem in a multimode context. Here, the aim is to minimize the project delivery date while considering resource constraints and duration-chance constraints. The third challenge revolves around striking a balance between the project value and its net present value (NPV) within a resource-constrained multimode environment. To tackle these three challenges, we devised mathematical programming models, some of which were solved optimally. Additionally, we developed competitive RL-based algorithms and verified their performance against established benchmarks. Our RL algorithms consistently generated schedules that compared favorably with the benchmarks, leading to higher project values and NPVs and shorter schedules while staying within the stakeholders’ risk thresholds. The potential beneficiaries of this research are project managers and decision-makers who can use this approach to generate an efficient frontier of optimal project plans.
The built environment contributes to global carbon dioxide emissions through carbon-emitting building materials and construction processes. While achieving carbon-neutral construction is not feasible with conventional construction methods, microbial-based construction processes were suggested over three decades ago to reduce carbon dioxide emissions. With time, questions regarding scaling, predictability, and the applicability of microbial growth and biomass production emerged, and these still need to be resolved to enable manufacturing. In this opinion article, we discuss what can be achieved not to 'grow a building' per se but to 'grow environmentally friendly biocement'. Elaborate pathways leading to the formation of cementitious materials by genetically manipulable microorganisms have been described, providing options to enhance the suitability of these pathways for construction through synthetic biology and bio-convergence. These processes can also be combined with additional beneficial properties of cement-producing organisms, such as antimicrobial properties and carbon fixation by photosynthesis. Therefore, while we cannot yet 'grow a building', we can grow and design biocement for the construction industry.
Red blood cell (RBC) deformability, i.e., the cells' ability to change their shape, allows them to minimize their resistance to flow and optimize oxygen delivery to the tissues. RBCs with reduced deformability may cause increased vascular resistance, capillary occlusion, and impaired perfusion and oxygen delivery. A reduction in deformability, as occurs during RBC physiological aging and under blood storage, is implicated in the pathophysiology of diverse conditions with circulatory disorders and anemias. The change in RBC deformability is associated with metabolic and structural alterations that are mostly uncharacterized. To bridge this gap, we analyzed the membrane protein levels, using mass spectrometry, of RBCs with varying deformability, determined by image analysis. In total, 752 membrane proteins were identified. However, deformability was positively correlated with the levels of only fourteen proteins, with a highly significant inter-correlation between them. These proteins are involved in membrane rafting and/or the membrane-cytoskeleton linkage. These findings suggest that the reduction of deformability is a programmed (not arbitrary) process of remodeling and shedding of membrane fragments, possibly mirroring the formation of extracellular vesicles. The highly significant inter-correlation between the deformability-associated proteins implies that cell deformability can be assessed by determining the level of a few, possibly one, of them.
This article suggests several design principles intended to assist in the development of ethical algorithms, exemplified by the task of fighting fake news. Although numerous algorithmic solutions have been proposed, fake news remains a wicked socio-technical problem that demands not only engineering but also ethical consideration. We suggest employing insights from the ethics of care, while maintaining its speculative stance, to ask how algorithms and design processes would differ if they generated care and fought fake news. After reviewing the major characteristics of the ethics of care and the phases of care, we offer four algorithmic design principles. The first principle highlights the need for software designers to develop a strategy for dealing with fake news. The second principle calls for the involvement of various stakeholders in the design process in order to increase the chances of successfully fighting fake news. The third principle suggests allowing end-users to report fake news. Finally, the last principle proposes keeping end-users updated on the treatment of suspected news items. Implementing these principles as care practices can render the development process more ethically oriented and improve the ability to fight fake news.
Over the years, empirical evidence has shown that traffic enforcement reduces traffic violations, crashes, and casualties. However, less attention has been paid to enforcement coverage across different populations and driver characteristics. The current study develops and explores a method for estimating police enforcement coverage by comparing the share of drivers, across several characteristics, who received tickets from automatic speed and red-light cameras (an objective estimate of offenses committed) to the share of drivers who received tickets through manual police enforcement. Using data from all speeding and red-light tickets issued to Israeli drivers over a period of one and a half years, we found under-enforcement by police officers for female drivers, two-wheeled vehicle drivers (for speeding), and drivers with previous tickets. We found over-enforcement for younger drivers, truck drivers, and two-wheeled vehicle drivers (for red-light offenses). The findings suggest that the method developed in this research can identify groups of drivers who are over- or under-enforced. Police authorities can use this information to create evidence-based enforcement policies.
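The comparison behind this method can be sketched as a simple coverage index: a group's share of manually issued tickets divided by its share of camera-issued tickets, with cameras serving as the objective offense baseline. The shares below are invented for illustration and are not the study's data.

```python
def coverage_index(camera_share, manual_share):
    """Ratio of a group's share of manually issued tickets to its share of
    camera-issued tickets. An index above 1 suggests over-enforcement of
    that group; below 1 suggests under-enforcement."""
    return manual_share / camera_share

# Hypothetical (camera_share, manual_share) pairs per driver group:
groups = {"female drivers": (0.30, 0.21), "younger drivers": (0.25, 0.33)}
indices = {g: coverage_index(cam, man) for g, (cam, man) in groups.items()}
```

With these invented numbers, female drivers would appear under-enforced (index 0.7) and younger drivers over-enforced (index 1.32), mirroring the direction of the study's findings.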
Background:
Cognitive deficits in Parkinson's disease (PD) patients are well described; however, their underlying neural mechanisms, as assessed by electrophysiology, are not clear.
Objectives:
To reveal specific neural network alterations during the performance of cognitive tasks in PD patients using electroencephalography (EEG).
Methods:
Ninety participants (60 PD patients and 30 controls) underwent EEG recording while performing a GO/NOGO task. Source localization of 16 regions of interest known to play a pivotal role in the GO/NOGO task was performed to assess power density and connectivity within this cognitive network. The connectivity matrices were evaluated using a graph-theory approach that included measures of cluster-coefficient, degree, and global efficiency. A mixed-model analysis, corrected for age and levodopa equivalent daily dose, was performed to examine neural differences between PD patients and controls.
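As a rough illustration of the graph-theory measures named above (degree, cluster-coefficient, global efficiency), here is a minimal pure-Python sketch on a hypothetical binary adjacency matrix; an actual EEG analysis would use weighted, per-band connectivity matrices.

```python
from itertools import combinations

def degree(adj, i):
    """Number of connections of node i in a binary adjacency matrix."""
    return sum(adj[i])

def clustering_coefficient(adj, i):
    """Fraction of pairs of i's neighbours that are themselves connected."""
    nbrs = [j for j, a in enumerate(adj[i]) if a]
    if len(nbrs) < 2:
        return 0.0
    links = sum(adj[u][v] for u, v in combinations(nbrs, 2))
    return 2 * links / (len(nbrs) * (len(nbrs) - 1))

def global_efficiency(adj):
    """Average inverse shortest-path length over all ordered node pairs (BFS)."""
    n = len(adj)
    total = 0.0
    for s in range(n):
        dist = {s: 0}
        frontier = [s]
        while frontier:
            nxt = []
            for u in frontier:
                for v, a in enumerate(adj[u]):
                    if a and v not in dist:
                        dist[v] = dist[u] + 1
                        nxt.append(v)
            frontier = nxt
        total += sum(1 / d for node, d in dist.items() if node != s)
    return total / (n * (n - 1))

# Toy 4-node network: a triangle (0, 1, 2) plus a pendant node 3 attached to 0.
adj = [[0, 1, 1, 1],
       [1, 0, 1, 0],
       [1, 1, 0, 0],
       [1, 0, 0, 0]]
```

A higher cluster-coefficient, as reported for the PD group, indicates a more segregated (locally clustered) network organization.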
Results:
PD patients performed worse in the GO/NOGO task (P < 0.001). The power density was higher in δ and θ bands, but lower in α and β bands in PD patients compared to controls (interaction group × band: P < 0.001), indicating a general slowness within the network. Patients had more connections within the network (P < 0.034) than controls and these were used for graph-theory analysis. Differences between groups in graph-theory measures were found only in cluster-coefficient, which was higher in PD compared to controls (interaction group × band: P < 0.001).
Conclusions:
Cognitive deficits in PD are underpinned by alterations at the brain network level, including higher δ and θ activity, lower α and β activity, increased connectivity, and a more segregated network organization. These findings may have important implications for future adaptive deep brain stimulation. © 2023 The Authors. Movement Disorders published by Wiley Periodicals LLC on behalf of the International Parkinson and Movement Disorder Society.
Significance:
Diabetes is a prevalent disease worldwide that can cause severe health problems. Accurate blood glucose detection is crucial for diabetes management, and noninvasive methods can be more convenient and less painful than traditional finger-prick methods.
Aim:
We aim to report a noncontact speckle-based blood glucose measurement system that utilizes artificial intelligence (AI) data processing to improve glucose detection accuracy. The study also explores the influence of an alternating current (AC) induced magnetic field on the sensitivity and selectivity of blood glucose detection.
Approach:
The proposed blood glucose sensor consists of a digital camera, an AC-generated magnetic field source, a laser illuminating the subject's finger, and a computer. A magnetic field is applied to the finger, and a camera records the speckle patterns generated by the laser light reflected from the finger. The acquired video data are preprocessed for machine learning (ML) and deep neural networks (DNNs) to classify blood plasma glucose levels. The standard finger-prick method is used as a reference for blood glucose level classification.
Results:
The study found that the noncontact speckle-based blood glucose measurement system with AI data processing allows for the detection of blood plasma glucose levels with high accuracy. The ML approach gives better results than the tested DNNs as the proposed data preprocessing is highly selective and efficient.
Conclusions:
The proposed noncontact blood glucose sensing mechanism utilizing AI data processing and a magnetic field can potentially improve glucose detection accuracy, making measurement more convenient and less painful for patients. The system also allows for an inexpensive blood glucose sensing mechanism and fast blood glucose screening. The results suggest that noninvasive methods can improve blood glucose detection accuracy, which can have significant implications for diabetes management. Investigations involving representative sampling data, including subjects of different ages, genders, races, and health statuses, could allow for further improvement.
Multi-agent task allocation in physical environments with spatial and temporal constraints is a hard problem that is relevant to many realistic applications. A task allocation algorithm based on Fisher market clearing (FMC_TA), which can be performed either centrally or distributedly, has been shown to produce high-quality allocations in comparison to both centralized and distributed state-of-the-art incomplete optimization algorithms. However, the algorithm is synchronous and therefore depends on perfect communication between agents. We propose FMC_ATA, an asynchronous version of FMC_TA that is robust to message latency and message loss. In contrast to the former version of the algorithm, FMC_ATA allows agents to identify dynamic events and initiate the generation of an updated allocation, making it better suited to dynamic environments. We further investigate the conditions under which the distributed version of the algorithm is preferred over the centralized version. Our results indicate that the proposed asynchronous distributed algorithm produces consistent results even when the communication level is extremely poor.
We find the intervals [α, β(α)] such that if a univariate real polynomial or entire function f(z) = a_0 + a_1 z + a_2 z^2 + ⋯ with positive coefficients satisfies the conditions q_k(f) := a_{k-1}^2 / (a_{k-2} a_k) ∈ [α, β(α)] for all k ≥ 2, then f belongs to the Laguerre–Pólya class. For instance, from J.I. Hutchinson's theorem, one can observe that f belongs to the Laguerre–Pólya class (has only real zeros) when q_k(f) ∈ [4, +∞) for all k ≥ 2. We are interested in finding those intervals which are not subsets of [4, +∞).
MSC Classification: 30C15; 30D15; 30D35; 26C10
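The ratios q_k(f) = a_{k-1}^2 / (a_{k-2} a_k) from this abstract are easy to check numerically. As a hedged illustration, the sketch below uses exact rational arithmetic on the hypothetical coefficient sequence a_k = 4^(-k(k-1)/2), for which every ratio equals 4, the boundary of Hutchinson's interval [4, +∞).

```python
from fractions import Fraction

def hutchinson_ratios(coeffs):
    """q_k(f) = a_{k-1}^2 / (a_{k-2} * a_k) for k = 2, ..., len(coeffs) - 1."""
    return [coeffs[k - 1] ** 2 / (coeffs[k - 2] * coeffs[k])
            for k in range(2, len(coeffs))]

# Hypothetical coefficients a_k = 4^(-k(k-1)/2), k = 0..5, kept as exact
# fractions so the ratios come out exactly.
coeffs = [Fraction(1, 4 ** (k * (k - 1) // 2)) for k in range(6)]
ratios = hutchinson_ratios(coeffs)
```

By Hutchinson's theorem, a series whose ratios all lie in [4, +∞), like this one, has only real zeros; the paper's question is which intervals [α, β(α)] outside [4, +∞) still guarantee membership in the Laguerre–Pólya class.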
Project management aims to deliver a project with the desired quality, within budget and on schedule, while achieving customer satisfaction. Project management in municipal infrastructure involves accompanying the developer (the municipal authority), managing planning, approving budgets with the relevant regulators, identifying and removing bureaucratic barriers, coordinating stakeholders, and supervising the contractor until project delivery to the client. In this paper, we examine the performance of municipal projects and the degree of stakeholder involvement; the analysis focused on 50 municipal infrastructure projects carried out in Israel and completed in the last two years. We used Data Envelopment Analysis (DEA) to examine the relative efficiency of project implementation. The analysis considers the projects' success and characteristics, using criteria such as schedule overrun, stakeholder involvement, residents' complaints, risks, and uncertainty at the beginning of the project. Preliminary results indicate that stakeholders and residents, and their degree of involvement, contribute to the relative efficiency of projects. Therefore, this research can contribute to the success of municipal infrastructure projects by informing a policy for allocating resources to such projects according to the level of stakeholder involvement and its impact.
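Full DEA solves a linear program per project, but in the single-input, single-output special case the input-oriented CCR efficiency reduces to each project's output/input ratio normalized by the best ratio in the sample. The sketch below uses that simplification; the budgets and scope scores are invented, not the study's data.

```python
def dea_ccr_single(inputs, outputs):
    """Input-oriented CCR (constant-returns-to-scale) efficiency for the
    single-input, single-output case: each unit's output/input ratio divided
    by the best ratio observed. Efficient units score 1."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical projects: input = budget, output = delivered-scope score.
budgets = [10, 8, 12, 6]
scores = [7, 8, 9, 3]
eff = dea_ccr_single(budgets, scores)
```

With multiple inputs and outputs (schedule overrun, complaints, risk), each project's score instead comes from its own linear program, but the interpretation of a score of 1 as relative efficiency is the same.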
Data-driven economic tasks have gained significant attention in economics, allowing researchers and policymakers to make better decisions and design efficient policies. Recently, with the advancement of machine learning (ML) and other artificial intelligence (AI) methods, researchers can solve complex economic tasks with previously unseen performance and ease. However, using such methods requires a non-trivial level of expertise in ML or AI, which is currently not standard knowledge in economics. To bridge this gap, automated machine learning (AutoML) models have been developed, allowing non-experts to efficiently use advanced ML models with their data. Nonetheless, not all AutoML models are created equal, particularly with respect to the unique properties of economic data. In this paper, we present a benchmarking study of biologically inspired and other AutoML techniques for economic tasks. We evaluate four different AutoML models alongside two baseline methods on a set of 50 diverse economic tasks. Our results show that biologically inspired AutoML models slightly outperformed non-biological AutoML models on economic tasks, while all AutoML models outperformed the traditional methods. Based on our results, we conclude that biologically inspired AutoML has the potential to improve our economic understanding while shifting a large portion of the analysis burden from the economist to a computer.
This paper presents a new design for a 1 × 4 optical power splitter using a multimode interference (MMI) coupler in silicon nitride (Si3N4) strip waveguide structures. The main goal of the proposed design is to use Si3N4 to address the back reflection (BR) effect that usually occurs in silicon (Si) MMI devices due to the self-imaging effect and the higher index contrast between Si and silicon dioxide (SiO2). The optimal device parameters were determined through numerical optimizations using the beam propagation method (BPM) and the finite-difference time-domain (FDTD) method. The results demonstrate that the power splitter, with a length of 34.6 μm, achieves nearly equal power distribution, delivering up to 24.3% of the total power at each output port across the O-band spectrum with 0.13 dB insertion loss, and shows good tolerance to MMI coupler parameter shifts of ±250 nm. Additionally, the back reflection over the O-band was found to be in the range of 40.25-42.44 dB. This demonstrates the effectiveness of combining the Si3N4 MMI with adiabatic input and output tapers in mitigating unwanted BR to ensure that a good signal is received from the laser. This design shows significant potential for data-center networks, offering a promising solution for efficient signal distribution and facilitating high-performance, reliable optical signal routing within the O-band range. By leveraging the advantages of Si3N4 and the MMI coupler, this design opens possibilities for advanced optical network architectures and enables efficient transmission of optical signals in the O-band range.
Information
Address
52 Golomb St., 5883754, H̱olon, Israel
Website
https://www.hit.ac.il/