Search results for: quantum key distribution systems
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 14339

9629 Next Generation of Tunnel Field Effect Transistor: NCTFET

Authors: Naima Guenifi, Shiromani Balmukund Rahi, Amina Bechka

Abstract:

The tunnel FET is one of the most suitable alternative FET devices to conventional CMOS technology for low-power electronics and applications. Owing to its lower subthreshold swing (SS), it is a strong candidate for low-power applications. It is a quantum FET device that relies on band-to-band (B2B) tunneling of charge carriers. Because of band-to-band tunneling, the tunnel FET suffers from a lower switching current than the conventional metal-oxide-semiconductor field-effect transistor (MOSFET). To improve on these device limitations, the recently introduced negative capacitance concept based on ferroelectric material is implemented in the conventional tunnel FET structure, popularly known as the NC TFET. The present research work implements the idea of a high-k gate dielectric combined with ferroelectric material on a double-gate tunnel FET to realize negative capacitance. It has been observed that negative capacitance further improves device features such as the SS value, and it helps to reduce power dissipation and switching energy. An extensive investigation of the digital, analog/RF, and linearity features of the double-gate NCTFET has been carried out in this work. Several essential design parameters for analog/RF and linearity performance, such as the transconductance (gm), the transconductance generation factor (gm/IDS), its higher-order derivatives (gm2, gm3), the cut-off frequency (fT), and the gain-bandwidth product (GBW), have been investigated for low-power RF applications. The VIP₂, VIP₃, IMD₃, IIP₃, distortion characteristics (HD2, HD3), 1-dB compression point, delay, and power-delay product performance have also been thoroughly studied.

Keywords: analog/digital, ferroelectric, linearity, negative capacitance, Tunnel FET, transconductance

Procedia PDF Downloads 201
9628 The Use of Degradation Measures to Design Reliability Test Plans

Authors: Stephen V. Crowder, Jonathan W. Lane

Abstract:

With short product development times, there is an increased need to demonstrate product reliability relatively quickly with minimal testing. In such cases there may be few, if any, observed failures. It may thus be difficult to assess reliability using traditional reliability test plans that measure only time (or cycles) to failure. For many components, degradation measures contain important information about performance and reliability. These measures can be used to design a minimal test plan, in terms of the number of units placed on test and the duration of the test, necessary to demonstrate a reliability goal. In this work we present a case study involving an electronic component subject to degradation. The data, consisting of 42 degradation paths measured in cycles to failure, are first used to estimate a reliability function. Bootstrapping techniques are then used to perform power studies and develop a minimal reliability test plan for future production of this component.
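The percentile-bootstrap step described above can be sketched in a few lines of Python. The failure-time data below are illustrative, not the study's 42 degradation paths:

```python
import random

def reliability_at(t, failure_times):
    """Empirical reliability R(t): fraction of units surviving past t."""
    return sum(1 for x in failure_times if x > t) / len(failure_times)

def bootstrap_ci(failure_times, t, n_boot=2000, alpha=0.10, seed=1):
    """Percentile-bootstrap confidence interval for R(t)."""
    rng = random.Random(seed)
    n = len(failure_times)
    estimates = []
    for _ in range(n_boot):
        # Resample the observed failure times with replacement
        resample = [rng.choice(failure_times) for _ in range(n)]
        estimates.append(reliability_at(t, resample))
    estimates.sort()
    lo = estimates[int(alpha / 2 * n_boot)]
    hi = estimates[int((1 - alpha / 2) * n_boot) - 1]
    return reliability_at(t, failure_times), (lo, hi)

# Illustrative cycles-to-failure data (hypothetical, for demonstration only)
data = [510, 620, 700, 540, 800, 650, 720, 580, 690, 610]
point, (lo, hi) = bootstrap_ci(data, t=600)
```

Repeating this over candidate sample sizes and test durations gives the power-study grid from which a minimal test plan can be read off.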

Keywords: degradation measure, time to failure distribution, bootstrap, computational science

Procedia PDF Downloads 538
9627 Record Peak Current Density in AlN/GaN Double-Barrier Resonant Tunneling Diodes on Free-Standing GaN Substrates by Modulating Barrier Thickness

Authors: Fang Liu, Jia Jia Yao, Guan Lin Wu, Ren Jie Liu, Zhuang Guo

Abstract:

Leveraging plasma-assisted molecular beam epitaxy (PA-MBE) on c-plane free-standing GaN substrates, this work demonstrates high-performance AlN/GaN double-barrier resonant tunneling diodes (RTDs) featuring stable and repeatable negative differential resistance (NDR) characteristics at room temperature. By scaling down the AlN barrier thickness and the lateral mesa size of the collector, a record peak current density of 1551 kA/cm² is achieved, accompanied by a peak-to-valley current ratio (PVCR) of 1.24. This can be attributed to the reduced resonant tunneling time under the thinner AlN barrier and the suppression of the external incoherent valley current by reducing the number of dislocations contained in the RTD device with the smaller collector size. Statistical analysis of the NDR performance of RTD devices with different AlN barrier thicknesses reveals that, as the AlN barrier thickness decreases from 1.5 nm to 1.25 nm, the average peak current density increases from 145.7 kA/cm² to 1215.1 kA/cm², while the average PVCR decreases from 1.45 to 1.1 and the peak voltage drops from 6.89 V to 5.49 V. The peak current density obtained in this work represents the highest value reported for nitride-based RTDs to date, while simultaneously maintaining a high PVCR. This illustrates that an ultra-scaled RTD based on a vertical quantum-well structure and lateral collector size is a valuable approach for the development of nitride-based RTDs with excellent NDR characteristics, revealing their great potential for applications in high-frequency oscillation sources and high-speed switching circuits.
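As a side note on the reported figures, the PVCR is simply the ratio of the peak current to the valley current that follows it in the I-V sweep. A minimal Python sketch on illustrative (not measured) data:

```python
def pvcr(currents):
    """Peak-to-valley current ratio from an I-V sweep exhibiting NDR.

    Takes the global current peak and the minimum (valley) that follows it;
    a real measured sweep may require local-extremum detection instead.
    """
    peak = max(currents)
    i = currents.index(peak)
    valley = min(currents[i:])
    return peak / valley

# Illustrative current densities (kA/cm2) across a bias sweep, not device data
j = [300.0, 900.0, 1551.0, 1400.0, 1250.0]
ratio = pvcr(j)   # 1551 / 1250, roughly 1.24
```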

Keywords: GaN resonant tunneling diode, peak current density, peak-to-valley current ratio, negative differential resistance

Procedia PDF Downloads 66
9626 Optimizing the Design Parameters of Acoustic Power Transfer Model to Achieve High Power Intensity and Compact System

Authors: Ariba Siddiqui, Amber Khan

Abstract:

The need for bio-implantable devices in the field of medical sciences has been increasing day by day; however, charging these devices is a major issue. Batteries, a very common method of powering implants, have a limited lifetime and bulky nature. Therefore, as a replacement for batteries, acoustic power transfer (APT) technology is currently accepted as the most suitable technique for wirelessly powering medical implants. The basic model of APT consists of piezoelectric transducers that work on the principle of the converse piezoelectric effect at the transmitting end and the direct piezoelectric effect at the receiving end. This paper provides mechanistic insight into the parameters affecting the design and efficient working of acoustic power transfer systems. The optimum design considerations are presented, which help to compress the size of the device and augment the intensity of the pressure wave. A COMSOL model of the PZT (lead zirconate titanate) transducer was developed. The model was simulated and analyzed over a frequency spectrum. The simulation results showed that the efficiency of these devices is strongly dependent on the operating frequency: a wrong choice of operating frequency leads to high absorption of the acoustic field inside the tissue (medium), poor power strength, and heavy transducers, which in turn influence the overall configuration of the acoustic system. Considering all the tradeoffs, the simulations were performed again at an optimum frequency (900 kHz), which reduced the transducer's thickness to 1.96 mm and augmented the power strength with an intensity of 432 W/m². The results obtained after the second simulation thus exhibit lower attenuation, a lightweight system, and high power intensity, and also comply with the safety limits set by the U.S. Food and Drug Administration (FDA). It was also found that the chosen operating frequency enhances the directivity of the acoustic wave at the receiver side.
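The link between operating frequency and transducer thickness follows from the half-wavelength thickness-mode condition t = c/(2f). A rough Python check, assuming a longitudinal sound speed of about 3530 m/s in PZT (a value that varies by composition and is an assumption here, not taken from the paper), lands close to the reported 1.96 mm:

```python
def half_wave_thickness(c_mps, f_hz):
    """Thickness-mode resonance design rule: t = c / (2 f), in metres."""
    return c_mps / (2.0 * f_hz)

# Assumed longitudinal sound speed in PZT (~3530 m/s; composition-dependent)
t = half_wave_thickness(3530.0, 900e3)   # about 1.96e-3 m at 900 kHz
```

Raising the frequency thins the transducer but increases tissue absorption, which is the tradeoff the abstract describes.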

Keywords: acoustic power, bio-implantable, COMSOL, Lead Zirconate Titanate, piezoelectric, transducer

Procedia PDF Downloads 178
9625 Gig Economy Development Trends in Georgia

Authors: Nino Grigolaia

Abstract:

The paper discusses the importance of the development of the gig economy for the economy of Georgia, analyzes gig economy development trends, and identifies the main challenges in this field. Objective. The objective of the study is to assess the role of the gig economy, identify the main challenges, and develop recommendations. Methodology. Analysis, synthesis, comparison, induction, and other methods are used; a desk study has been conducted. Findings. The advantages and disadvantages of the gig economy are identified, and the impact of the changes caused by the development of the gig economy on labor relations and employment is determined. It is argued that the ongoing technological changes have led to the emergence of new global trends in the labor market and increased the inequality of income distribution. Conclusions. Based on the analysis of the gig economy in the world and in Georgia, relevant recommendations are proposed, namely: establishing a new system for regulating the incomes of employees in this field, developing a real social protection mechanism, and developing political and legal instruments for the regulation of the gig economy.

Keywords: gig economy, economy of Georgia, digital platforms, labor relations

Procedia PDF Downloads 72
9624 Accountability of Artificial Intelligence: An Analysis Using Edgar Morin’s Complex Thought

Authors: Sylvie Michel, Sylvie Gerbaix, Marc Bidan

Abstract:

Can artificial intelligence (AI) be held accountable for its detrimental impacts? This question gains heightened relevance given AI's pervasive reach across various domains, magnifying its power and potential. The expanding influence of AI raises fundamental ethical inquiries, primarily centering on biases, responsibility, and transparency. This encompasses discriminatory biases arising from algorithmic criteria or data, accidents attributed to autonomous vehicles or other systems, and the imperative of transparent decision-making. This article aims to stimulate reflection on AI accountability, denoting the necessity to elucidate the effects AI generates. Accountability comprises two integral aspects: adherence to legal and ethical standards, and the imperative to elucidate the underlying operational rationale. The objective is to initiate a reflection on the obstacles to this "accountability" in the face of the complexity of artificial intelligence systems and their effects. The article then proposes to mobilize Edgar Morin's complex thought to encompass and face the challenges of this complexity. The first contribution is to point out the challenges posed by the complexity of AI, with accountability fractured among a myriad of human and non-human actors, such as software and equipment, which ultimately contribute to the decisions taken and are multiplied in the case of AI. Accountability faces three challenges resulting from the complexity of the ethical issues combined with the complexity of AI. The challenge of the non-neutrality of algorithmic systems, as fully ethically non-neutral actors, is put forward by a revealing-ethics approach that calls for assigning responsibilities to these systems. The challenge of the dilution of responsibility is induced by the multiplicity of, and distance between, the actors.
Thus, a dilution of responsibility is induced by a split in decision-making between developers, who feel they fulfill their duty by strictly respecting the requests they receive, and management, which does not consider itself responsible for technology-related flaws. Accountability is also confronted with the challenge of the transparency of complex and scalable algorithmic systems: non-human actors self-learning via big data. A second contribution involves leveraging E. Morin's principles, providing a framework to grasp the multifaceted ethical dilemmas and subsequently paving the way for establishing accountability in AI. When addressing the ethical challenge of biases, the "hologrammatic" principle underscores the imperative of acknowledging the ethical non-neutrality of algorithmic systems, inherently imbued with the values and biases of their creators and of society. The "dialogic" principle advocates for the responsible consideration of ethical dilemmas, encouraging the integration of complementary and contradictory elements in solutions from the very inception of the design phase. The principle of organizational recursiveness, akin to the "transparency" of the system, promotes a systemic analysis that accounts for induced effects and guides the incorporation of modifications into the system to rectify its drifts. In conclusion, this contribution serves as a starting point for contemplating the accountability of artificial intelligence systems despite their evident ethical implications and potential deviations. Edgar Morin's principles, providing a lens through which to contemplate this complexity, offer valuable perspectives for addressing these challenges concerning accountability.

Keywords: accountability, artificial intelligence, complexity, ethics, explainability, transparency, Edgar Morin

Procedia PDF Downloads 67
9623 Comparative Analysis of Soil Enzyme Activities between Laurel-Leaved and Cryptomeria japonica Forests

Authors: Ayuko Itsuki, Sachiyo Aburatani

Abstract:

Soil enzyme activities in Kasuga-yama Hill Primeval Forest (Nara, Japan) were examined to determine levels of mineralization and metabolism. Samples were selected from the soil surrounding laurel-leaved (BB-1) and Carpinus japonica (BB-2 and Pw) trees for analysis. Cellulase, β-xylosidase, and protease activities were higher in BB-1 samples than in BB-2 samples. These activity levels corresponded to the distribution of cellulose and hemicellulose in the soil horizons. Cellulase, β-xylosidase, and chymotrypsin activities were higher in soil from the Pw forest than in that from the BB-2 forest. The relationships between the soil enzymes, calculated by Spearman's rank correlation, indicate that the interactions between enzymes in BB-2 samples were more complex than those in Pw samples.
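Spearman's rank correlation, used above to relate enzyme activities, is the Pearson correlation computed on ranks. A self-contained Python sketch with hypothetical activity profiles (not the study's measurements):

```python
def rank(values):
    """1-based average ranks; ties share the mean of their rank positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the ranks of x and y."""
    rx, ry = rank(x), rank(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical activity profiles of two enzymes across five soil horizons
cellulase = [4.2, 3.1, 2.5, 1.8, 1.0]
protease = [3.9, 3.3, 2.1, 1.6, 0.9]
rho = spearman(cellulase, protease)   # both decrease monotonically: rho = 1.0
```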

Keywords: comparative analysis, enzyme activities, forest soil, Spearman's rank correlation

Procedia PDF Downloads 598
9622 A General Framework for Measuring the Internal Fraud Risk of an Enterprise Resource Planning System

Authors: Imran Dayan, Ashiqul Khan

Abstract:

Internal corporate fraud, which is fraud carried out by internal stakeholders of a company, affects the well-being of the organisation just like its external counterpart. Even if such an act is carried out for the short-term benefit of a corporation, the act is ultimately harmful to the entity in the long run. Internal fraud often relies upon aberrations from usual business processes. Business processes are the lifeblood of a company in the modern managerial context. Such processes are developed and fine-tuned over time as a corporation grows through its life stages. Modern corporations have embraced technological innovations in their business processes, and Enterprise Resource Planning (ERP) systems being at the heart of such processes is a testimony to that. Since ERP systems record a huge amount of data in their event logs, the logs are a treasure trove for anyone trying to detect fraudulent activities hidden within day-to-day business operations and processes. This research utilises the ERP systems in place within corporations to assess the likelihood of prospective internal fraud by developing a framework for measuring fraud risk through process mining techniques, and hence finds risky designs and loose ends within these business processes. This framework helps not only in identifying existing cases of fraud in the records of the event log, but also signals the overall riskiness of certain business processes, drawing attention to the redesign of such processes to reduce the chance of future internal fraud while improving internal control within the organisation.
The research adds value by applying the concepts of Process Mining into the analysis of data from modern day applications of business process records, which is the ERP event logs, and develops a framework that should be useful to internal stakeholders for strengthening internal control as well as provide external auditors with a tool of use in case of suspicion. The research proves its usefulness through a few case studies conducted with respect to big corporations with complex business processes and an ERP in place.
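A minimal illustration of the event-log idea: flag cases whose activity sequence deviates from an expected process model. The purchase-to-pay activities and the log below are hypothetical, and real conformance checking (e.g., with dedicated process mining tooling) is far richer than this sketch:

```python
from collections import defaultdict

# Hypothetical allowed transitions of a purchase-to-pay process;
# in practice the model would come from process discovery, not be hand-written.
ALLOWED = {
    "create_po": {"approve_po"},
    "approve_po": {"receive_goods"},
    "receive_goods": {"post_invoice"},
    "post_invoice": {"pay_invoice"},
}

def flag_deviant_cases(event_log):
    """Return case ids whose activity sequence leaves the allowed model."""
    traces = defaultdict(list)
    for case_id, activity in event_log:   # rows: (case, activity), in order
        traces[case_id].append(activity)
    deviant = set()
    for case_id, acts in traces.items():
        for prev, nxt in zip(acts, acts[1:]):
            if nxt not in ALLOWED.get(prev, set()):
                deviant.add(case_id)
    return deviant

log = [
    (1, "create_po"), (1, "approve_po"), (1, "receive_goods"),
    (1, "post_invoice"), (1, "pay_invoice"),
    (2, "create_po"), (2, "receive_goods"),   # approval step skipped
    (2, "post_invoice"), (2, "pay_invoice"),
]
risky = flag_deviant_cases(log)   # case 2 skipped approval
```

Counting such deviations per process, weighted by severity, is one simple way to turn event-log evidence into a riskiness score of the kind the framework proposes.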

Keywords: enterprise resource planning, fraud risk framework, internal corporate fraud, process mining

Procedia PDF Downloads 340
9621 Use of Soil Microorganisms for the Production of Electricity through Microbial Fuel Cells

Authors: Abhipsa Mohanty, Harit Jha

Abstract:

The world's energy demands continue to rise, resulting in a worldwide energy crisis and environmental pollution. Because of its finite, declining supply and environmental damage, reliance on fossil fuels is unsustainable. As a result, experts are concentrating on alternative, renewable, and carbon-free energy sources. Energy sources that are both environmentally and economically sustainable are required. Microbial fuel cells (MFCs) have recently received a lot of attention due to their low operating temperatures and ability to use a variety of biodegradable substrates as fuel. There are single-chamber MFCs as well as traditional MFCs with anode and cathode compartments. Bioelectricity is produced when microorganisms actively catabolize the substrate. MFCs can be used as a power source in small devices like biosensors. For MFCs to be cost-effective and to increase electricity production, understanding of their components, microbiological processes, limiting variables, and construction designs must be improved, and large-scale systems must be developed. The purpose of this research was to review current knowledge of the microbiology of bioelectricity generation. The manufacturing process, the materials and procedures used to construct the technology, and the applications of MFC technology are all covered.

Keywords: bio-electricity, exoelectrogenic bacteria, microbial fuel cells, soil microorganisms

Procedia PDF Downloads 96
9620 Internet of Things: Route Search Optimization Applying Ant Colony Algorithm and Theory of Computer Science

Authors: Tushar Bhardwaj

Abstract:

Internet of Things (IoT) networks are dynamic: network nodes (mobile devices) are added and removed constantly and randomly, so the traffic distribution in the network is variable and irregular. A basic but very important task in any network is route searching. There are conventional route-searching algorithms, such as link-state and distance-vector algorithms, but they are restricted to static point-to-point network topologies. In this paper we propose a model that uses the ant colony algorithm for route searching. It is dynamic in nature and has a positive feedback mechanism well suited to route searching. We have also embedded the concept of non-deterministic finite automata (NDFA) minimization to reduce the network size and increase performance. Results show that the ant colony algorithm gives the shortest path from the source to the destination node, and NDFA minimization reduces the broadcasting storm effectively.
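A toy Python sketch of ant colony route search on a small weighted graph. The pheromone constants and the graph are illustrative, not the paper's model; real IoT routing would also have to handle node churn:

```python
import random

def aco_shortest_path(graph, src, dst, n_ants=20, n_iter=50,
                      evap=0.5, q=1.0, seed=0):
    """Toy ant colony optimisation for a shortest path on a weighted digraph."""
    rng = random.Random(seed)
    tau = {(u, v): 1.0 for u in graph for v in graph[u]}   # pheromone levels
    best, best_len = None, float("inf")
    for _ in range(n_iter):
        walks = []
        for _ in range(n_ants):
            node, path, seen = src, [src], {src}
            while node != dst:
                choices = [v for v in graph[node] if v not in seen]
                if not choices:          # dead end: discard this ant
                    path = None
                    break
                # transition probability: pheromone x heuristic (1/edge weight)
                weights = [tau[(node, v)] / graph[node][v] for v in choices]
                node = rng.choices(choices, weights=weights)[0]
                path.append(node)
                seen.add(node)
            if path:
                length = sum(graph[a][b] for a, b in zip(path, path[1:]))
                walks.append((path, length))
                if length < best_len:
                    best, best_len = path, length
        for key in tau:                  # evaporation (positive feedback decay)
            tau[key] *= (1 - evap)
        for path, length in walks:       # deposit, stronger on shorter paths
            for edge in zip(path, path[1:]):
                tau[edge] += q / length
    return best, best_len

graph = {
    "A": {"B": 1, "C": 4},
    "B": {"C": 1, "D": 5},
    "C": {"D": 1},
    "D": {},
}
path, dist = aco_shortest_path(graph, "A", "D")   # A-B-C-D, length 3
```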

Keywords: routing, ant colony algorithm, NDFA, IoT

Procedia PDF Downloads 446
9619 Development of Vapor Absorption Refrigeration System for Mini-Bus Car’s Air Conditioning: A Two-Fluid Model

Authors: Yoftahe Nigussie

Abstract:

This research explores the implementation of a vapor absorption refrigeration system (VARS) in mini-bus cars to enhance air conditioning efficiency. The conventional vapor compression refrigeration system (VCRS) in vehicles relies on mechanical work from the engine, leading to increased fuel consumption. The proposed VARS aims to utilize waste heat and exhaust gas from the internal combustion engine to cool the mini-bus cabin, thereby reducing fuel consumption and atmospheric pollution. The project involves two models: Model 1, a two-fluid vapor absorption system (VAS), and Model 2, a three-fluid VAS. Model 1 uses ammonia (NH₃) and water (H₂O) as refrigerants, where water absorbs ammonia rapidly, producing a cooling effect. The absorption cycle operates on the principle that absorbing ammonia in water decreases vapor pressure. The ammonia-water solution undergoes cycles of desorption, condensation, expansion, and absorption, facilitated by a generator, condenser, expansion valve, and absorber. The objectives of this research include reducing atmospheric pollution, minimizing air conditioning maintenance costs, lowering capital costs, enhancing fuel economy, and eliminating the need for a compressor. The comparison between vapor absorption and compression systems reveals advantages such as smoother operation, fewer moving parts, and the ability to work at lower evaporator pressures without affecting the Coefficient of Performance (COP). The proposed VARS demonstrates potential benefits for mini-bus air conditioning systems, providing a sustainable and energy-efficient alternative. By utilizing waste heat and exhaust gas, this system contributes to environmental preservation while addressing economic considerations for vehicle owners. Further research and development in this area could lead to the widespread adoption of vapor absorption technology in automotive air conditioning systems.

Keywords: room, zone, space, thermal resistance

Procedia PDF Downloads 77
9618 Artificial Intelligence Protecting Birds against Collisions with Wind Turbines

Authors: Aleksandra Szurlej-Kielanska, Lucyna Pilacka, Dariusz Górecki

Abstract:

The dynamic development of wind energy requires the simultaneous implementation of effective systems minimizing the risk of collisions between birds and wind turbines. Wind turbines are installed in more and more challenging locations, often close to the natural environment of birds. More and more countries and organizations are defining guidelines for the necessary functionality of such systems. The minimum bird detection distance, trajectory tracking, and shutdown time are key factors in eliminating collisions. Since 2020, we have continued the survey on the validation of successive versions of the BPS detection and reaction system. The bird protection system (BPS) is a fully automatic camera system that estimates the distance of a bird to the turbine, classifies its size, and autonomously undertakes various actions depending on the bird's distance and flight path. The BPS was installed and tested in a real environment at wind turbines in northern Poland and central Spain. The validation showed that at distances of up to 300 m the BPS performs at least as well as a skilled ornithologist, and large bird species are successfully detected from over 600 m. In addition, data collected by BPS systems installed in Spain showed that 60% of the detections of birds of prey were of individuals approaching the turbine, and these detections met the turbine shutdown criteria. Less than 40% of the detections of birds of prey took place at wind speeds below 2 m/s, while the turbines were not working. As shown by the analysis of the data collected by the system over 12 months, the system correctly classified the size of birds with a wingspan of more than 1.1 m in 90% of cases and the size of birds with a wingspan of 0.7-1 m in 80% of cases. The collected data also allow the conclusion that some species keep a certain distance from the turbines at wind speeds of over 8 m/s (Aquila sp., Buteo sp., Gyps sp.), but Gyps sp. and Milvus sp. remained active at this wind speed in the tested area. The data collected so far indicate that BPS is effective in detecting and stopping wind turbines in response to the presence of birds of prey with a wingspan of more than 1 m.

Keywords: protecting birds, birds monitoring, wind farms, green energy, sustainable development

Procedia PDF Downloads 79
9617 Characterization of Inkjet-Printed Carbon Nanotube Electrode Patterns on Cotton Fabric

Authors: N. Najafi, Laleh Maleknia, M. E. Olya

Abstract:

An aqueous conductive ink of single-walled carbon nanotubes (SWCNTs) for inkjet printing was formulated. To prepare a homogeneous SWCNT ink with particles small enough not to block a commercial inkjet printer nozzle, we used a kinetic ball-milling process to disperse the SWCNTs in an aqueous suspension. When a patterned electrode was overlaid by repeated inkjet printings of the ink on various types of fabric, the fabric resistance decreased rapidly following a power law, reaching approximately 760 Ω/sq, the lowest value reported so far for a dozen printings. The Raman and Fourier transform infrared spectra revealed that oxidation of the SWCNTs was the source of the doped impurities. This study also proved that the droplet ejection velocity can have an impact on the CNT distribution and consequently on the electrical performance of the ink.
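The reported power-law decrease of sheet resistance with the number of printings can be fitted by linear least squares in log-log space. A Python sketch on synthetic (not measured) data, where the prefactor and exponent are invented for illustration:

```python
import math

def fit_power_law(n_vals, r_vals):
    """Least-squares fit of R = a * n**(-b) via regression in log-log space."""
    xs = [math.log(n) for n in n_vals]
    ys = [math.log(r) for r in r_vals]
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    b = -slope                         # decay exponent
    a = math.exp(my - slope * mx)      # prefactor
    return a, b

# Synthetic sheet resistance (ohm/sq) vs number of printed layers,
# generated from an assumed power law so the fit can be checked exactly.
n = [1, 2, 4, 8, 12]
r = [12000.0 * k ** -1.1 for k in n]
a, b = fit_power_law(n, r)
```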

Keywords: inkjet printing, carbon nanotube, fabric ink, cotton fabric, Raman spectroscopy, Fourier transform infrared spectroscopy, dozen printings

Procedia PDF Downloads 426
9616 Algorithmic Obligations: Proactive Liability for AI-Generated Content and Copyright Compliance

Authors: Aleksandra Czubek

Abstract:

As AI systems increasingly shape content creation, existing copyright frameworks face significant challenges in determining liability for AI-generated outputs. Current legal discussions largely focus on who bears responsibility for infringing works, be it developers, users, or entities benefiting from AI outputs. This paper introduces a novel concept of algorithmic obligations, proposing that AI developers be subject to proactive duties that ensure their models prevent copyright infringement before it occurs. Building on principles of obligations law traditionally applied to human actors, the paper suggests a shift from reactive enforcement to proactive legal requirements. AI developers would be legally mandated to incorporate copyright-aware mechanisms within their systems, turning optional safeguards into enforceable standards. These obligations could vary in implementation across international, EU, UK, and U.S. legal frameworks, creating a multi-jurisdictional approach to copyright compliance. This paper explores how the EU’s existing copyright framework, exemplified by the Copyright Directive (2019/790), could evolve to impose a duty of foresight on AI developers, compelling them to embed mechanisms that prevent infringing outputs. By drawing parallels to GDPR’s “data protection by design,” a similar principle could be applied to copyright law, where AI models are designed to minimize copyright risks. In the UK, post-Brexit text and data mining exemptions are seen as pro-innovation but pose risks to copyright protections. This paper proposes a balanced approach, introducing algorithmic obligations to complement these exemptions. AI systems benefiting from text and data mining provisions should integrate safeguards that flag potential copyright violations in real time, ensuring both innovation and protection. In the U.S., where copyright law focuses on human-centric works, this paper suggests an evolution toward algorithmic due diligence. 
AI developers would have a duty similar to product liability, ensuring that their systems do not produce infringing outputs, even if the outputs themselves cannot be copyrighted. This framework introduces a shift from post-infringement remedies to preventive legal structures, where developers actively mitigate risks. The paper also breaks new ground by addressing obligations surrounding the training data of large language models (LLMs). Currently, training data is often treated under exceptions such as the EU’s text and data mining provisions or U.S. fair use. However, this paper proposes a proactive framework where developers are obligated to verify and document the legal status of their training data, ensuring it is licensed or otherwise cleared for use. In conclusion, this paper advocates for an obligations-centered model that shifts AI-related copyright law from reactive litigation to proactive design. By holding AI developers to a heightened standard of care, this approach aims to prevent infringement at its source, addressing both the outputs of AI systems and the training processes that underlie them.

Keywords: ip, technology, copyright, data, infringement, comparative analysis

Procedia PDF Downloads 23
9615 Robustness Analysis of the Carbon and Nitrogen Co-Metabolism Model of Mucor mucedo

Authors: Nahid Banihashemi

Abstract:

An important emerging area of the life sciences is systems biology, which involves understanding the integrated behavior of large numbers of components interacting via non-linear reaction terms. A centrally important problem in this area is understanding the co-metabolism of protein and carbohydrate, as it has been clearly demonstrated that the ratio of these metabolites in diet is a major determinant of obesity and related chronic disease. In this regard, we have considered a systems biology model for the co-metabolism of carbon and nitrogen in colonies of the fungus Mucor mucedo. Oscillations are an important diagnostic of the underlying dynamical processes of this model. The maintenance of specific patterns of oscillation, and its relation to the robustness of the system, are the important issues targeted in this paper. A parametric sensitivity approach has been used as the theoretical framework for analyzing the robustness of this model. As a result, the parameters of the model which produce the largest sensitivities have been identified. Furthermore, the largest changes that can be made in each parameter of the model without losing the oscillations in biomass production have been computed. The results are obtained from an implementation of parametric sensitivity analysis in MATLAB.
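The finite-difference flavor of parametric sensitivity analysis can be sketched in Python. The Lotka-Volterra-style system below is a generic stand-in, not the paper's carbon-nitrogen model, and the parameter values are invented; normalised sensitivities are taken with respect to each parameter in turn:

```python
def simulate(alpha, beta, delta, gamma, x0=1.0, y0=0.5, dt=0.001, T=5.0):
    """Forward-Euler integration of a Lotka-Volterra-style stand-in model.

    Returns the first state variable at time T as the scalar output of
    interest (a real study would track oscillation amplitude or period).
    """
    x, y = x0, y0
    for _ in range(int(T / dt)):
        dx = alpha * x - beta * x * y
        dy = delta * x * y - gamma * y
        x += dt * dx
        y += dt * dy
    return x

def sensitivity(params, index, h=1e-4):
    """Normalised central-difference sensitivity d ln(output) / d ln(p_i)."""
    p_up, p_dn = list(params), list(params)
    p_up[index] *= (1 + h)
    p_dn[index] *= (1 - h)
    f0 = simulate(*params)
    return (simulate(*p_up) - simulate(*p_dn)) / (2 * h * f0)

params = [1.1, 0.4, 0.1, 0.4]          # alpha, beta, delta, gamma (invented)
# Rank parameters by sensitivity magnitude, most influential first
ranked = sorted(range(4), key=lambda i: -abs(sensitivity(params, i)))
```

Scanning each parameter until oscillations disappear, rather than taking a single derivative, gives the "largest allowable change" figures the abstract mentions.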

Keywords: system biology, parametric sensitivity analysis, robustness, carbon and nitrogen co-metabolism, Mucor mucedo

Procedia PDF Downloads 333
9614 An Analytical Metric and Process for Critical Infrastructure Architecture System Availability Determination in Distributed Computing Environments under Infrastructure Attack

Authors: Vincent Andrew Cappellano

Abstract:

In the early phases of critical infrastructure system design, translating distributed computing requirements into an architecture carries risk, given the multitude of approaches (e.g., cloud, edge, fog). In many systems, a single requirement for system uptime/availability is used to encompass the system's intended operations. However, architected systems may meet those availability requirements only during normal operations, and not during component failure or during outages caused by adversary attacks on critical infrastructure (e.g., physical, cyber). System designers lack a structured method to evaluate availability requirements against candidate system architectures through deep degradation scenarios (i.e., from normal operations all the way down to significant damage to communications or physical nodes). This increases the risk of poor selection of a candidate architecture due to the absence of insight into true performance for systems that must operate as a piece of critical infrastructure. This research effort proposes a process to analyze critical infrastructure system availability requirements and a candidate set of system architectures, producing a metric that assesses these architectures over a spectrum of degradations to aid in selecting appropriately resilient architectures. To accomplish this, a set of simulation and evaluation efforts is undertaken that will process, in an automated way, a set of sample requirements into a set of potential architectures where system functions and capabilities are distributed across nodes. Nodes and links will have specific characteristics and, based on sampled requirements, contribute to the overall system functionality, such that as they are impacted/degraded, the impacted functional availability of the system can be determined.
A machine learning reinforcement-based agent will structurally impact the nodes, links, and characteristics (e.g., bandwidth, latency) of a given architecture to provide an assessment of system functional uptime/availability under these scenarios. By varying the intensity of the attack and related aspects, we can create a structured method of evaluating the performance of candidate architectures against each other to create a metric rating its resilience to these attack types/strategies. Through multiple simulation iterations, sufficient data will exist to compare this availability metric, and an architectural recommendation against the baseline requirements, in comparison to existing multi-factor computing architectural selection processes. It is intended that this additional data will create an improvement in the matching of resilient critical infrastructure system requirements to the correct architectures and implementations that will support improved operation during times of system degradation due to failures and infrastructure attacks.
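The degradation loop described above can be sketched as a small Monte Carlo simulation. Everything below is an illustrative assumption: the node names, the function-to-node mapping, and the independent random failure model (which a reinforcement learning attacker would replace with targeted degradations).

```python
import random

# Hypothetical candidate architecture: which nodes host which functions.
ARCHITECTURE = {
    "edge-1": {"sense"}, "edge-2": {"sense"},
    "fog-1": {"aggregate"}, "cloud-1": {"aggregate", "analyze"},
}

def functional_availability(arch, failed):
    """Fraction of system functions still hosted on at least one live node."""
    functions = set().union(*arch.values())
    live = {f for node, funcs in arch.items() if node not in failed for f in funcs}
    return len(live) / len(functions)

def mean_availability(arch, intensity, trials=1000, seed=0):
    """Average functional availability when each node fails independently
    with probability `intensity` (a stand-in for attack severity)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        failed = {node for node in arch if rng.random() < intensity}
        total += functional_availability(arch, failed)
    return total / trials
```

Sweeping `intensity` from 0 to 1 traces the availability-versus-degradation curve that the proposed metric would summarize for each candidate architecture.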

Keywords: architecture, resiliency, availability, cyber-attack

Procedia PDF Downloads 115
9613 Disaster Management Supported by Unmanned Aerial Systems

Authors: Agoston Restas

Abstract:

Introduction: This paper describes many recent initiatives and presents practical examples of using Unmanned Aerial Systems (UAS) to support disaster management. Since operating manned aircraft at disasters is usually expensive and often impossible, managers frequently forgo aerial operations altogether. UAS can be an alternative and cost-effective solution for supporting disaster management. Methods: This article uses a thematic division of UAS applications based on two key elements: the time flow of managing disasters and its tactical requirements. Logically, UAS can be used in pre-disaster activities, in activities immediately after the occurrence of a disaster, and in activities after the primary disaster response. The paper addresses different disasters, such as dangerous material releases, floods, earthquakes, forest fires, and human-induced disasters. The research used function analysis, practical experiments, mathematical formulas, economic analysis, and expert estimation. The author gathered international examples and drew on his own experience in this field as well. Results and discussion: An earthquake is a rapidly escalating disaster in which, many times, there is no other way to perform a rapid damage assessment than aerial reconnaissance. For special rescue teams, UAS can greatly help in quickly locating sites where enough space remained for victims to survive. Floods are a typical slow-onset disaster; even so, managing floods is a very complex and difficult task, requiring continuous monitoring of dykes and of flooded and threatened areas. UAS can substantially help managers keep an area under observation. Forest fires are disasters for which the tactical application of UAS is already well developed: it can be used for fire detection, intervention monitoring, and post-fire monitoring. 
In the case of a nuclear accident or hazardous material leakage, UAS is also a very effective, and sometimes the only, tool for supporting disaster management. The paper also shows efforts to use UAS to avoid human-induced disasters in low-income countries as part of health cooperation.

Keywords: disaster management, floods, forest fires, Unmanned Aerial Systems

Procedia PDF Downloads 242
9612 Collaborative Management Approach for Logistics Flow Management of Cuban Medicine Supply Chain

Authors: Ana Julia Acevedo Urquiaga, Jose A. Acevedo Suarez, Ana Julia Urquiaga Rodriguez, Neyfe Sablon Cossio

Abstract:

Despite the progress made in the logistics and supply chain fields, the development of business models that use information efficiently to facilitate the integrated management of logistics flows between partners remains indispensable. Collaborative management is an important tool for materializing cooperation between companies, as a way to achieve supply chain efficiency and effectiveness. The first phase of this research was a comprehensive analysis of collaborative planning in Cuban companies. It is evident that they have difficulties in supply chain planning, where production, supply, and replenishment planning are independent tasks, as are logistics and distribution operations. Large inventories generate serious financial and organizational problems for entities, demanding increasing levels of working capital that cannot be financed. Problems were also found in the efficient application of Information and Communication Technology to business management. The general objective of this work is to develop a methodology that allows the coordinated deployment of a planning and control system for the medicine logistics system in Cuba. To achieve this objective, several supply chain coordination mechanisms, mathematical programming models, and other management techniques were analyzed against the requirements of collaborative logistics management in Cuba. One of the findings is the practical and theoretical inadequacy of the studied models for the current situation of Cuban logistics systems management. To contribute to the tactical-operative management of logistics, the Collaborative Logistics Flow Management Model (CLFMM) is proposed as a tool for balancing cycles, capacities, and inventories, always meeting the final customers’ demands in correspondence with the service level they expect. At the center of the CLFMM is the supply chain planning and control system as a single information system, which acts on the process network. 
The development of the model is based on the empirical methods of analysis-synthesis and on case studies. Another finding is the demonstration that using a single information system to support supply chain logistics management makes it possible to determine the deadlines and quantities required in each process. This ensures that medications are always available to patients and that there are no failures that put the population's health at risk. The simulation of planning and control with the CLFMM for medicines such as dipyrone and chlordiazepoxide, during 5 months of 2017, made it possible to take measures to adjust the logistics flow, eliminate delayed processes, and avoid shortages of the medicines studied. As a result, the logistics cycle efficiency can be increased to 91%, and inventory rotation would increase, resulting in a release of financial resources.

Keywords: collaborative management, medicine logistic system, supply chain planning, tactical-operative planning

Procedia PDF Downloads 180
9611 Effects of Free-Hanging Horizontal Sound Absorbers on the Cooling Performance of Thermally Activated Building Systems

Authors: L. Marcos Domínguez, Nils Rage, Ongun B. Kazanci, Bjarne W. Olesen

Abstract:

Thermally Activated Building Systems (TABS) have proven to be an energy-efficient solution to provide buildings with an optimal indoor thermal environment. This solution uses the structure of the building to store heat, reduce the peak loads, and decrease the primary energy demand. TABS require the heated or cooled surfaces to be as exposed as possible to the indoor space, but exposing the bare concrete surfaces has a diminishing effect on the acoustic qualities of the spaces in a building. Acoustic solutions capable of providing optimal acoustic comfort and allowing the heat exchange between the TABS and the room are desirable. In this study, the effects of free-hanging units on the cooling performance of TABS and on the occupants’ thermal comfort were measured in a full-scale TABS laboratory. The investigations demonstrate that the use of free-hanging sound absorbers is compatible with the performance of TABS and with the occupants’ thermal comfort, but an appropriate acoustic design is needed to find the most suitable solution for each case. The results show a reduction of 11% in the cooling performance of the TABS when 43% of the ceiling area is covered with free-hanging horizontal sound absorbers, of 23% for a 60% ceiling coverage ratio, and of 36% for 80% coverage. Measurements in actual buildings showed an increase in the room operative temperature of 0.3 K when 50% of the ceiling surface is covered with horizontal panels, and of 0.8 to 1 K for a 70% coverage ratio. According to numerical simulations using a new TRNSYS Type, the use of comfort ventilation has a considerable influence on the thermal conditions in the room; if the ventilation is removed, the operative temperature increases by 1.8 K for a 60%-covered ceiling.

Keywords: acoustic comfort, concrete core activation, full-scale measurements, thermally activated building systems, TRNSys

Procedia PDF Downloads 331
9610 Segregation Patterns of Trees and Grass Based on a Modified Age-Structured Continuous-Space Forest Model

Authors: Jian Yang, Atsushi Yagi

Abstract:

The tree-grass coexistence system is of great importance for forest ecology. Mathematical models have been proposed to study the dynamics of tree-grass coexistence and the stability of such systems. However, few of these models concentrate on the spatial dynamics of tree-grass coexistence. In this study, we modified an age-structured continuous-space population model for forests, obtaining an age-structured continuous-space model for tree-grass competition. In the model, for thermal competitions, adult trees can out-compete grass, and grass can out-compete seedlings. We studied the model mathematically to verify that tree-grass coexistence solutions exist. Numerical experiments demonstrated that the fraction of area that trees or grass occupy can affect whether the coexistence is stable or not. We also varied the mortality of adult trees while the other parameters and the fractions of area occupied by trees and grass were fixed; the results show that the mortality of adult trees is also a factor affecting the stability of tree-grass coexistence in this model.
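As a minimal stand-in for the full age-structured continuous-space model, the competitive interaction can be illustrated with a classical two-species Lotka-Volterra competition system, which also admits a stable coexistence equilibrium when each competitor limits itself more strongly than it limits the other. The growth rates and competition coefficients below are illustrative assumptions, not values from the paper.

```python
def simulate_coexistence(T0=0.1, G0=0.1, rT=0.5, rG=1.0,
                         aTG=0.3, aGT=0.4, dt=0.01, steps=20000):
    """Euler integration of a two-species competition system:
    T = tree cover, G = grass cover (both as area fractions).
    Coexistence is stable when aTG * aGT < 1."""
    T, G = T0, G0
    for _ in range(steps):
        dT = rT * T * (1.0 - T - aTG * G)
        dG = rG * G * (1.0 - G - aGT * T)
        T += dt * dT
        G += dt * dG
    return T, G
```

With these parameters the trajectory settles at the analytical coexistence equilibrium T* = (1 − aTG)/(1 − aTG·aGT) ≈ 0.795 and G* = (1 − aGT)/(1 − aTG·aGT) ≈ 0.682.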

Keywords: population-structured models, stabilities of ecosystems, thermal competitions, tree-grass coexistence systems

Procedia PDF Downloads 166
9609 A High-Level Co-Evolutionary Hybrid Algorithm for the Multi-Objective Job Shop Scheduling Problem

Authors: Aydin Teymourifar, Gurkan Ozturk

Abstract:

In this paper, a hybrid distributed algorithm is suggested for the multi-objective job shop scheduling problem. Many new approaches are used in the design steps of the distributed algorithm. The co-evolutionary structure of the algorithm and the competition between different communicating hybrid algorithms, which are executed simultaneously, lead to an efficient search. Using several machines to distribute the algorithms, at both the iteration and solution levels, increases computational speed. The proposed algorithm is able to find the Pareto solutions of large problems in a shorter time than other algorithms in the literature. The Apache Spark and Hadoop platforms have been used for the distribution of the algorithm. The suggested algorithm and its implementations have been compared with the results of successful algorithms in the literature. The results prove the efficiency and high speed of the algorithm.
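The Pareto solutions mentioned above are obtained by nondominated filtering of candidate schedules. A minimal sketch, assuming minimization of two illustrative objectives such as makespan and total tardiness, is:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(population):
    """Return the nondominated objective vectors of a population.
    dominates(p, p) is False, so no self-exclusion is needed."""
    return [p for p in population
            if not any(dominates(q, p) for q in population)]
```

For example, among the (makespan, tardiness) vectors (10, 5), (8, 7), (12, 3), and (9, 9), only (9, 9) is dominated (by (8, 7)) and drops out of the front.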

Keywords: distributed algorithms, Apache Spark, Hadoop, job shop scheduling, multi-objective optimization

Procedia PDF Downloads 367
9608 Polynomially Adjusted Bivariate Density Estimates Based on the Saddlepoint Approximation

Authors: S. B. Provost, Susan Sheng

Abstract:

An alternative bivariate density estimation methodology is introduced in this presentation. The proposed approach involves estimating the density function associated with the marginal distribution of each of the two variables by means of the saddlepoint approximation technique and applying a bivariate polynomial adjustment to the product of these density estimates. Since the saddlepoint approximation is utilized in the context of density estimation, such estimates are determined from empirical cumulant-generating functions. In the univariate case, the saddlepoint density estimate is itself adjusted by a polynomial. Given a set of observations, the coefficients of the polynomial adjustments are obtained from the sample moments. Several illustrative applications of the proposed methodology shall be presented. Since this approach relies essentially on a fixed number of sample moments, it is particularly well suited for modeling massive data sets.
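In the univariate case, the pipeline can be sketched as follows: build the empirical cumulant-generating function from the sample, solve the saddlepoint equation K'(s) = x, and evaluate the approximation. This is an illustrative sketch only: the polynomial adjustment from the abstract is omitted, and the bisection bounds are assumptions.

```python
import math

def saddlepoint_density(data, x, iters=50):
    """Univariate saddlepoint density estimate built from the empirical
    cumulant-generating function K(t) = log((1/n) * sum(exp(t * x_i))).
    x must lie strictly inside the range of the data."""
    n = len(data)

    def weights(t):
        return [math.exp(t * xi) for xi in data]

    def K(t):
        return math.log(sum(weights(t)) / n)

    def K_prime(t):
        w = weights(t)
        return sum(wi * xi for wi, xi in zip(w, data)) / sum(w)

    def K_double_prime(t):
        w = weights(t)
        sw = sum(w)
        m1 = sum(wi * xi for wi, xi in zip(w, data)) / sw
        m2 = sum(wi * xi * xi for wi, xi in zip(w, data)) / sw
        return m2 - m1 * m1

    # Solve K'(s) = x by bisection; K' is monotonically increasing
    # because K'' > 0. The bracket [-10, 10] is an illustrative choice.
    lo, hi = -10.0, 10.0
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if K_prime(mid) < x:
            lo = mid
        else:
            hi = mid
    s = 0.5 * (lo + hi)
    return math.exp(K(s) - s * x) / math.sqrt(2.0 * math.pi * K_double_prime(s))
```

For a large standard normal sample, the estimate at x = 0 should be close to 1/√(2π) ≈ 0.399.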

Keywords: density estimation, empirical cumulant-generating function, moments, saddlepoint approximation

Procedia PDF Downloads 282
9607 Statistical Physics Model of Seismic Activation Preceding a Major Earthquake

Authors: Daniel S. Brox

Abstract:

Starting from earthquake fault dynamic equations, a correspondence is derived between earthquake occurrence statistics in a seismic region before a major earthquake and the eigenvalue statistics of a differential operator whose bound-state eigenfunctions characterize the distribution of stress in the seismic region. By modeling these eigenvalue statistics with a 2D Coulomb gas statistical physics model, the previously reported deviation of seismic-activation earthquake occurrence statistics from Gutenberg-Richter statistics in time intervals preceding the major earthquake is derived. The model also explains how statistical physics modeling predicts a finite-dimensional nonlinear dynamic system that describes real-time velocity model evolution in the region undergoing seismic activation, and how this prediction can be tested experimentally.

Keywords: seismic activation, statistical physics, geodynamics, signal processing

Procedia PDF Downloads 26
9606 Ecosystem Services Assessment for Urban Nature-Based Solutions Implemented in the Public Space: Case Study of Alhambra Square in Bogotá, Colombia

Authors: Diego Sánchez, Sandra M. Aguilar, José F. Gómez, Gustavo Montaño, Laura P. Otero, Carlos V. Rey, José A. Martínez, Juliana Robles, Jorge E. Burgos, Juan S. López

Abstract:

Bogota is making efforts towards urban resilience through the incorporation of Nature-based Solutions (NbS) in public projects as a climate change resilience strategy. The urban renovation project of the Alhambra square includes Green Infrastructure (GI), such as Sustainable Urban Drainage Systems (SUDS) and Urban Trees (UT), as ecosystem service (ES) boosters. This study analyzes three scenarios: (1) the initial situation without NbS, (2) the expected situation with NbS included in the design, and (3) a projection of the second scenario after 30 years, calculating the ecosystem services, the stormwater management benefits provided by SUDS, and the cultural services. The results contribute to the understanding of the benefits of urban NbS in public spaces, providing valuable information to foster investment in sustainable projects and encouraging policymakers to integrate NbS in urban planning.

Keywords: ecosystem services, nature-based solutions, stormwater management, sustainable urban drainage systems

Procedia PDF Downloads 164
9605 Influence of Vegetable Oil-Based Controlled Cutting Fluid Impinging Supply System on Micro Hardness in Machining of Ti-6Al-4V

Authors: Salah Gariani, Islam Shyha, Fawad Inam, Dehong Huo

Abstract:

A controlled cutting fluid impinging supply system (CUT-LIST) was developed to deliver an accurate amount of cutting fluid into the machining zone via well-positioned coherent nozzles, based on a calculation of the heat generated. The performance of the CUT-LIST was evaluated against a conventional flood cutting fluid supply system during step shoulder milling of Ti-6Al-4V using a vegetable oil-based cutting fluid. In this paper, the micro-hardness of the machined surface was used as the main criterion to compare the two systems. CUT-LIST provided significant reductions in cutting fluid consumption (up to 42%). Both systems caused increased micro-hardness values at 100 µm from the machined surface, whereas a slight reduction in micro-hardness of 4.5% was measured when using CUT-LIST. It was noted that the first 50 µm is a soft sub-surface promoted by thermal softening, whereas down to 100 µm is a hard sub-surface caused by cyclic internal work hardening, after which the hardness gradually decreases until it reaches the nominal hardness of the base material. It can be concluded that the CUT-LIST always gave lower micro-hardness values near the machined surfaces in all conditions investigated.

Keywords: impinging supply system, micro-hardness, shoulder milling, Ti-6Al-4V, vegetable oil-based cutting fluid

Procedia PDF Downloads 289
9604 Hybrid Precoder Design Based on Iterative Hard Thresholding Algorithm for Millimeter Wave Multiple-Input-Multiple-Output Systems

Authors: Ameni Mejri, Moufida Hajjaj, Salem Hasnaoui, Ridha Bouallegue

Abstract:

Recent technological advances have made millimeter wave (mmWave) communication possible. Due to the huge amount of spectrum available in mmWave frequency bands, this promising candidate is considered a key technology for the deployment of 5G cellular networks. In order to enhance system capacity and achieve spectral efficiency, very large antenna arrays are employed in mmWave systems to exploit array gain. However, it has been shown that conventional beamforming strategies are not suitable for mmWave hardware implementation. Therefore, new features are required for mmWave cellular applications. Unlike traditional multiple-input-multiple-output (MIMO) systems, for which digital precoders alone are sufficient to accomplish precoding, MIMO at mmWave is different because of digital precoding limitations: fully digital precoding requires a greater number of radio frequency (RF) chains, with the associated signal mixers and analog-to-digital converters. As RF chains increase cost and power consumption, an alternative is needed. Although the hybrid precoding architecture, a combination of a baseband precoder and an RF precoder, has been regarded as the best solution, the optimal design of hybrid precoders remains open. According to the mapping strategies from RF chains to the antenna elements, there are two main categories of hybrid precoding architecture. As a hybrid precoding sub-array architecture, the partially-connected structure reduces hardware complexity by using fewer phase shifters, at the cost of some beamforming gain. In this paper, we treat the hybrid precoder design in mmWave MIMO systems as a matrix factorization problem. Thus, we adopt the alternating minimization principle in order to solve the design problem. Further, we present our proposed algorithm for the partially-connected structure, which is based on the iterative hard thresholding method. 
Through simulation results, we show that our hybrid precoding algorithm provides significant performance gains over existing algorithms. We also show that the proposed approach significantly reduces the computational complexity. Furthermore, valuable design insights are provided when we use the proposed algorithm to compare, in simulation, the partially-connected and fully-connected hybrid precoding structures.
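The core subroutine can be illustrated on the generic sparse-recovery form of iterative hard thresholding, x ← H_s(x + μ·Aᵀ(y − A·x)). This is a sketch only: the precoder-specific constraints (unit-modulus phase-shifter entries, the block structure of the partially-connected mapping) are omitted, and the step size and problem dimensions are illustrative assumptions.

```python
def hard_threshold(x, s):
    """Keep the s largest-magnitude entries of x and zero the rest."""
    keep = set(sorted(range(len(x)), key=lambda i: abs(x[i]), reverse=True)[:s])
    return [xi if i in keep else 0.0 for i, xi in enumerate(x)]

def iht(A, y, s, steps=200, mu=0.5):
    """Iterative hard thresholding for y ≈ A x with an s-sparse x."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(steps):
        # Residual r = y - A x, then gradient step g = A^T r.
        r = [y[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        x = hard_threshold([x[j] + mu * g[j] for j in range(n)], s)
    return x
```

Alternating minimization then interleaves a step of this kind for the RF precoder with a closed-form update of the baseband precoder.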

Keywords: alternating minimization, hybrid precoding, iterative hard thresholding, low-complexity, millimeter wave communication, partially-connected structure

Procedia PDF Downloads 326
9603 Digital Geography and Geographic Information System in Schools: Towards a Hierarchical Geospatial Approach

Authors: Mary Fargher

Abstract:

This paper examines the opportunities of using a more hierarchical approach to geospatial enquiry when using GIS in school geography. A case is made that it is not just a lack of technological knowledge that stops some teachers from using GIS in the classroom, but also a gap in their understanding of how to link GIS use more specifically to the pedagogy of teaching geography with GIS. Using a hierarchical approach to geospatial enquiry as a theoretical framework, the analysis shows clearly how concepts of spatial distribution, interaction, relation, comparison, and temporal relationships can be used by teachers more explicitly to capitalise on the analytical power of GIS and to construct what can be interpreted as powerful geographical knowledge. An exemplar illustrating this approach on the topic of geo-hazards is then presented for critical analysis and discussion. Recommendations are then made for a model of progression for geography teacher education with GIS through hierarchical geospatial enquiry that takes into account beginner, intermediate, and more advanced users.

Keywords: digital geography, GIS, education, hierarchical geospatial enquiry, powerful geographical knowledge

Procedia PDF Downloads 157
9602 Application of Modal Analysis for Commissioning of a Ball Screw System

Authors: T. D. Tran, H. Schlegel, R. Neugebauer

Abstract:

Ball screws are an important component of machine tools. In mechatronic systems and machine tools, a ball screw usually has to work at high speed. However, the axial compliance of the ball screw, in combination with the inertia of the slide, the motor, the coupling, and the screw, will cause an oscillation resonance, which limits the system's bandwidth and consequently influences the performance of the motion controller. In this paper, the modal analysis method is used: the vibration parameters of the ball screw system are measured and analysed to determine the dynamic characteristics of the existing structure. On the one hand, the results of this study were obtained by theoretical analysis and by modal testing of a ball screw system test station, with excitation by an impact hammer and by the motor, respectively. The experimental study showed the mode shapes of the ball screw for each frequency and yielded the eigenfrequencies of the ball screw system. On the other hand, a numerical modal analysis is used in this research to simulate the oscillation and to find the eigenfrequencies of the ball screw system. Furthermore, model order reduction is carried out, both by modal reduction and according to Guyan. On the basis of these results, a reliable and rapid commissioning of the control loops, with regard to operating at their optimal function, is targeted.
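The dominant axial resonance discussed above can be estimated from a minimal lumped model: the slide and the motor-side inertia coupled by the axial ball-screw stiffness. The two-mass free-free chain below is a sketch; the mass and stiffness values in the usage example are illustrative assumptions, and real values must come from the test station.

```python
import math

def ballscrew_eigenfrequencies(m_slide, m_motor, k_axial):
    """Eigenfrequencies (Hz) of two lumped masses coupled by one spring.
    The free-free chain has a rigid-body mode at 0 Hz and one elastic
    mode at omega^2 = k * (1/m1 + 1/m2)."""
    w2 = k_axial * (1.0 / m_slide + 1.0 / m_motor)
    return 0.0, math.sqrt(w2) / (2.0 * math.pi)
```

For instance, a 50 kg slide, a 5 kg equivalent motor-side mass, and an axial stiffness of 2e7 N/m place the elastic mode near 334 Hz, the kind of resonance that bounds the velocity-loop bandwidth.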

Keywords: modal analysis, ball screw, controller system, machine tools

Procedia PDF Downloads 466
9601 Effect of Composition Fuel on Safety of Combustion Process

Authors: Lourdes I. Meriño, Viatcheslav Kafarov, Maria Gómez

Abstract:

The fuel gas used in the burner receives contributions of other gases from different processes, and this results in variability in its composition, which may cause incomplete combustion. The burners are designed to operate on a certain curve, with the calorific power depending on the gas pressure at the burners. When the deviation in propane and C5+ content is large, there is a large release of energy, which pushes the burners outside their design curves, because less pressure is required to bring the curve into operation. This increases the risk of explosion in a furnace, besides causing a higher environmental impact. There should be flame detection systems and instrumentation equipment, such as local pressure gauges located at the entrance of the gas burners, to permit verification by the operator. Additionally, distributed control systems must be configured with the combustion instruments and their respective alarms, as well as operational windows and integrity-control guidelines, based on the design information of this equipment. Therefore, when a plant is taken out of service, it is desirable to perform a thorough operational analysis to determine the impact of changes in the contributing fuel gas streams on the calorific power. Hence, poor combustion is one of the causes of flame instability in the burner, with a great impact on process safety, the integrity of people and equipment, and the environment.
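The effect of composition drift on the burner operating point can be illustrated with a mole-fraction-weighted heating value and Wobbe index. The component properties below are approximate textbook values (volumetric lower heating value in MJ/m³ and relative density, air = 1); they are assumptions for illustration and should be replaced with measured plant data.

```python
import math

# Approximate component properties: (LHV in MJ/m^3, relative density).
PROPS = {
    "methane": (35.8, 0.55),
    "propane": (91.2, 1.52),
    "pentane+": (134.0, 2.49),
}

def mixture_lhv_and_wobbe(mole_fractions):
    """Mole-fraction-weighted LHV and Wobbe index of a fuel gas mixture."""
    lhv = sum(f * PROPS[c][0] for c, f in mole_fractions.items())
    rel_density = sum(f * PROPS[c][1] for c, f in mole_fractions.items())
    return lhv, lhv / math.sqrt(rel_density)
```

Shifting 10% of the mixture from methane to propane raises the LHV from 35.8 to about 41.3 MJ/m³, illustrating how a C3/C5+ deviation releases more energy at the same burner pressure and moves the burner away from its design curve.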

Keywords: combustion process, fuel composition, safety, fuel gas

Procedia PDF Downloads 491
9600 Skills Development: The Active Learning Model of a French Computer Science Institute

Authors: N. Paparisteidi, D. Rodamitou

Abstract:

This article focuses on the skills development and path planning of students studying computer science at EPITECH, a French private institute of higher education. The researchers examine students’ points of view and experience in a blended learning model based on a skills development curriculum. The study is based on the collection of four main categories of data: semi-participant observation, distribution of questionnaires, interviews, and analysis of internal school databases. The findings seem to indicate that a skills-based program built on active learning enables students to develop their learning strategies as well as their personal skills and to actively engage in the creation of their career path; the findings also provide additional information to curriculum planners and decision-makers about learning design in higher education.

Keywords: active learning, blended learning, higher education, skills development

Procedia PDF Downloads 107