Search results for: neuromorphic computing systems
7246 Prime Mover Sizing for Base-Loaded Combined Heating and Power Systems
Authors: Djalal Boualili
Abstract:
This article considers the problem of sizing prime movers for combined heating and power (CHP) systems operating at full load to satisfy a fraction of a facility's electric load, i.e., a base load. Prime mover sizing is examined using three criteria: operational cost, carbon dioxide emissions (CDE), and primary energy consumption (PEC). The sizing process leads to the consideration of ratios of conversion factors applied to imported electricity to conversion factors applied to fuel consumed. These ratios are labelled R_Cost, R_CDE, and R_PEC depending on whether the conversion factors are associated with operational cost, CDE, or PEC, respectively. Analytical results show that in order to achieve savings in operational cost, CDE, or PEC, the ratios must be larger than a unique constant R_Min that depends only on the CHP components' efficiencies. Savings in operational cost, CDE, or PEC due to CHP operation are explicitly formulated using simple equations. This facilitates the process of comparing the tradeoffs of optimizing the savings of one criterion over the other two – a task that has traditionally been accomplished through computer simulations. A hospital building located in Chlef, Algeria, was used as an example to apply the methodology presented in this article.
Keywords: sizing, heating and power, ratios, energy consumption, carbon dioxide emissions
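As a rough illustration of the ratio test described above, the sketch below derives a threshold of this kind from a simple energy balance. The balance itself (electrical efficiency eta_e, recoverable-heat efficiency eta_th, displaced boiler efficiency eta_b) and all numeric values are assumptions for illustration, not the paper's formulation.

```python
# Minimal sketch of a base-load CHP ratio test. Assumptions (not from the
# paper): the CHP burns fuel F, produces electricity W = eta_e * F and heat
# Q = eta_th * F, and the recovered heat displaces boiler fuel Q / eta_b.
# Savings per unit fuel are then S = c_e*eta_e - c_f*(1 - eta_th/eta_b),
# so S > 0 exactly when R = c_e / c_f exceeds a constant that depends only
# on component efficiencies, as the abstract states.

def r_min(eta_e: float, eta_th: float, eta_b: float) -> float:
    """Threshold ratio of electricity to fuel conversion factors."""
    return (1.0 - eta_th / eta_b) / eta_e

def savings_per_unit_fuel(c_e: float, c_f: float,
                          eta_e: float, eta_th: float, eta_b: float) -> float:
    """Cost, CDE, or PEC savings per unit of CHP fuel input."""
    return c_e * eta_e - c_f * (1.0 - eta_th / eta_b)

if __name__ == "__main__":
    eta_e, eta_th, eta_b = 0.30, 0.45, 0.85   # illustrative efficiencies
    print(f"R_min = {r_min(eta_e, eta_th, eta_b):.3f}")
    # savings are positive only because c_e/c_f = 3.0 exceeds R_min ~ 1.57
    print(savings_per_unit_fuel(3.0, 1.0, eta_e, eta_th, eta_b))
```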
Procedia PDF Downloads 231
7245 Design and Implementation of a Platform for Adaptive Online Learning Based on Fuzzy Logic
Authors: Budoor Al Abid
Abstract:
Educational systems are increasingly provided as open online services, offering guidance and support for individual learners. To adapt learning systems, a proper evaluation must be made. This paper builds an evaluation model, the Fuzzy C-Means Adaptive System (FCMAS), based on data mining techniques to assess the difficulty of questions. The following steps are implemented: first, a dataset from an international online learning system (slepemapy.cz) is used; it contains over 1,300,000 records with 9 features covering student, question, and answer information with feedback evaluation. Next, a normalization process is applied as a preprocessing step. Then, the FCM clustering algorithm is used to adapt the difficulty of the questions. The result is data labelled into three clusters according to the highest membership weight (easy, intermediate, difficult); the FCM algorithm assigns a label to every question one by one. Then a Random Forest (RF) classifier is constructed on the clustered dataset, using 70% of the dataset for training and 30% for testing; the model achieves a 99.9% accuracy rate. This approach improves adaptive e-learning systems because it depends on student behavior and gives more accurate results in the evaluation process than evaluation systems that depend on feedback only.
Keywords: machine learning, adaptive, fuzzy logic, data mining
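A minimal sketch of the two-stage pipeline described above, assuming synthetic data in place of the 1.3 million slepemapy.cz records: a hand-rolled fuzzy c-means assigns each question to one of three difficulty clusters, and a Random Forest is then trained on a 70/30 split. Feature contents and all parameter values are placeholders.

```python
# Sketch of the FCMAS pipeline: fuzzy c-means labels question difficulty,
# then a Random Forest is trained on a 70/30 split. Data here is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def fuzzy_c_means(X, c=3, m=2.0, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)           # fuzzy memberships
    for _ in range(n_iter):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))          # standard FCM update
        U /= U.sum(axis=1, keepdims=True)
    return U.argmax(axis=1)                     # crisp label per question

X = np.random.rand(10000, 9)                    # placeholder for 9 features
y = fuzzy_c_means(X)                            # easy / intermediate / difficult
X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.7, random_state=0)
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, rf.predict(X_te)))
```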
Procedia PDF Downloads 196
7244 On the Implementation of the Pulse Coupled Neural Network (PCNN) in the Vision of Cognitive Systems
Authors: Hala Zaghloul, Taymoor Nazmy
Abstract:
One of the great challenges of the 21st century is to build a robot that can perceive and act within its environment and communicate with people, while also exhibiting cognitive capabilities that lead to human-like performance. The Pulse Coupled Neural Network (PCNN) is a relatively new ANN model derived from a mammalian neural model, with great potential in the area of image processing as well as target recognition, feature extraction, speech recognition, combinatorial optimization, and compressed encoding. PCNN has unique features among other types of neural networks, which make it a candidate to be an important approach for perception in cognitive systems. This work shows and emphasizes the potential of PCNN to perform different tasks related to image processing. The main obstacle preventing the direct implementation of such a technique is the need to find a way to control the PCNN parameters so that they perform a specific task. This paper evaluates the performance of the standard PCNN model for processing images with different properties, selects the important parameters that give significant results, and discusses approaches towards adapting the PCNN parameters to perform a specific task.
Keywords: cognitive system, image processing, segmentation, PCNN kernels
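For readers unfamiliar with the model, a minimal sketch of the standard PCNN iteration follows. The decay rates, gains, and linking strength are exactly the parameters the abstract says must be controlled per task; the values below are illustrative only.

```python
# Minimal sketch of the standard-PCNN iteration loop for segmentation-like
# output. Parameter values are illustrative, not tuned for any task.
import numpy as np
from scipy.signal import convolve2d

def pcnn(S, steps=10, aF=0.1, aL=1.0, aT=0.5, VF=0.5, VL=0.2, VT=20.0, beta=0.1):
    S = S / (S.max() + 1e-12)                    # stimulus: normalized image
    F = np.zeros_like(S); L = np.zeros_like(S)   # feeding / linking inputs
    Y = np.zeros_like(S); T = np.ones_like(S)    # pulses / dynamic threshold
    W = np.array([[0.5, 1, 0.5], [1, 0, 1], [0.5, 1, 0.5]])
    fired = np.zeros_like(S)
    for n in range(1, steps + 1):
        K = convolve2d(Y, W, mode="same")        # neighbourhood pulse input
        F = np.exp(-aF) * F + VF * K + S
        L = np.exp(-aL) * L + VL * K
        U = F * (1.0 + beta * L)                 # internal activity
        Y = (U > T).astype(float)                # neurons firing this step
        T = np.exp(-aT) * T + VT * Y             # raise threshold after firing
        fired[(fired == 0) & (Y == 1)] = n       # first-firing-time map
    return fired

seg = pcnn(np.random.rand(64, 64))
```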
Procedia PDF Downloads 280
7243 Threat Modeling Methodology for Supporting Industrial Control Systems Device Manufacturers and System Integrators
Authors: Raluca Ana Maria Viziteu, Anna Prudnikova
Abstract:
Industrial control systems (ICS) have received much attention in recent years due to the convergence of information technology (IT) and operational technology (OT), which has increased the interdependence of the safety and security issues to be considered. These issues require ICS-tailored solutions. This led to the need to create a methodology for supporting ICS device manufacturers and system integrators in carrying out threat modeling of embedded ICS devices in a way that guarantees the quality of the identified threats and minimizes subjectivity in the threat identification process. To research the possibility of creating such a methodology, a set of existing standards, regulations, papers, and publications related to threat modeling in the ICS sector and other sectors was reviewed to identify various existing methodologies and methods used in threat modeling. Furthermore, the most popular ones were tested in an exploratory phase on a specific PLC device. The outcome of this exploratory phase was used as a basis for defining specific characteristics of embedded ICS devices and their deployment scenarios, identifying the factors that introduce subjectivity in the threat modeling process of such devices, and defining metrics for evaluating the minimum quality requirements of identified threats associated with the deployment of the devices in existing infrastructures. The threat modeling methodology was then created based on the results of the previous steps. The usability of the methodology was evaluated through a set of standardized threat modeling requirements and a standardized comparison method for threat modeling methodologies. The outcomes of these verification methods confirm that the methodology is effective. The full paper includes the outcome of research on different threat modeling methodologies that can be used in OT, their comparison, and the results of implementing each of them in practice on a PLC device. This research is further used to build a threat modeling methodology tailored to OT environments; a detailed description is included. Moreover, the paper includes results of the evaluation of the created methodology based on a set of parameters specifically created to rate threat modeling methodologies.
Keywords: device manufacturers, embedded devices, industrial control systems, threat modeling
Procedia PDF Downloads 80
7242 Climate Smart Agriculture: Nano Technology in Solar Drying
Authors: Figen Kadirgan, M. A. Neset Kadirgan, Gokcen A. Ciftcioglu
Abstract:
Addressing food security and climate change challenges has to be done in an integrated manner. To increase food production and to reduce emissions intensity, thus contributing to mitigating climate change, food systems have to be more efficient in the use of resources. To ensure food security and adapt to climate change, they have to become more resilient. The changes required in agricultural and food systems will require the creation of supporting institutions and enterprises to provide services and inputs to smallholders, fishermen, and pastoralists, and to transform and commercialize their production more efficiently. Thus, there is a continuously growing need to switch to a green economy, which simultaneously reduces carbon emissions and pollution, enhances energy and resource-use efficiency, and prevents the loss of biodiversity and ecosystem services. Smart agriculture takes into account the four dimensions of food security: availability, accessibility, utilization, and stability. It is well known that the increase in world population will strengthen the population-food imbalance. The emphasis on reducing food losses puts the focus on production and on farmers, increasing productivity and income and ensuring food security, whereby small farmers also enhance their income and stabilize their budgets. The use of solar drying for agricultural, marine, or meat products is very important for preservation. Traditional sun drying is a relatively slow process in which poor food quality results from infestation by insects, enzymatic reactions, microorganism growth, and mycotoxin development. In contrast, solar drying offers a sound solution to all these negative effects of natural drying and of artificial mechanical drying. The technical directions in the development of solar drying systems for agricultural products are compact collector designs with high efficiency and low cost. In this study, using a solar selective surface produced by Selektif Teknoloji Co. Inc. Ltd., solar dryers with high efficiency will be developed and a feasibility study will be realized.
Keywords: energy, renewable energy, solar collector, solar drying
Procedia PDF Downloads 225
7241 The Use of PD and Tanδ Characteristics as Diagnostic Technique for the Insulation Integrity of XLPE Insulated Cable Joints
Authors: Mazen Al-Bulaihed, Nissar Wani, Abdulrahman Al-Arainy, Yasin Khan
Abstract:
Partial discharge (PD) measurements are widely used for diagnostic purposes in electrical equipment used in power systems. The main purpose of these measurements is to prevent large power failures, as cables are prone to aging, which usually results in embrittlement, cracking, and eventual failure of the insulating and sheathing materials, exposing the conductor and risking a potential short circuit, a likely cause of electrical fires. Many distribution networks rely heavily on medium voltage (MV) power cables. The presence of joints in these networks is a vital part of serving consumer demand for electricity continuously. Such measurements become even more important as the extent of this dependence increases. Moreover, it is known that partial discharges in joints and terminations are difficult to track and are the most crucial failure points in large power systems. This paper discusses the diagnostic techniques applied to four samples of XLPE insulated cable joints, each containing a different type of defect. Experiments were carried out by measuring PD and tanδ under very-low-frequency applied high voltage. The results show the importance of combining PD and tanδ for effective cable assessment.
Keywords: partial discharge, tan delta, very low frequency, XLPE cable
Procedia PDF Downloads 163
7240 FLIME - Fast Low Light Image Enhancement for Real-Time Video
Authors: Vinay P., Srinivas K. S.
Abstract:
Low light image enhancement is of utmost importance in computer vision based tasks. Applications include vision systems for autonomous driving, night vision devices for defence systems, and low light object detection tasks. Many of the existing deep learning methods are resource intensive during the inference step and take considerable time for processing. An algorithm should take considerably less than 41 milliseconds per frame in order to process a real-time video feed at 24 frames per second, and even less for a video at 30 or 60 frames per second. The paper presents a fast and efficient solution which has two main advantages: it has the potential to be used for a real-time video feed, and it can be used in low compute environments because of its lightweight nature. The proposed solution is a pipeline of three steps: the first is the use of a simple function to map input RGB values to output RGB values, the second is to balance the colors, and the final step is to adjust the contrast of the image. To this end, a custom dataset is carefully prepared using images taken in low and bright lighting conditions. The preparation of the dataset, the proposed model, and the processing time are discussed in detail, and the quality of the enhanced images using different methods is shown.
Keywords: low light image enhancement, real-time video, computer vision, machine learning
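The paper's actual mapping function is not given in the abstract, so the sketch below is a hedged stand-in for the three-step pipeline: a gamma curve for the RGB mapping, a gray-world colour balance, and a percentile contrast stretch.

```python
# Hedged stand-in for a three-step low-light pipeline; FLIME's real RGB
# mapping is not specified, so a gamma curve is assumed in step 1.
import numpy as np

def enhance(img, gamma=0.4, lo=1.0, hi=99.0):
    x = img.astype(np.float64) / 255.0
    x = x ** gamma                                     # 1) brighten RGB values
    x *= x.mean() / (x.mean(axis=(0, 1)) + 1e-12)      # 2) gray-world balance
    p_lo, p_hi = np.percentile(x, [lo, hi])
    x = np.clip((x - p_lo) / (p_hi - p_lo + 1e-12), 0, 1)  # 3) contrast stretch
    return (x * 255).astype(np.uint8)

frame = np.random.randint(0, 40, (480, 640, 3), dtype=np.uint8)  # dark frame
out = enhance(frame)   # must stay well under ~41 ms/frame for 24 fps video
```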
Procedia PDF Downloads 206
7239 Integration of PV Systems in Residential Buildings: A Solution for Supporting Electrical Grid in Kuwait
Authors: Nabil A. Ahmed, Nasser A. N. Mhaisen
Abstract:
The paper presents a solution to enhance power quality and to reduce peak load demand in the Kuwait electric grid as a response to the shortage of electricity production. A technical, environmental, and economic feasibility study of utilizing an integrated grid-connected photovoltaic (PV) system in residential buildings to supply 7.1% of the electrical power consumption in Kuwait is carried out using RETScreen software. A 10 kWp on-grid PV power generation system spread over the rooftops of residential buildings is adopted and investigated, and the complete system performance is simulated using PSIM software. Taking into account the international prices of electricity and natural gas, the proposed solution is investigated and tested for four different types of installation systems in terms of power generation and costs, which include horizontal installation, a 25° tilted angle, single-axis tracking, and dual-axis tracking. Results show that the 25° fixed-tilt system is the most efficient type. The payback period, as a tool of benefit analysis of the proposed system, is calculated and found to be 2.55 years.
Keywords: photovoltaics, residential buildings, electrical grid, production capacity, on-grid, power generation
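The payback calculation reduces to simple arithmetic; in the sketch below, the capital cost, yield, and tariff figures are placeholders chosen only to land near the reported 2.55 years, not the paper's inputs.

```python
# Simple-payback sketch for a rooftop PV case; all input figures below are
# assumed placeholders, not the paper's RETScreen inputs.
def simple_payback(capital_cost, annual_energy_kwh, tariff_per_kwh,
                   annual_om_cost=0.0):
    """Years to recover capital from avoided electricity purchases."""
    net_annual_saving = annual_energy_kwh * tariff_per_kwh - annual_om_cost
    return capital_cost / net_annual_saving

# e.g. a 10 kWp array at an assumed 1700 kWh/kWp/yr specific yield:
print(simple_payback(capital_cost=12000.0,
                     annual_energy_kwh=10 * 1700,
                     tariff_per_kwh=0.28))   # ~2.5 years
```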
Procedia PDF Downloads 494
7238 Combined Safety and Cybersecurity Risk Assessment for Intelligent Distributed Grids
Authors: Anders Thorsén, Behrooz Sangchoolie, Peter Folkesson, Ted Strandberg
Abstract:
As more parts of the power grid become connected to the internet, the risk of cyberattacks increases. To identify the cybersecurity threats and subsequently reduce vulnerabilities, the common practice is to carry out a cybersecurity risk assessment. For safety classified systems and products, there is also a need for safety risk assessments in addition to the cybersecurity risk assessment in order to identify and reduce safety risks. These two risk assessments are usually done separately, but since cybersecurity and functional safety are often related, a more comprehensive method covering both aspects is needed. Some work addressing this has been done for specific domains like the automotive domain, but more general methods suitable for, e.g., intelligent distributed grids are still missing. One such method from the automotive domain is the Security-Aware Hazard Analysis and Risk Assessment (SAHARA) method, which combines safety and cybersecurity risk assessments. This paper presents an approach where the SAHARA method has been modified in order to be more suitable for larger distributed systems. The adapted SAHARA method has a more general risk assessment approach than the original SAHARA. The proposed method has been successfully applied to two use cases of an intelligent distributed grid.
Keywords: intelligent distribution grids, threat analysis, risk assessment, safety, cybersecurity
Procedia PDF Downloads 153
7237 Classification of Manufacturing Data for Efficient Processing on an Edge-Cloud Network
Authors: Onyedikachi Ulelu, Andrew P. Longstaff, Simon Fletcher, Simon Parkinson
Abstract:
The widespread interest in 'Industry 4.0' or 'digital manufacturing' has led to significant research requiring the acquisition of data from sensors, instruments, and machine signals. In-depth research then identifies methods of analysis of the massive amounts of data generated before and during manufacture to solve a particular problem. The ultimate goal is for industrial Internet of Things (IIoT) data to be processed automatically to assist with either visualisation or autonomous system decision-making. However, the collection and processing of data in an industrial environment come with a cost. Little research has been undertaken on how to specify optimally what data to capture, transmit, process, and store at various levels of an edge-cloud network. The first step in this specification is to categorise IIoT data for efficient and effective use. This paper proposes the required attributes and classification to take manufacturing digital data from various sources to determine the most suitable location for data processing on the edge-cloud network. The proposed classification framework will minimise overhead in terms of network bandwidth/cost and processing time of machine tool data via efficient decision making on which dataset should be processed at the 'edge' and what to send to a remote server (cloud). A fast-and-frugal heuristic method is implemented for this decision-making. The framework is tested using case studies from industrial machine tools for machine productivity and maintenance.
Keywords: data classification, decision making, edge computing, industrial IoT, industry 4.0
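A fast-and-frugal tree is a short chain of single-attribute checks, each with an immediate exit. The sketch below illustrates the pattern for edge-versus-cloud placement; the attributes and thresholds are hypothetical stand-ins for the paper's classification framework.

```python
# Fast-and-frugal tree pattern for data placement; attributes and
# thresholds are hypothetical, not the paper's classification attributes.
from dataclasses import dataclass

@dataclass
class Dataset:
    latency_critical: bool      # needed for real-time machine decisions?
    volume_mb_per_min: float    # raw stream size
    retention_required: bool    # must it be archived long-term?

def placement(d: Dataset, bandwidth_mb_per_min: float = 50.0) -> str:
    if d.latency_critical:                 # exit 1: decide at the machine
        return "edge"
    if d.volume_mb_per_min > bandwidth_mb_per_min:
        return "edge"                      # exit 2: too costly to transmit
    if d.retention_required:               # exit 3: archive centrally
        return "cloud"
    return "edge"                          # default: keep traffic local

print(placement(Dataset(False, 120.0, True)))   # -> "edge" (bandwidth-bound)
```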
Procedia PDF Downloads 182
7236 Constructions of Linear and Robust Codes Based on Wavelet Decompositions
Authors: Alla Levina, Sergey Taranov
Abstract:
The classical approach to providing noise immunity and integrity for information processed in computing devices and communication channels is to use linear codes. Linear codes have fast and efficient algorithms for encoding and decoding information, but these codes concentrate their detection and correction abilities on certain error configurations. Robust codes can protect against any configuration of errors at a predetermined probability. This is accomplished by the use of perfect nonlinear and almost perfect nonlinear functions to calculate the code redundancy. The paper presents an error-correcting coding scheme using the biorthogonal wavelet transform. The wavelet transform is applied in various fields of science; some of its applications are cleaning signals of noise, data compression, and spectral analysis of signal components. The article suggests methods for constructing linear codes based on wavelet decomposition. For the developed constructions, we build generator and check matrices that contain the scaling function coefficients of the wavelet. Based on the linear wavelet codes, we develop robust codes that provide uniform protection against all errors. We propose two constructions of robust code. The first class of robust code is based on the multiplicative inverse in a finite field. In the second robust code construction, the redundancy part is the cube of the information part. This paper also investigates the characteristics of the proposed robust and linear codes.
Keywords: robust code, linear code, wavelet decomposition, scaling function, error masking probability
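The second construction lends itself to a compact illustration: the redundancy symbol is the cube of the information symbol, and detection rests on a nonlinear check. The sketch below works in a prime field GF(p) for simplicity; the paper's codes are built on wavelet-based linear codes over other fields, which this does not reproduce.

```python
# Sketch of a cubic robust-code check over a prime field GF(p); this is a
# simplified stand-in for the paper's wavelet-based construction.
P = 2**13 - 1   # a prime, so integers mod P form a field

def encode(x: int) -> tuple[int, int]:
    return x % P, pow(x, 3, P)          # codeword (information, x^3)

def check(word: tuple[int, int]) -> bool:
    x, r = word
    return pow(x, 3, P) == r % P        # nonlinear check equation

x, r = encode(1234)
assert check((x, r))
# An additive error (e1, e2) is masked only if (x+e1)^3 - x^3 == e2 in
# GF(P), which holds for few x; hence the low error masking probability.
print(check((x ^ 0x10, r)))             # a corrupted symbol is detected -> False
```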
Procedia PDF Downloads 489
7235 Spatially Distributed Rainfall Prediction Based on Automated Kriging for Landslide Early Warning Systems
Authors: Ekrem Canli, Thomas Glade
Abstract:
The precise prediction of rainfall in space and time is a key element of most landslide early warning systems. Unfortunately, the spatial variability of rainfall in many early warning applications is often disregarded. A common simplification is to use uniformly distributed rainfall to characterize areal rainfall intensity. With spatially differentiated rainfall information, real-time comparison with rainfall thresholds or implementation in process-based approaches might form the basis for improved landslide warnings. This study suggests an automated workflow from the hourly, web-based collection of rain gauge data to the generation of spatially differentiated rainfall predictions based on kriging. Because the application of kriging is usually a labor-intensive task, a simplified and consequently automated variogram modeling procedure was applied to up-to-date rainfall data. The entire workflow was carried out purely with open source technology. Validation results, albeit promising, pointed out the challenges that are involved in purely distance-based, automated geostatistical interpolation techniques for ever-changing environmental phenomena over short temporal and spatial extents.
Keywords: kriging, landslide early warning system, spatial rainfall prediction, variogram modelling, web scraping
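A minimal sketch of the automated interpolation step, using the open-source PyKrige library as a stand-in: it fits the chosen variogram model's parameters to the empirical semivariogram automatically, much like the simplified procedure described above. Gauge coordinates and rainfall values are synthetic.

```python
# Automated ordinary kriging sketch with PyKrige; gauge data is synthetic,
# standing in for the hourly web-scraped rain gauge records.
import numpy as np
from pykrige.ok import OrdinaryKriging

rng = np.random.default_rng(1)
x = rng.uniform(0, 50, 30)            # gauge easting (km)
y = rng.uniform(0, 50, 30)            # gauge northing (km)
rain = rng.gamma(2.0, 2.0, 30)        # hourly rainfall (mm)

ok = OrdinaryKriging(x, y, rain, variogram_model="spherical",
                     nlags=6, enable_plotting=False)
gridx = np.arange(0.0, 50.0, 1.0)
gridy = np.arange(0.0, 50.0, 1.0)
field, variance = ok.execute("grid", gridx, gridy)
# 'field' is the spatially differentiated rainfall map to compare against
# landslide-triggering thresholds; 'variance' flags poorly constrained areas.
print(field.shape, float(field.max()))
```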
Procedia PDF Downloads 280
7234 Developing a Green Information Technology Model in Australian Higher-Educational Institutions
Authors: Mahnaz Jafari, Parisa Izadpanahi, Francesco Mancini, Muhammad Qureshi
Abstract:
The advancement in Information Technology (IT) has been an intrinsic element in the developments of the 21st century bringing benefits such as increased economic productivity. However, its widespread application has also been associated with inadvertent negative impacts on society and the environment necessitating selective interventions to mitigate these impacts. This study responded to this need by developing a Green IT Rating Tool (GIRT) for higher education institutions (HEI) in Australia to evaluate the sustainability of IT-related practices from an environmental, social, and economic perspective. Each dimension must be considered equally to achieve sustainability. The development of the GIRT was informed by the views of interviewed IT professionals whose opinions formed the basis of a framework listing Green IT initiatives in order of their importance as perceived by the interviewed professionals. This framework formed the base of the GIRT, which identified Green IT initiatives (such as videoconferencing as a substitute for long-distance travel) and the associated weighting of each practice. The proposed sustainable Green IT model could be integrated into existing IT systems, leading to significant reductions in carbon emissions and e-waste and improvements in energy efficiency. The development of the GIRT and the findings of this study have the potential to inspire other organizations to adopt sustainable IT practices, positively impact the environment, and be used as a reference by IT professionals and decision-makers to evaluate IT-related sustainability practices. The GIRT could also serve as a benchmark for HEIs to compare their performance with other institutions and to track their progress over time. Additionally, the study's results suggest that virtual and cloud-based technologies could reduce e-waste and energy consumption in the higher education sector. Overall, this study highlights the importance of incorporating Green IT practices into the IT systems of HEI to contribute to a more sustainable future.
Keywords: green information technology, international higher-educational institution, sustainable solutions, environmentally friendly IT systems
Procedia PDF Downloads 76
7233 The Effect of Implant Design on the Height of Inter-Implant Bone Crest: A 10-Year Retrospective Study of the Astra Tech Implant and Branemark Implant
Authors: Daeung Jung
Abstract:
Background: For patients with missing teeth, multiple implant restoration is widely used and often unavoidable. To increase its survival rate, it is important to understand the influence of different implant designs on inter-implant crestal bone resorption. Several implant systems are designed to minimize the loss of crestal bone, and the Astra Tech and Brånemark implants are two of them. Aim/Hypothesis: The aim of this 10-year study was to compare the height of the inter-implant bone crest in two implant systems: the Astra Tech and the Brånemark implant system. Material and Methods: In this retrospective study, 40 consecutively treated patients were included; 23 patients with 30 sites for the Astra Tech system and 17 patients with 20 sites for the Brånemark system. The implant restorations comprised splinted crowns in partially edentulous patients. Radiographs were taken immediately after the first surgery, at impression making, at prosthesis delivery, and annually after loading. The lateral distance from implant to bone crest and the inter-implant distance were gauged, and crestal bone height was measured from the implant shoulder to the first bone contact. Calibrations were performed against the known thread pitch distance for vertical measurements, and against the known diameter of the abutment or fixture for horizontal measurements, using ImageJ. Results: After 10 years, patients treated with the Astra Tech implant system demonstrated less inter-implant crestal bone resorption when implants had a distance of 3 mm or less between them. In cases of implants with a distance greater than 3 mm, however, there appeared to be no statistically significant difference in crestal bone loss between the two systems. Conclusion and clinical implications: For partially edentulous patients planning to have more than two implants, the inter-implant distance is one of the most important factors to be considered. If sufficient inter-implant distance cannot be ensured, implants with a smaller microgap at the fixture-abutment junction, a less traumatic second-stage surgical approach, and an adequate surface topography would be appropriate options to minimize inter-implant crestal bone resorption.
Keywords: implant design, crestal bone loss, inter-implant distance, 10-year retrospective study
Procedia PDF Downloads 166
7232 Enhancing Robustness in Federated Learning through Decentralized Oracle Consensus and Adaptive Evaluation
Authors: Peiming Li
Abstract:
This paper presents an innovative blockchain-based approach to enhance the reliability and efficiency of federated learning systems. By integrating a decentralized oracle consensus mechanism into the federated learning framework, we address key challenges of data and model integrity. Our approach utilizes a network of redundant oracles, functioning as independent validators within an epoch-based training system in the federated learning model. In federated learning, data is decentralized, residing on various participants' devices. This scenario often leads to concerns about data integrity and model quality. Our solution employs blockchain technology to establish a transparent and tamper-proof environment, ensuring secure data sharing and aggregation. The decentralized oracles, a concept borrowed from blockchain systems, act as unbiased validators. They assess the contributions of each participant using a Hidden Markov Model (HMM), which is crucial for evaluating the consistency of participant inputs and safeguarding against model poisoning and malicious activities. Our methodology's distinct feature is its epoch-based training. An epoch here refers to a specific training phase where data is updated and assessed for quality and relevance. The redundant oracles work in concert to validate data updates during these epochs, enhancing the system's resilience to security threats and data corruption. The effectiveness of this system was tested using the MNIST dataset, a standard in machine learning for benchmarking. Results demonstrate that our blockchain-oriented federated learning approach significantly boosts system resilience, addressing the common challenges of federated environments. This paper aims to make these advanced concepts accessible, even to those with a limited background in blockchain or federated learning. We provide a foundational understanding of how blockchain technology can revolutionize data integrity in decentralized systems and explain the role of oracles in maintaining model accuracy and reliability.
Keywords: federated learning system, blockchain, decentralized oracles, hidden Markov model
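A deliberately simplified sketch of the epoch-based consensus follows: each redundant oracle scores a client update and the epoch accepts it on a majority vote. The paper scores consistency with a Hidden Markov Model; this stand-in replaces that with a plain z-score test, so it illustrates the voting structure only.

```python
# Simplified oracle-consensus sketch; the HMM-based consistency scoring of
# the paper is replaced here by a plain z-score deviation test.
import numpy as np

def oracle_vote(update, cohort, threshold, rng):
    # per-oracle noisy view of how far this update sits from the cohort mean
    z = np.linalg.norm(update - cohort.mean(axis=0)) / (cohort.std() + 1e-12)
    return (z + rng.normal(0, 0.1)) < threshold

def epoch_consensus(updates, n_oracles=5, threshold=3.0, seed=0):
    rng = np.random.default_rng(seed)
    cohort = np.stack(updates)
    accepted = []
    for u in updates:
        votes = sum(oracle_vote(u, cohort, threshold, rng)
                    for _ in range(n_oracles))
        if votes > n_oracles // 2:          # majority of redundant oracles
            accepted.append(u)
    return np.mean(accepted, axis=0)        # aggregate only validated updates

honest = [np.random.default_rng(i).normal(0, 1, 10) for i in range(9)]
poisoned = [np.full(10, 25.0)]              # crude model-poisoning attempt
global_update = epoch_consensus(honest + poisoned)  # poisoned update rejected
```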
Procedia PDF Downloads 63
7231 The Time-Frequency Domain Reflection Method for Aircraft Cable Defects Localization
Authors: Reza Rezaeipour Honarmandzad
Abstract:
This paper introduces an aircraft cable fault detection and location method based on time-frequency domain reflectometry (TFDR), with the goal of recognizing intermittent faults adequately and coping with the serial and after-connector issues that are hard to distinguish in time domain reflectometry. In this strategy, the correlation function of the reflected and reference signals is used to recognize and locate the aircraft fault according to the characteristics of the reflected and reference signals in the time-frequency domain, so the hit rate of detecting and locating intermittent faults can be enhanced considerably. In practice, the reflected signal is corrupted by noise and false alarms happen frequently, so a threshold de-noising technique based on wavelet decomposition is used to diminish the noise interference and lower the false alarm rate. Then the time-frequency cross-correlation function of the reference signal and the reflected signal, based on the Wigner-Ville distribution, is computed in order to locate the fault position. Finally, LabVIEW is used to implement the operation and control interface, the primary function of which is to connect to and control MATLAB and LabSQL. Using the strong computing capability and the extensive function library of MATLAB, the signal processing is easily realized; in addition, LabVIEW makes the system more reliable and easily upgradable.
Keywords: aircraft cable, fault location, TFDR, LabVIEW
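The core localization step is a delay estimate between the reference pulse and its echo, converted to distance. The sketch below swaps the Wigner-Ville time-frequency correlation for a plain time-domain cross-correlation, and the sampling rate and propagation velocity are assumed values.

```python
# Reflectometry localization sketch: delay between reference pulse and echo
# gives the fault distance. Plain cross-correlation stands in for the
# Wigner-Ville time-frequency correlation; fs and v are assumed.
import numpy as np

fs = 1e9                                   # 1 GS/s sampling (assumed)
v = 2.0e8                                  # propagation speed in cable, m/s
t = np.arange(0, 2e-6, 1 / fs)

ref = np.exp(-((t - 0.1e-6) ** 2) / (10e-9) ** 2) * np.cos(2e8 * np.pi * t)
delay_true = 2 * 15.0 / v                  # fault at 15 m -> round trip
reflected = 0.4 * np.exp(-((t - 0.1e-6 - delay_true) ** 2) / (10e-9) ** 2) \
            * np.cos(2e8 * np.pi * (t - delay_true))
reflected += np.random.default_rng(0).normal(0, 0.02, t.size)  # channel noise

xcorr = np.correlate(reflected, ref, mode="full")
lag = xcorr.argmax() - (len(ref) - 1)      # samples between pulse and echo
distance = v * (lag / fs) / 2.0            # halve the round-trip time
print(f"estimated fault distance: {distance:.2f} m")   # ~15.00 m
```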
Procedia PDF Downloads 477
7230 The Integration of Geographical Information Systems and Capacitated Vehicle Routing Problem with Simulated Demand for Humanitarian Logistics in Tsunami-Prone Area: A Case Study of Phuket, Thailand
Authors: Kiatkulchai Jitt-Aer, Graham Wall, Dylan Jones
Abstract:
As a result of the Indian Ocean tsunami in 2004, logistics applied to disaster relief operations has received great attention in the humanitarian sector. As learned from that disaster, preparing for and delivering essential items from distribution centres to affected locations is important for relief operations, as the nature of disasters is uncertain, especially in casualty figures, to which the required quantity of supplies is normally proportional. Thus, this study proposes a spatial decision support system (SDSS) for humanitarian logistics by integrating Geographical Information Systems (GIS) and the capacitated vehicle routing problem (CVRP). The GIS is utilised for acquiring demands simulated from the tsunami flooding model of the affected area in the first stage, and for visualising the simulation solutions in the last stage. The CVRP in this study encompasses designing the relief routes of a set of homogeneous vehicles from a relief centre to a set of geographically distributed evacuation points whose demands are estimated using both simulation and randomisation techniques. The CVRP is modelled as a multi-objective optimization problem where both total travelling distance and total transport resources used are minimized, while the demand-cost efficiency of each route is maximized in order to determine route priority. As the model is an NP-hard combinatorial optimization problem, the Clarke and Wright savings heuristic is proposed to solve the problem for near-optimal solutions. Real-case instances in the coastal area of Phuket, Thailand, are studied to demonstrate the SDSS, which allows a decision maker to visually analyse the simulation scenarios through different decision factors.
Keywords: demand simulation, humanitarian logistics, geographical information systems, relief operations, capacitated vehicle routing problem
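A minimal sketch of the Clarke and Wright savings heuristic on synthetic data (not the Phuket case data): every evacuation point starts on its own depot round trip, and routes are merged in descending order of savings while vehicle capacity holds.

```python
# Clarke-Wright savings sketch for a single-depot CVRP: merge routes in
# descending order of s(i,j) = d(0,i) + d(0,j) - d(i,j) under capacity.
import numpy as np

rng = np.random.default_rng(7)
pts = rng.uniform(0, 20, (10, 2))           # evacuation points
depot = np.array([10.0, 10.0])              # relief centre
demand = rng.integers(1, 5, 10)             # simulated demands
CAP = 12                                    # homogeneous vehicle capacity

def d(a, b): return float(np.linalg.norm(a - b))

routes = {i: [i] for i in range(10)}        # one round trip per point
load = {i: int(demand[i]) for i in range(10)}
savings = sorted(((d(depot, pts[i]) + d(depot, pts[j]) - d(pts[i], pts[j]), i, j)
                  for i in range(10) for j in range(i + 1, 10)), reverse=True)

for s, i, j in savings:
    ri = next(r for r in routes.values() if i in r)
    rj = next(r for r in routes.values() if j in r)
    # merge only at route ends, distinct routes, and within capacity
    if ri is not rj and ri[-1] == i and rj[0] == j \
            and load[ri[0]] + load[rj[0]] <= CAP:
        del routes[rj[0]]
        routes[ri[0]] = ri + rj
        load[ri[0]] += load.pop(rj[0])

for r in routes.values():
    print("route:", [0] + [p + 1 for p in r] + [0])   # 0 = relief centre
```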
Procedia PDF Downloads 248
7229 Micro-Droplet Formation in a Microchannel under the Effect of an Electric Field: Experiment
Authors: Sercan Altundemir, Pinar Eribol, A. Kerem Uguz
Abstract:
Microfluidic systems allow many large-scale laboratory applications to be miniaturized on a single device in order to reduce cost and advance fluid control. Moreover, such systems make it possible to generate and control droplets, which play a significant role in improved analysis for many chemical and biological applications; for example, they can be employed as models for cells in microfluidic systems. In this work, the interfacial instability of two immiscible Newtonian liquids flowing in a microchannel is investigated. When two immiscible liquids are in the laminar regime, a flat interface is formed between them. If a direct current electric field is applied, the interface may deform, i.e., become unstable, and it may rupture and form micro-droplets. First, the effects of the thickness ratio, total flow rate, and viscosity ratio of the silicone oil and ethylene glycol liquid couple on the critical voltage at which the interface starts to destabilize are investigated. Then the droplet sizes are measured under the effect of these parameters at various voltages. Moreover, the effect of the total flow rate on the time elapsed for the interface to be ruptured into droplets by hitting the wall of the channel is analyzed. It is observed that an increase in the viscosity or the thickness ratio of the silicone oil to the ethylene glycol has a stabilizing effect, i.e., a higher voltage is needed, while the total flow rate has no effect on it. However, an increase in the total flow rate shortens the time elapsed for the interface to hit the wall. Moreover, the droplet size decreases down to 0.1 μL with an increase in the applied voltage, the viscosity ratio, or the total flow rate, or with a decrease in the thickness ratio. In addition to these observations, two empirical models are established for determining the critical electric number, i.e., the dimensionless voltage, and the droplet size, together with a third model, a combination of the first two, for determining the droplet size at the critical voltage.
Keywords: droplet formation, electrohydrodynamics, microfluidics, two-phase flow
Procedia PDF Downloads 176
7228 An Integral Sustainable Design Evaluation of the 15-Minute City and the Processes of Transferability to Cities of the Global South
Authors: Chitsanzo Isaac
Abstract:
Across the world, the ongoing Covid-19 pandemic has challenged urban systems and policy frameworks, highlighting societal vulnerabilities and systemic inequities among many communities. Measures of confinement and social distancing to contain the Covid-19 virus have fragmented the physical and social fabric of cities. This has caused urban dwellers to reassess how they engage with their urban surroundings and maintain social ties. Urbanists have presented strategies that would allow communities to survive and even thrive in extraordinary times of crisis like the pandemic. Tactical urbanism, particularly the 15-Minute City, has gained popularity. It is considered a resilient approach in the global north; however, its transferability to the global south has been called into question. To this end, this paper poses the question: to what extent is the 15-Minute City framework integral sustainable design, and are there processes that make it adoptable by cities in the global south? This paper explores four issues using secondary quantitative data analysis and convergence analysis in the Paris and Blantyre urban regions. First, it questions how the 15-Minute City has been defined and measured, and how it impacts urban dwellers. Second, it examines the extent to which the 15-minute city performs under the lens of frameworks such as Wilber's integral theory and Fleming's integral sustainable design theory. Third, it examines the processes that can be transferred to developing cities which foster community resilience through the perspectives of experience, behaviors, cultures, and systems. Finally, it reviews the principal ways in which a multi-perspective reality can be the basis for resilient community design and sustainable urban development. This work will shed light on the importance of a multi-perspective reality as a means of achieving sustainable urban design goals in developing urban areas.
Keywords: 15-minute city, developing cities, global south, community resilience, integral sustainable design, systems thinking, complexity, tactical urbanism
Procedia PDF Downloads 150
7227 Performance Analysis of High Temperature Heat Pump Cycle for Industrial Process
Authors: Seon Tae Kim, Robert Hegner, Goksel Ozuylasi, Panagiotis Stathopoulos, Eberhard Nicke
Abstract:
High-temperature heat pumps (HTHP) that can supply heat at temperatures above 200°C can enhance the energy efficiency of industrial processes and reduce the CO₂ emissions connected with the heat supply of these processes. In the current work, the thermodynamic performance of three different vapor compression cycles, which use R-718 (water) as a working medium, has been evaluated using a commercial process simulation tool (EBSILON Professional). All considered cycles use two-stage vapor compression with intercooling between stages. The main aim of the study is to compare different intercooling strategies and to study possible heat recovery scenarios within the intercooling process. This comparison has been carried out by computing the coefficient of performance (COP), the heat supply temperature level, and the respective mass flow rate of water for all cycle architectures. With increasing temperature difference between the heat source and heat sink, ∆T, the COP values decreased as expected, and the highest COP value was found for the cycle configurations where both compressors have the same pressure ratio (PR). An investigation of the HTHP capacities with optimized PR, together with an exergy analysis, has also been carried out. The internal heat exchanger cycle with the inward direction of secondary flow (IHX-in) showed a higher temperature level and exergy efficiency compared to the other cycles. Moreover, the available operating range was estimated by considering mechanical limitations.
Keywords: high temperature heat pump, industrial process, vapor compression cycle, R-718 (water), thermodynamic analysis
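The reported COP trend can be sanity-checked with a Carnot bound scaled by an assumed second-law efficiency; the sketch below is that back-of-envelope estimate, not the EBSILON cycle model.

```python
# Back-of-envelope COP trend: COP_heating <= T_sink / (T_sink - T_source),
# scaled by an assumed second-law efficiency. This is not the paper's
# two-stage R-718 cycle model, only a consistency check on the trend.
def cop_heating(t_source_c: float, t_sink_c: float, eta_ii: float = 0.5) -> float:
    t_source, t_sink = t_source_c + 273.15, t_sink_c + 273.15
    return eta_ii * t_sink / (t_sink - t_source)

for lift in (60, 100, 140):                 # temperature lift delta-T in K
    print(lift, round(cop_heating(100.0, 100.0 + lift), 2))
# -> COP falls monotonically as delta-T grows, matching the reported trend
```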
Procedia PDF Downloads 149
7226 Study and Calibration of Autonomous UAV Systems With Thermal Sensing With Multi-purpose Roles
Authors: Raahil Sheikh, Prathamesh Minde, Priya Gujjar, Himanshu Dwivedi, Abhishek Maurya
Abstract:
UAVs have been part of our environment since they were first used by the Austrian military against Venice. At that stage, they were just pilotless balloons equipped with bombs to be dropped on enemy territory. Over time, technological advancements allowed UAVs to be controlled remotely or autonomously. This study mainly focuses on the enhancement of pre-existing manual drones, equipping them with a variety of sensors, making them autonomous and capable, and purposing them for a variety of roles, including thermal sensing, data collection, tracking of creatures and forest fires, volcano detection, hydrothermal studies, urban heat island measurement, and other environmental research. The system can also be used for reconnaissance, research, 3D mapping, and search and rescue missions. This study mainly focuses on automating tedious tasks and reducing human error as much as possible, reducing deployment time, and increasing the overall efficiency, efficacy, and reliability of the UAVs. A comprehensive Ground Control System (GCS) UI is created, enabling less-trained professionals to use the UAV with maximum potency. With the inclusion of such an autonomous system, paths can be planned with artificial intelligence, and environmental gusts and other concerns can be avoided.
Keywords: UAV, autonomous systems, drones, geo thermal imaging
Procedia PDF Downloads 86
7225 A Review of Material and Methods Used in Liner Layers in Various Landfills
Authors: S. Taghvamanesh
Abstract:
Modern landfills are highly engineered containment systems designed to reduce the environmental and human health impacts of solid waste. In modern landfills, waste is contained by a liner system. The primary goal of the liner system is to isolate the landfill contents from the environment, thereby protecting the soil and groundwater from pollution caused by landfill leachate. Landfill leachate is the most serious threat to groundwater, so it is necessary to design a system that prevents the penetration of this dangerous substance into the environment. These layers are made up of two basic elements: clay and geosynthetics. Hydraulic conductivity and flexibility are two desirable properties of these materials. Three different types of liner systems are discussed in this paper. Based on available data, the current article analyzed materials and methods for constructing liner layers for distinct leachates containing various harmful components and heavy metals from around the world. This study also attempted to gather data on the leachates of each of the sites discussed. In conclusion, every landfill requires a specific type of liner, which depends on the type of leachate it produces daily. It should also be emphasized that, based on available data, this article considered the number of landfills that each country or continent possesses.
Keywords: landfill, liner layer, impervious layer, barrier layer
Procedia PDF Downloads 78
7224 Investigation of Processing Conditions on Rheological Features of Emulsion Gels and Oleogels Stabilized by Biopolymers
Authors: M. Sarraf, J. E. Moros, M. C. Sánchez
Abstract:
Oleogels are self-standing systems that are able to trap edible liquid oil in a tridimensional network and also help reduce fat use through the crystallization of oleogelators. There are different ways to achieve oleogelation and oil structuring, including direct dispersion, structured biphasic systems, oil sorption, and the indirect (emulsion-template) method. The selection of processing conditions, as well as the composition of the oleogels, is essential to obtain a stable oleogel with characteristics suitable for its purpose. In this sense, one class of ingredients widely used in food products to produce oleogels and emulsions is polysaccharides. Basil seed gum (BSG), from the plant Ocimum basilicum, is a new native polysaccharide used in the food industry, with high viscosity and pseudoplastic behavior because of its high molecular weight. Also, proteins can stabilize oil in water due to the presence of amino and carboxyl moieties that result in surface activity. Whey proteins are widely used in the food industry because they are available, cheap ingredients with nutritional and functional characteristics, acting as emulsifiers and gelling agents with thickening and water-binding capacity. In general, the interaction of proteins and polysaccharides has a significant effect on food structures and their stability, like the texture of dairy products, by controlling the interactions in macromolecular systems. Using edible oleogels for oil structuring helps with the targeted delivery of a component trapped in the structural network; therefore, the development of an efficient oleogel is essential in the food industry. A complete understanding of the important factors, such as the oil phase ratio, processing conditions, and concentrations of biopolymers, that affect the formation and stability of the emulsion can provide crucial information for the production of a suitable oleogel. In this research, the effects of oil concentration and of the pressure used in the manufacture of the emulsion prior to obtaining the oleogel have been evaluated through the analysis of droplet size and the rheological properties of the obtained emulsions and oleogels. The results show that emulsions prepared in the high-pressure homogenizer (HPH) at higher pressure values have smaller droplet sizes and a higher uniformity in the size distribution curve. On the other hand, in relation to the rheological characteristics of the emulsions and oleogels obtained, the predominantly elastic character of the systems must be noted, as they present storage modulus values higher than the loss modulus, also showing an important plateau zone, typical of structured systems. In the same way, steady-state viscous flow tests on both emulsions and oleogels confirm that the pressure used in the homogenizer is an important factor for obtaining emulsions with adequate droplet size and the subsequent oleogel. Thus, various routes for trapping oil inside a biopolymer matrix with adjustable mechanical properties could be applied to create the three-dimensional network needed for oil absorption and oleogel formation.
Keywords: basil seed gum, particle size, viscoelastic properties, whey protein
Procedia PDF Downloads 66
7223 A Monopole Intravascular Antenna with Three Parasitic Elements Optimized for Higher Tesla MRI Systems
Authors: Mohammad Mohammadzadeh, Alireza Ghasempour
Abstract:
In this paper, a new design of monopole antenna is proposed that increases the contrast of intravascular magnetic resonance images by increasing the homogeneity of the intrinsic signal-to-noise ratio (ISNR) distribution around the antenna. The antenna is made of a coaxial cable with three parasitic elements. The lengths and positions of the elements are optimized by an improved genetic algorithm (IGA) for 1.5, 3, 4.7, and 7 Tesla MRI systems based on a defined cost function. Simulations were also conducted to verify the performance of the designed antenna. Our simulation results show that each time the IGA is executed, different values for the parasitic elements are obtained, such that the cost functions of those antennas are high; according to the obtained results, the IGA can also find the best values for the parasitic elements (regarding the cost function) in subsequent executions. Additionally, two-dimensional and one-dimensional maps of ISNR were drawn for the proposed antenna and compared to a previously published monopole antenna with one parasitic element at a frequency of 64 MHz inside a saline phantom. The results verified that, in spite of a decrease in ISNR, there is a considerable improvement in the homogeneity of the ISNR distribution of the proposed antenna, so that their product increases.
Keywords: intravascular MR antenna, monopole antenna, parasitic elements, signal-to-noise ratio (SNR), genetic algorithm
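A skeleton of the genetic optimization loop is sketched below: chromosomes hold the lengths and positions of the three parasitic elements, with one-point crossover and Gaussian mutation. The cost function is a placeholder (the paper's real cost is computed from the simulated ISNR homogeneity around the antenna), and the element bounds are assumed.

```python
# GA skeleton for optimizing three parasitic elements; the cost function
# is a placeholder, not an electromagnetic / ISNR simulation.
import numpy as np

rng = np.random.default_rng(42)
N, GENES, GENS = 40, 6, 100          # population, (3 lengths + 3 positions)
LO, HI = 0.005, 0.150                # element bounds in metres (assumed)

def cost(ch):                         # placeholder fitness only
    lengths, positions = ch[:3], ch[3:]
    return -np.var(np.diff(np.sort(positions))) - np.var(lengths)

pop = rng.uniform(LO, HI, (N, GENES))
for _ in range(GENS):
    fit = np.array([cost(c) for c in pop])
    pop = pop[fit.argsort()[::-1]]    # elitist sort, best first
    children = []
    while len(children) < N // 2:
        a, b = pop[rng.integers(0, N // 2, 2)]       # parents from elite half
        cut = rng.integers(1, GENES)
        child = np.concatenate([a[:cut], b[cut:]])   # one-point crossover
        mask = rng.random(GENES) < 0.1               # mutation
        child[mask] += rng.normal(0, 0.01, mask.sum())
        children.append(np.clip(child, LO, HI))
    pop = np.vstack([pop[: N // 2]] + children)      # elite + offspring

print("best chromosome:", pop[0].round(4))
```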
Procedia PDF Downloads 299
7222 A Predictive Model for Turbulence Evolution and Mixing Using Machine Learning
Authors: Yuhang Wang, Jorg Schluter, Sergiy Shelyag
Abstract:
The high cost associated with high-resolution computational fluid dynamics (CFD) is one of the main challenges that inhibit the design, development, and optimisation of new combustion systems adapted for renewable fuels. In this study, we propose a physics-guided CNN-based model to predict turbulence evolution and mixing without requiring a traditional CFD solver. The model architecture is built upon U-Net and the inception module, while a physics-guided loss function is designed by introducing two additional physical constraints to allow for the conservation of both mass and pressure over the entire predicted flow fields. The model is then trained on the Large Eddy Simulation (LES) results of a natural turbulent mixing layer for two different Reynolds number cases (Re = 3000 and 30000). As a result, the model prediction shows excellent agreement with the corresponding CFD solutions in terms of both the spatial distributions and the temporal evolution of turbulent mixing. Such promising prediction performance opens up the possibility of performing accurate high-resolution manifold-based combustion simulations at a low computational cost, accelerating the iterative design process for new combustion systems.
Keywords: computational fluid dynamics, turbulence, machine learning, combustion modelling
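A sketch of a physics-guided loss of this kind follows: a data term plus two soft penalties for mass and pressure conservation. The abstract does not give the exact constraint forms, so global conservation between predicted and target frames is assumed here, along with the channel layout.

```python
# Physics-guided loss sketch: MSE data term plus soft penalties assumed to
# enforce global mass and pressure conservation between frames. Channel
# layout (0 = mixture-fraction-like scalar, 1 = pressure) is an assumption.
import torch

def physics_guided_loss(pred, target, lam_mass=0.1, lam_p=0.1):
    """pred/target: (batch, channels, H, W) tensors."""
    data = torch.mean((pred - target) ** 2)
    # mass: total scalar content of the predicted frame should match target
    mass = torch.mean((pred[:, 0].sum(dim=(-2, -1))
                       - target[:, 0].sum(dim=(-2, -1))) ** 2)
    # pressure: mean field pressure should be conserved as well
    pres = torch.mean((pred[:, 1].mean(dim=(-2, -1))
                       - target[:, 1].mean(dim=(-2, -1))) ** 2)
    return data + lam_mass * mass + lam_p * pres

pred = torch.rand(4, 2, 64, 64, requires_grad=True)
target = torch.rand(4, 2, 64, 64)
loss = physics_guided_loss(pred, target)
loss.backward()                       # differentiable end to end
```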
Procedia PDF Downloads 91
7221 The Perception on 21st Century Skills of Nursing Instructors and Nursing Students at Boromarajonani College of Nursing, Chonburi
Authors: Kamolrat Turner, Somporn Rakkwamsuk, Ladda Leungratanamart
Abstract:
The aim of this descriptive study was to determine the perception of 21st century skills among nursing instructors and nursing students at Boromarajonani College of Nursing, Chonburi. A total of 38 nursing instructors and 75 second-year nursing students took part in the study. Data were collected with a 21st century skills questionnaire comprising 63 items, and descriptive statistics were used to describe the findings. The results show that the overall mean scores of nursing instructors' perception of 21st century skills were at a high level. The highest mean scores were recorded for computing and ICT literacy and for career and learning skills; the lowest mean scores were recorded for reading and writing and for mathematics. The overall mean scores of nursing students' perception of 21st century skills were also at a high level. The highest mean scores were recorded for computing and ICT literacy, within which the highest item mean score was competency in computer programs. The lowest mean scores were recorded for the reading, writing, and mathematics components, in which the highest item mean score was reading Thai correctly, and the lowest item mean score was reading English and translating it for others correctly. The findings show that the perceptions of nursing instructors were consistent with those of nursing students. Moreover, any activities aiming to raise capacity in reading English and translating information for others should be taken into consideration.
Keywords: 21st century skills, perception, nursing instructor, nursing student
Procedia PDF Downloads 316
7220 Bioinformatic Strategies for the Production of Glycoproteins in Algae
Authors: Fadi Saleh, Çığdem Sezer Zhmurov
Abstract:
Biopharmaceuticals represent one of the fastest-developing fields within biotechnology, and the biological macromolecules produced inside cells have a variety of therapeutic applications. In the past, mammalian cells, especially CHO cells, have been employed in the production of biopharmaceuticals because these cells can achieve human-like post-translational modifications (PTMs). These systems, however, carry apparent disadvantages such as high production costs, vulnerability to contamination, and limitations in scalability. This research focuses on the utilization of microalgae as a bioreactor system for the synthesis of biopharmaceutical glycoproteins in relation to PTMs, particularly N-glycosylation, and points to a growing interest in microalgae as a potential substitute for more conventional expression systems. A number of advantages exist in the use of microalgae, including rapid growth rates, the absence of common human pathogens, controlled scalability in bioreactors, and the ability to perform some PTMs. Thus, the potential of microalgae to produce recombinant proteins with favorable characteristics makes this a promising platform for producing biopharmaceuticals. The study focuses on the examination of N-glycosylation pathways across different species of microalgae. This investigation is important as N-glycosylation, the process by which carbohydrate groups are linked to proteins, profoundly influences the stability, activity, and general performance of glycoproteins. Additionally, bioinformatics methodologies are employed to explain the genetic pathways implicated in N-glycosylation within microalgae, with the intention of modifying these organisms to produce glycoproteins suitable for human use. In this way, the present comparative analysis of the N-glycosylation pathways in humans and microalgae can be used to bridge both systems in order to produce biopharmaceuticals with humanized glycosylation profiles within microalgal organisms. The results of the research underline the potential of microalgae to help overcome some of the limitations associated with traditional biopharmaceutical production systems. The study may help in the creation of a cost-effective and scalable means of producing quality biopharmaceuticals by modifying microalgae genetically to produce glycoproteins with N-glycosylation that is compatible with humans. Such improvements will benefit biopharmaceutical production and the wider sector through this novel, green, and efficient expression platform. This thesis, therefore, is a thorough investigation into the viability of microalgae as an efficient platform for producing biopharmaceutical glycoproteins. Based on an in-depth bioinformatic analysis of microalgal N-glycosylation pathways, a platform for engineering them to produce human-compatible glycoproteins is set out in this work. The findings obtained in this research will have significant implications for the biopharmaceutical industry by opening up a new way of developing safer, more efficient, and economically more feasible biopharmaceutical manufacturing platforms.
Keywords: microalgae, glycoproteins, post-translational modification, genome
Procedia PDF Downloads 24
7219 Selecting Answers for Questions with Multiple Answer Choices in Arabic Question Answering Based on Textual Entailment Recognition
Authors: Anes Enakoa, Yawei Liang
Abstract:
The question answering (QA) system is one of the most important and demanding tasks in the field of Natural Language Processing (NLP). In QA systems, the answer generation task generates a list of candidate answers to the user's question, among which only one answer is correct. Answer selection is one of the main components of QA and is concerned with selecting the best answer choice from the candidate answers suggested by the system. However, the selection process can be very challenging, especially in Arabic, due to its particularities. To address this challenge, an approach is proposed to answer questions with multiple answer choices for Arabic QA systems based on Textual Entailment (TE) recognition. The developed approach employs a Support Vector Machine that considers lexical, semantic, and syntactic features in order to recognize the entailment between the generated hypotheses (H) and the text (T). A set of experiments has been conducted for performance evaluation, and the overall performance of the proposed method reached an accuracy of 67.5% with a C@1 score of 80.46%. The obtained results are promising and demonstrate that the proposed method is effective for the TE recognition task.
Keywords: information retrieval, machine learning, natural language processing, question answering, textual entailment
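The classifier pattern is featurize-then-SVM. The sketch below shows it with a single lexical overlap feature on toy English pairs; the paper's semantic and syntactic features and Arabic-specific preprocessing are omitted.

```python
# Featurize-then-SVM entailment sketch; only a lexical overlap feature is
# shown, and the (T, H) pairs are toy data, not the paper's Arabic corpus.
import numpy as np
from sklearn.svm import SVC

def features(text: str, hypothesis: str) -> list[float]:
    t, h = set(text.split()), set(hypothesis.split())
    overlap = len(t & h) / max(len(h), 1)      # how much of H is covered by T
    len_ratio = len(h) / max(len(t), 1)
    return [overlap, len_ratio]

pairs = [("the capital of france is paris", "paris is in france", 1),
         ("the capital of france is paris", "paris is in spain", 0),
         ("water boils at 100 degrees", "water boils when heated enough", 1),
         ("water boils at 100 degrees", "water freezes at 100 degrees", 0)]

X = np.array([features(t, h) for t, h, _ in pairs])
y = np.array([label for _, _, label in pairs])
clf = SVC(kernel="rbf").fit(X, y)              # entails / does not entail
print(clf.predict([features("the capital of france is paris",
                            "paris is the capital")]))
```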
Procedia PDF Downloads 145
7218 Dynamic Mechanical Analysis of Supercooled Water in Nanoporous Confinement and Biological Systems
Authors: Viktor Soprunyuk, Wilfried Schranz, Patrick Huber
Abstract:
In the present work, we show that Dynamic Mechanical Analysis (DMA) with a measurement frequency range of f = 0.2-100 Hz is a rather powerful technique for the study of phase transitions (freezing and melting) and glass transitions of water in geometrical confinement. Inserting water into nanoporous host matrices, e.g. Gelsil (pore sizes 2.6 nm and 5 nm) or Vycor (pore size 10 nm), allows one to study size effects occurring at the nanoscale conveniently in macroscopic bulk samples. One obtains valuable insight concerning confinement-induced changes of the dynamics by measuring the temperature and frequency dependencies of the complex Young's modulus Y* for various pore sizes. Solid-liquid transitions and glass-liquid transitions both show up as a softening of the real part Y' of the complex Young's modulus, yet with completely different frequency dependencies. Analysing the frequency-dependent imaginary part of the Young's modulus in the glass transition regions for different pore sizes, we find a clear-cut 1/d-dependence of the calculated glass transition temperatures which extrapolates to Tg(1/d = 0) = 136 K, in agreement with the traditional value for water. The results indicate that the main role of the pore diameter is to set the relative number of water molecules that are near an interface within a length scale of the order of the dynamic correlation length ξ. Thus we argue that the observed strong pore size dependence of Tg is an interfacial effect rather than a finite size effect. We obtained similar signatures of Y* near glass transitions in different biological objects (fruits, vegetables, and bread). The values of the activation energies for these biological materials in the glass transition region are quite similar to those of supercooled water in nanoporous confinement in this region. The present work was supported by the Austrian Science Fund (FWF, project Nr. P 28672 – N36).
Keywords: biological systems, liquids, glasses, amorphous systems, nanoporous materials, phase transition
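The 1/d analysis reduces to a linear fit whose intercept at 1/d = 0 recovers the bulk value. In the sketch below, the pore diameters are those quoted above, while the confined Tg values are placeholders chosen only to demonstrate the extrapolation toward 136 K.

```python
# Linear extrapolation sketch: Tg(d) = Tg_bulk + k/d, so the intercept at
# 1/d = 0 gives the bulk glass transition temperature. Tg values below are
# illustrative placeholders, not the measured data.
import numpy as np

d = np.array([2.6, 5.0, 10.0])          # pore diameters in nm (Gelsil, Vycor)
tg = np.array([175.0, 156.0, 146.0])    # placeholder confined Tg values (K)

slope, intercept = np.polyfit(1.0 / d, tg, 1)
print(f"Tg(1/d -> 0) = {intercept:.1f} K")   # ~ bulk water Tg of 136 K
```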
Procedia PDF Downloads 238
7217 Physical Properties and Elastic Studies of Fluoroaluminate Glasses Based on Alkali
Authors: C. Benhamideche
Abstract:
Fluoroaluminate glasses have been reported as the earliest heavy metal fluoride glasses. By comparison with fluorozirconate glasses, they offer a set of similar optical features, but also some differences in their elastic and chemical properties. In practice, they have been less developed because their stability against devitrification is lower than that of the most stable fluorozirconates. The purpose of this study was to investigate glass formation in the systems AlF3-YF3-PbF2-MgF2-MF2 (M = Li, Na, K). Synthesis was carried out in room atmosphere using ammonium fluoride processing. After fining, the liquid was cast into a preheated brass mold, then annealed below the glass transition temperature for several hours. The samples were polished for optical measurements. Glass formation has been investigated in a systematic way, using pseudo-ternary systems in order to allow several parameters to vary at the same time. We have chosen the most stable glass compositions for the determination of the physical properties, including characteristic temperatures, density, and elastic properties. Glass stability increases in multicomponent glasses. Bulk samples have been prepared for physical characterization. These glasses are of potential interest for passive optical fibers because they are less sensitive to water attack than ZBLAN glass and are mechanically stronger. It is expected that they could have a larger damage threshold for laser power transmission.
Keywords: fluoride glass, aluminium fluoride, thermal properties, density, elastic properties
Procedia PDF Downloads 241