Search results for: aquatic systems
7251 Prime Mover Sizing for Base-Loaded Combined Heating and Power Systems
Authors: Djalal Boualili
Abstract:
This article considers the problem of sizing prime movers for combined heating and power (CHP) systems operating at full load to satisfy a fraction of a facility's electric load, i.e., a base load. Prime mover sizing is examined using three criteria: operational cost, carbon dioxide emissions (CDE), and primary energy consumption (PEC). The sizing process leads to the consideration of ratios of conversion factors applied to imported electricity to conversion factors applied to fuel consumed. These ratios are labelled R Cost, R CDE, and R PEC depending on whether the conversion factors are associated with operational cost, CDE, or PEC, respectively. Analytical results show that in order to achieve savings in operational cost, CDE, or PEC, the ratios must be larger than a unique constant R Min that depends only on the efficiencies of the CHP components. Savings in operational cost, CDE, or PEC due to CHP operation are explicitly formulated using simple equations. This facilitates the process of comparing the tradeoffs of optimizing the savings of one criterion over the other two, a task that has traditionally been accomplished through computer simulations. A hospital building located in Chlef, Algeria, is used as an example to apply the methodology presented in this article.
Keywords: sizing, heating and power, ratios, energy consumption, carbon dioxide emissions
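The threshold result can be illustrated with a back-of-the-envelope operational cost comparison. The sketch below is not the paper's formulation; it assumes a conventional baseline of grid electricity plus a gas boiler, with illustrative efficiency values:

```python
def chp_cost_savings(p_kwh, c_elec, c_fuel, eta_e=0.35, eta_th=0.45, eta_b=0.85):
    """Operational cost saving of producing p_kwh of electricity with a
    base-loaded CHP unit versus importing it from the grid and raising the
    recovered heat in a boiler (illustrative efficiencies, not the paper's)."""
    fuel_in = p_kwh / eta_e              # fuel burned by the CHP prime mover
    heat_out = fuel_in * eta_th          # heat recovered from the prime mover
    baseline = p_kwh * c_elec + (heat_out / eta_b) * c_fuel
    return baseline - fuel_in * c_fuel

def r_min(eta_e=0.35, eta_th=0.45, eta_b=0.85):
    """Price ratio c_elec/c_fuel above which CHP operation saves money; it
    depends only on component efficiencies, mirroring the paper's R Min."""
    return (1.0 - eta_th / eta_b) / eta_e
```

With these example efficiencies the break-even ratio is about 1.35; savings are positive exactly when c_elec/c_fuel exceeds r_min(), which is the structure of the analytical result described in the abstract.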
Procedia PDF Downloads 231
7250 Design and Implementation of a Platform for Adaptive Online Learning Based on Fuzzy Logic
Authors: Budoor Al Abid
Abstract:
Educational systems are increasingly provided as open online services offering guidance and support for individual learners. To make a learning system adaptive, a proper evaluation must be made. This paper builds the evaluation model Fuzzy C-Means Adaptive System (FCMAS), based on data mining techniques, to assess the difficulty of questions. The following steps are implemented. First, a dataset from an international online learning system (slepemapy.cz) is used; the dataset contains over 1,300,000 records with 9 features covering students, questions, and answers, together with feedback evaluation. Next, normalization is applied as a preprocessing step. Then, the Fuzzy C-Means (FCM) clustering algorithm is used to adapt the difficulty of the questions, labelling every question as belonging to one of three clusters (easy, intermediate, difficult) according to its highest membership weight. A Random Forest (RF) classifier is then constructed on the clustered dataset, using 70% of the data for training and 30% for testing; the resulting model achieves a 99.9% accuracy rate. This approach improves adaptive e-learning systems because it depends on student behavior and gives more accurate results in the evaluation process than evaluation systems that depend on feedback only.
Keywords: machine learning, adaptive, fuzzy logic, data mining
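The clustering-then-classification pipeline can be sketched as follows. The data, feature set, and hyperparameters here are synthetic stand-ins, not those of the slepemapy.cz dataset; the fuzzy c-means implementation is a minimal textbook version:

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

def fuzzy_c_means(X, c=3, m=2.0, n_iter=200, tol=1e-6, seed=0):
    """Minimal fuzzy c-means: returns centers and the (c x n) membership matrix."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, len(X)))
    U /= U.sum(axis=0)                          # memberships sum to 1 per sample
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
        d = np.linalg.norm(X[None] - centers[:, None], axis=2) + 1e-12
        U_new = d ** (-2.0 / (m - 1.0))
        U_new /= U_new.sum(axis=0)
        converged = np.abs(U_new - U).max() < tol
        U = U_new
        if converged:
            break
    return centers, U

# Synthetic "questions": label by highest membership, then train a classifier
X, _ = make_blobs(n_samples=600, centers=3, cluster_std=1.0, random_state=0)
_, U = fuzzy_c_means(X, c=3)
labels = U.argmax(axis=0)                       # easy / intermediate / difficult
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
```

On well-separated synthetic clusters the RF reproduces the FCM labels almost perfectly, which is the same structure of result the abstract reports on the real data.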
Procedia PDF Downloads 196
7249 On the Implementation of the Pulse Coupled Neural Network (PCNN) in the Vision of Cognitive Systems
Authors: Hala Zaghloul, Taymoor Nazmy
Abstract:
One of the great challenges of the 21st century is to build a robot that can perceive and act within its environment and communicate with people, while also exhibiting the cognitive capabilities that lead to human-like performance. The Pulse Coupled Neural Network (PCNN) is a relatively new ANN model derived from a mammalian neural model, with great potential in the areas of image processing, target recognition, feature extraction, speech recognition, combinatorial optimization, and compressed encoding. The PCNN has a unique feature among other types of neural networks that makes it a candidate to be an important approach for perception in cognitive systems. This work demonstrates and emphasizes the potential of the PCNN to perform different tasks related to image processing. The main obstacle preventing the direct implementation of this technique is the need to find a way to control the PCNN parameters toward performing a specific task. This paper evaluates the performance of the standard PCNN model for processing images with different properties, selects the important parameters that give significant results, and discusses approaches for adapting the PCNN parameters to perform a specific task.
Keywords: cognitive system, image processing, segmentation, PCNN kernels
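A minimal version of the standard PCNN iteration the paper evaluates can be sketched in NumPy. The parameter values below are illustrative assumptions; tuning them to a given task is exactly the open problem the abstract describes:

```python
import numpy as np

def pcnn_fire_times(img, steps=10, beta=0.2, a_f=0.1, a_l=1.0, a_e=1.0,
                    v_f=0.5, v_l=0.5, v_e=20.0):
    """Standard PCNN (feeding F, linking L, dynamic threshold E); returns the
    iteration at which each pixel first pulses (-1 if it never fires)."""
    S = img.astype(float) / (img.max() + 1e-12)     # normalized stimulus
    F = np.zeros_like(S); L = np.zeros_like(S)
    E = np.ones_like(S);  Y = np.zeros_like(S)
    first = np.full(S.shape, -1, dtype=int)

    def neighbor_sum(Y):                             # 8-connected coupling
        P = np.pad(Y, 1)
        return (P[:-2, :-2] + P[:-2, 1:-1] + P[:-2, 2:] +
                P[1:-1, :-2] +               P[1:-1, 2:] +
                P[2:, :-2]  + P[2:, 1:-1]  + P[2:, 2:])

    for n in range(steps):
        F = np.exp(-a_f) * F + S + v_f * neighbor_sum(Y)
        L = np.exp(-a_l) * L + v_l * neighbor_sum(Y)
        U = F * (1.0 + beta * L)                     # internal activity
        Y = (U > E).astype(float)                    # pulse output
        E = np.exp(-a_e) * E + v_e * Y               # raise threshold on firing
        first[(first < 0) & (Y > 0)] = n
    return first

# Bright objects pulse before the dark background, giving a crude segmentation
img = np.full((20, 20), 0.1)
img[5:12, 5:12] = 1.0
ft = pcnn_fire_times(img)
```

Grouping pixels by first-firing time is one of the simple PCNN-based segmentation schemes; the quality of the grouping is highly sensitive to beta and the decay constants, which is why parameter adaptation matters.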
Procedia PDF Downloads 280
7248 Threat Modeling Methodology for Supporting Industrial Control Systems Device Manufacturers and System Integrators
Authors: Raluca Ana Maria Viziteu, Anna Prudnikova
Abstract:
Industrial control systems (ICS) have received much attention in recent years due to the convergence of information technology (IT) and operational technology (OT), which has increased the interdependence of the safety and security issues to be considered. These issues require ICS-tailored solutions. This led to the need to create a methodology for supporting ICS device manufacturers and system integrators in carrying out threat modeling of embedded ICS devices in a way that guarantees the quality of the identified threats and minimizes subjectivity in the threat identification process. To research the possibility of creating such a methodology, a set of existing standards, regulations, papers, and publications related to threat modeling in the ICS sector and other sectors was reviewed to identify the various methodologies and methods used in threat modeling. Furthermore, the most popular ones were tested in an exploratory phase on a specific PLC device. The outcome of this exploratory phase was used as a basis for defining specific characteristics of ICS embedded devices and their deployment scenarios, identifying the factors that introduce subjectivity in the threat modeling process of such devices, and defining metrics for evaluating the minimum quality requirements of identified threats associated with the deployment of the devices in existing infrastructures. The threat modeling methodology was then created based on the results of the previous steps. The usability of the methodology was evaluated through a set of standardized threat modeling requirements and a standardized comparison method for threat modeling methodologies. The outcomes of these verification methods confirm that the methodology is effective. The full paper includes the outcome of research on the different threat modeling methodologies that can be used in OT, their comparison, and the results of implementing each of them in practice on a PLC device.
This research is further used to build a threat modeling methodology tailored to OT environments; a detailed description is included. Moreover, the paper includes the results of evaluating the created methodology against a set of parameters specifically created to rate threat modeling methodologies.
Keywords: device manufacturers, embedded devices, industrial control systems, threat modeling
Procedia PDF Downloads 80
7247 Climate Smart Agriculture: Nano Technology in Solar Drying
Authors: Figen Kadirgan, M. A. Neset Kadirgan, Gokcen A. Ciftcioglu
Abstract:
Addressing food security and climate change challenges has to be done in an integrated manner. To increase food production and to reduce emissions intensity, thus contributing to climate change mitigation, food systems have to be more efficient in the use of resources. To ensure food security and adapt to climate change, they have to become more resilient. The changes required in agricultural and food systems will require the creation of supporting institutions and enterprises to provide services and inputs to smallholders, fishermen, and pastoralists, and to transform and commercialize their production more efficiently. There is thus a continuously growing need to switch to a green economy, which simultaneously reduces carbon emissions and pollution, enhances energy and resource-use efficiency, and prevents the loss of biodiversity and ecosystem services. Smart agriculture takes into account the four dimensions of food security: availability, accessibility, utilization, and stability. It is well known that the increase in world population will strengthen the population-food imbalance. The emphasis on reducing food losses puts the focus on production and on farmers, increasing productivity and income and ensuring food security, while also allowing small farmers to enhance their income and stabilize their budgets. The use of solar drying for agricultural, marine, or meat products is very important for preservation. Traditional sun drying is a relatively slow process in which poor food quality results from infestation by insects, enzymatic reactions, microorganism growth, and mycotoxin development. In contrast, solar drying offers a sound solution to all these negative effects of natural drying and of artificial mechanical drying. The technical directions in the development of solar drying systems for agricultural products are compact collector designs with high efficiency and low cost. In this study, using the solar selective surface produced by Selektif Teknoloji Co. Inc. Ltd., solar dryers with high efficiency will be developed and a feasibility study will be realized.
Keywords: energy, renewable energy, solar collector, solar drying
Procedia PDF Downloads 225
7246 The Use of PD and Tanδ Characteristics as Diagnostic Technique for the Insulation Integrity of XLPE Insulated Cable Joints
Authors: Mazen Al-Bulaihed, Nissar Wani, Abdulrahman Al-Arainy, Yasin Khan
Abstract:
Partial discharge (PD) measurements are widely used for diagnostic purposes in electrical equipment used in power systems. The main purpose of these measurements is to prevent large power failures, as cables are prone to aging, which usually results in embrittlement, cracking, and eventual failure of the insulating and sheathing materials, exposing the conductor and risking a potential short circuit, a likely cause of electrical fires. Many distribution networks rely heavily on medium voltage (MV) power cables. The presence of joints in these networks is a vital part of serving consumer demand for electricity continuously. Such measurements become even more important when the extent of dependence increases. Moreover, it is known that partial discharges in joints and terminations are difficult to track and are the most crucial points of failure in large power systems. This paper discusses the diagnostic testing of four samples of XLPE insulated cable joints, each containing a different type of defect. Experiments were carried out by measuring PD and tanδ under high voltage applied at very low frequency. The results show the importance of combining PD and tanδ for effective cable assessment.
Keywords: partial discharge, tan delta, very low frequency, XLPE cable
Procedia PDF Downloads 163
7245 FLIME - Fast Low Light Image Enhancement for Real-Time Video
Authors: Vinay P., Srinivas K. S.
Abstract:
Low light image enhancement is of utmost importance in computer vision based tasks. Applications include vision systems for autonomous driving, night vision devices for defence systems, and low light object detection tasks. Many of the existing deep learning methods are resource intensive during the inference step and take considerable time for processing. An algorithm should take considerably less than 41 milliseconds per frame in order to process a real-time video feed at 24 frames per second, and even less for video at 30 or 60 frames per second. The paper presents a fast and efficient solution with two main advantages: it has the potential to be used on a real-time video feed, and it can be used in low compute environments because of its lightweight nature. The proposed solution is a pipeline of three steps: the first is the use of a simple function to map input RGB values to output RGB values, the second is to balance the colors, and the final step is to adjust the contrast of the image. A custom dataset is carefully prepared using images taken in low and bright lighting conditions. The preparation of the dataset, the proposed model, and the processing time are discussed in detail, and the quality of the images enhanced using different methods is shown.
Keywords: low light image enhancement, real-time video, computer vision, machine learning
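The three-step pipeline can be sketched as below. The paper's exact mapping function is not given, so this sketch substitutes a gamma curve for step one, a gray-world correction for the color balance, and a linear stretch for the contrast adjustment; all three are per-pixel array operations cheap enough for a real-time budget:

```python
import numpy as np

def enhance_low_light(img_rgb, gamma=0.4):
    """Illustrative 3-step enhancement: (1) map RGB values with a gamma curve,
    (2) gray-world color balance, (3) linear contrast stretch."""
    x = img_rgb.astype(np.float64) / 255.0
    x = x ** gamma                                   # brighten dark regions
    ch_means = x.reshape(-1, 3).mean(axis=0)
    x = x * (ch_means.mean() / (ch_means + 1e-6))    # equalize channel means
    lo, hi = x.min(), x.max()
    x = (x - lo) / (hi - lo + 1e-6)                  # stretch to full range
    return (np.clip(x, 0.0, 1.0) * 255.0).astype(np.uint8)
```

Because each step is a handful of vectorized passes over the frame, this kind of pipeline comfortably fits the sub-41 ms budget on modest hardware, which is the design goal the abstract states.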
Procedia PDF Downloads 206
7244 The Computational Psycholinguistic Situational-Fuzzy Self-Controlled Brain and Mind System Under Uncertainty
Authors: Ben Khayut, Lina Fabri, Maya Avikhana
Abstract:
The models of modern Artificial Narrow Intelligence (ANI) cannot: a) independently and continuously function without human intelligence, which is used for retraining and reprogramming the ANI models, and b) think, understand, be conscious, cognize, or infer in a state of uncertainty and under changes in situations and environmental objects. To eliminate these shortcomings and build a new generation of Artificial Intelligence systems, the paper proposes a conception, model, and method of a Computational Psycholinguistic Cognitive Situational-Fuzzy Self-Controlled Brain and Mind System Under Uncertainty (CPCSFSCBMSUU). The system uses a neural network as its computational memory, operates under uncertainty, and activates its functions by perception, identification of real objects, fuzzy situational control, and the formation of images of these objects, modeling the psychological, linguistic, cognitive, and neural values of their properties and features; the meanings of these values are identified, interpreted, generated, and formed taking into account the identified subject area, using the data, information, knowledge, and images accumulated in the Memory. The functioning of the CPCSFSCBMSUU is carried out by its subsystems for: fuzzy situational control of all processes; computational perception; identification of reactions and actions; psycholinguistic cognitive fuzzy logical inference; decision making; reasoning; systems thinking; planning; awareness; consciousness; cognition; intuition; wisdom; analysis and processing of psycholinguistic, subject, visual, signal, sound, and other objects; accumulation and use of data, information, and knowledge in the Memory; and communication and interaction with other computing systems, robots, and humans in order to solve joint tasks. To investigate the functional processes of the proposed system, the principles of situational control, fuzzy logic, psycholinguistics, informatics, and the modern possibilities of data science were applied.
The proposed self-controlled brain and mind system is intended for use as a plug-in in multilingual subject applications.
Keywords: computational brain, mind, psycholinguistic, system, under uncertainty
Procedia PDF Downloads 177
7243 Integration of PV Systems in Residential Buildings: A Solution for Supporting Electrical Grid in Kuwait
Authors: Nabil A. Ahmed, Nasser A. N. Mhaisen
Abstract:
The paper presents a solution to enhance power quality and reduce the peak load demand in the Kuwait electric grid as a response to the shortage of electricity production. A technical, environmental, and economic feasibility study of utilizing integrated grid-connected photovoltaic (PV) systems in residential buildings to supply 7.1% of the electrical power consumption in Kuwait is carried out using RETScreen software. A 10 kWp on-grid PV power generation system spread over the rooftop of a residential building is adopted and investigated, and the complete system performance is simulated using PSIM software. Taking into account the international prices of electricity and natural gas, the proposed solution is investigated and tested for four different types of installation, in terms of power generation and costs: horizontal installation, a 25° tilt angle, single-axis tracking, and dual-axis tracking. Results show that the fixed-mount system at a 25° tilt angle is the most efficient type. The payback period, as a tool of benefit analysis of the proposed system, is calculated and found to be 2.55 years.
Keywords: photovoltaics, residential buildings, electrical grid, production capacity, on-grid, power generation
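The payback figure is the standard simple (undiscounted) payback: capital cost divided by net annual savings. A sketch with clearly hypothetical numbers, not the paper's RETScreen inputs:

```python
def simple_payback_years(capital_cost, annual_kwh, tariff_per_kwh, annual_om=0.0):
    """Simple payback period: investment recovered by net annual savings.
    All monetary inputs must share one currency."""
    net_annual_savings = annual_kwh * tariff_per_kwh - annual_om
    if net_annual_savings <= 0:
        raise ValueError("system never pays back")
    return capital_cost / net_annual_savings

# Hypothetical 10 kWp rooftop system (all figures are placeholders)
years = simple_payback_years(capital_cost=10_000,   # installed cost
                             annual_kwh=16_000,     # annual yield of the array
                             tariff_per_kwh=0.25)   # value of displaced energy
```

Discounted variants (NPV-based payback) shift the figure upward; the simple form is the usual headline number in feasibility studies of this kind.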
Procedia PDF Downloads 494
7242 Combined Safety and Cybersecurity Risk Assessment for Intelligent Distributed Grids
Authors: Anders Thorsén, Behrooz Sangchoolie, Peter Folkesson, Ted Strandberg
Abstract:
As more parts of the power grid become connected to the internet, the risk of cyberattacks increases. To identify cybersecurity threats and subsequently reduce vulnerabilities, the common practice is to carry out a cybersecurity risk assessment. For safety-classified systems and products, there is also a need for safety risk assessments in addition to the cybersecurity risk assessment in order to identify and reduce safety risks. These two risk assessments are usually done separately, but since cybersecurity and functional safety are often related, a more comprehensive method covering both aspects is needed. Some work addressing this has been done for specific domains like the automotive domain, but more general methods suitable for, e.g., intelligent distributed grids are still missing. One such method from the automotive domain is the Security-Aware Hazard Analysis and Risk Assessment (SAHARA) method, which combines safety and cybersecurity risk assessments. This paper presents an approach where the SAHARA method has been modified in order to be more suitable for larger distributed systems. The adapted SAHARA method has a more general risk assessment approach than the original. The proposed method has been successfully applied to two use cases of an intelligent distributed grid.
Keywords: intelligent distribution grids, threat analysis, risk assessment, safety, cybersecurity
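To give a flavor of a combined assessment, the sketch below scores a threat SAHARA-style from the attacker's required resources (R), know-how (K), and the threat criticality (T), and pairs it with a conventional severity-times-likelihood safety risk. The scoring functions are illustrative linear stand-ins; the actual SAHARA method and the paper's adaptation use published lookup tables, not these formulas:

```python
def security_level(resources, know_how, criticality):
    """Illustrative SecL, 0 (no concern) to 4 (highest): cheap, low-skill
    attacks on critical assets score highest. R, K in 0..2, T in 0..3."""
    effort = resources + know_how                  # 0 = trivial .. 4 = hard
    return max(0, criticality + 1 - effort)

def safety_risk(severity, likelihood):
    """Conventional risk-matrix product; severity and likelihood in 1..4."""
    return severity * likelihood

def combined_priority(sec_l, risk):
    """Hypothetical combination rule: review whichever dimension is worse."""
    return max(sec_l, (risk + 3) // 4)             # map risk 1..16 onto 1..4
```

The point of the combined method is that a hazard cheap to trigger remotely can outrank one that is severe but practically unreachable, something neither assessment captures alone.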
Procedia PDF Downloads 153
7241 Transforming Data Science Curriculum Through Design Thinking
Authors: Samar Swaid
Abstract:
Today, corporations are moving toward the adoption of design-thinking techniques to develop products and services, putting the consumer at the heart of the development process. One of the leading companies in design thinking, IDEO (Innovation, Design, Engineering Organization), defines design thinking as an approach to problem-solving that relies on a set of multi-layered skills, processes, and mindsets that help people generate novel solutions to problems. Design thinking may result in new ideas, narratives, objects, or systems. It is about redesigning systems, organizations, infrastructures, processes, and solutions in an innovative fashion based on users' feedback. Tim Brown, president and CEO of IDEO, sees design thinking as a human-centered approach that draws from the designer's toolkit to integrate people's needs, innovative technologies, and business requirements. The application of design thinking has been witnessed to be the road to developing innovative applications, interactive systems, scientific software, and healthcare applications, and even to rethinking business operations, as in the case of Airbnb. Recently, there has been a movement to apply design thinking to machine learning and artificial intelligence to ensure creating the "wow" effect on consumers. The Association for Computing Machinery task force on data science programs states that "Data scientists should be able to implement and understand algorithms for data collection and analysis. They should understand the time and space considerations of algorithms. They should follow good design principles developing software, understanding the importance of those principles for testability and maintainability." However, this definition hides the user behind the machine who works on data preparation, algorithm selection, and model interpretation.
Thus, the data science program includes design thinking to ensure meeting user demands, generating more usable machine learning tools, and developing ways of framing computational thinking. Here, we describe the fundamentals of design thinking and teaching modules for data science programs.
Keywords: data science, design thinking, AI, curriculum, transformation
Procedia PDF Downloads 81
7240 Spatially Distributed Rainfall Prediction Based on Automated Kriging for Landslide Early Warning Systems
Authors: Ekrem Canli, Thomas Glade
Abstract:
The precise prediction of rainfall in space and time is a key element of most landslide early warning systems. Unfortunately, the spatial variability of rainfall is often disregarded in early warning applications; a common simplification is to use uniformly distributed rainfall to characterize areal rainfall intensity. With spatially differentiated rainfall information, real-time comparison with rainfall thresholds or implementation in process-based approaches might form the basis for improved landslide warnings. This study suggests an automated workflow, from the hourly, web-based collection of rain gauge data to the generation of spatially differentiated rainfall predictions based on kriging. Because the application of kriging is usually a labor-intensive task, a simplified and consequently automated variogram modeling procedure was applied to up-to-date rainfall data. The entire workflow was carried out purely with open source technology. Validation results, albeit promising, pointed out the challenges involved in purely distance-based, automated geostatistical interpolation techniques for ever-changing environmental phenomena over short temporal and spatial extents.
Keywords: kriging, landslide early warning system, spatial rainfall prediction, variogram modelling, web scraping
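The core of the automated variogram step is computing an empirical semivariogram from the gauge data, to which a model (e.g. spherical) is then fitted and passed to the kriging system. A minimal NumPy sketch of the empirical part; the equal-width binning here is an assumption, and the paper's automated procedure is more elaborate:

```python
import numpy as np

def empirical_semivariogram(coords, values, n_lags=10):
    """gamma(h) = half the mean squared difference of all station pairs whose
    separation distance falls into lag bin h."""
    dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    sqdif = 0.5 * (values[:, None] - values[None, :]) ** 2
    i, j = np.triu_indices(len(values), k=1)         # count each pair once
    d, g = dists[i, j], sqdif[i, j]
    edges = np.linspace(0.0, d.max(), n_lags + 1)
    lags, gammas = [], []
    for k in range(n_lags):
        hi_inclusive = (d <= edges[k + 1]) if k == n_lags - 1 else (d < edges[k + 1])
        m = (d >= edges[k]) & hi_inclusive
        if m.any():                                  # skip empty lag bins
            lags.append(d[m].mean())
            gammas.append(g[m].mean())
    return np.array(lags), np.array(gammas)
```

For a spatially correlated field, gamma rises with lag distance until it levels off at the sill; the fitted model's nugget, sill, and range are exactly the parameters an automated procedure must choose without an analyst in the loop.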
Procedia PDF Downloads 280
7239 Developing a Green Information Technology Model in Australian Higher-Educational Institutions
Authors: Mahnaz Jafari, Parisa Izadpanahi, Francesco Mancini, Muhammad Qureshi
Abstract:
The advancement of Information Technology (IT) has been an intrinsic element of the developments of the 21st century, bringing benefits such as increased economic productivity. However, its widespread application has also been associated with inadvertent negative impacts on society and the environment, necessitating selective interventions to mitigate these impacts. This study responded to this need by developing a Green IT Rating Tool (GIRT) for higher education institutions (HEIs) in Australia to evaluate the sustainability of IT-related practices from environmental, social, and economic perspectives; each dimension must be considered equally to achieve sustainability. The development of the GIRT was informed by the views of interviewed IT professionals, whose opinions formed the basis of a framework listing Green IT initiatives in order of their perceived importance. This framework formed the base of the GIRT, which identifies Green IT initiatives (such as videoconferencing as a substitute for long-distance travel) and the associated weighting of each practice. The proposed sustainable Green IT model could be integrated into existing IT systems, leading to significant reductions in carbon emissions and e-waste and improvements in energy efficiency. The development of the GIRT and the findings of this study have the potential to inspire other organizations to adopt sustainable IT practices, positively impact the environment, and be used as a reference by IT professionals and decision-makers to evaluate IT-related sustainability practices. The GIRT could also serve as a benchmark for HEIs to compare their performance with other institutions and to track their progress over time. Additionally, the study's results suggest that virtual and cloud-based technologies could reduce e-waste and energy consumption in the higher education sector.
Overall, this study highlights the importance of incorporating Green IT practices into the IT systems of HEIs to contribute to a more sustainable future.
Keywords: green information technology, international higher-educational institution, sustainable solutions, environmentally friendly IT systems
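A rating tool of this shape reduces to a weighted checklist. A minimal sketch, in which the initiative names, the 0-5 rating scale, and the weights are invented for illustration rather than taken from the study's framework:

```python
def girt_score(ratings, weights):
    """Weighted Green IT score on a 0-100 scale. ratings: initiative -> 0..5
    self-assessment; weights: initiative -> relative importance."""
    if set(ratings) != set(weights):
        raise ValueError("every rated initiative needs a weight")
    total_weight = sum(weights.values())
    achieved = sum(weights[k] * ratings[k] for k in ratings)
    return 100.0 * achieved / (5.0 * total_weight)

# Placeholder initiatives and weights, not the study's actual framework
score = girt_score(
    ratings={"videoconferencing": 5, "e-waste recycling": 2, "cloud migration": 3},
    weights={"videoconferencing": 3, "e-waste recycling": 2, "cloud migration": 1},
)
```

Ranking initiatives by professional consensus and encoding the ranking as weights is what lets the tool double as a benchmark: two institutions scored with the same weights are directly comparable.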
Procedia PDF Downloads 76
7238 The Effect of Implant Design on the Height of Inter-Implant Bone Crest: A 10-Year Retrospective Study of the Astra Tech Implant and Branemark Implant
Authors: Daeung Jung
Abstract:
Background: For patients with missing teeth, multiple-implant restoration is widely used and often inevitable. To increase its survival rate, it is important to understand the influence of different implant designs on inter-implant crestal bone resorption. Several implant systems are designed to minimize loss of crestal bone; the Astra Tech and Brånemark implants are two of them. Aim/Hypothesis: The aim of this 10-year study was to compare the height of the inter-implant bone crest in two implant systems: the Astra Tech and the Brånemark implant system. Materials and Methods: In this retrospective study, 40 consecutively treated patients were included: 23 patients with 30 sites for the Astra Tech system and 17 patients with 20 sites for the Brånemark system. The implant restorations comprised splinted crowns in partially edentulous patients. Radiographs were taken immediately after the first surgery, at impression making, at prosthesis placement, and annually after loading. The lateral distance from implant to bone crest and the inter-implant distance were gauged, and crestal bone height was measured from the implant shoulder to the first bone contact. Calibrations were performed using ImageJ, with the known thread pitch distance for vertical measurements and the known diameter of the abutment or fixture for horizontal measurements. Results: After 10 years, patients treated with the Astra Tech implant system demonstrated less inter-implant crestal bone resorption when implants had a distance of 3 mm or less between them. For implants with more than 3 mm between them, however, there appeared to be no statistically significant difference in crestal bone loss between the two systems. Conclusion and clinical implications: For partially edentulous patients planning to have more than two implants, the inter-implant distance is one of the most important factors to be considered.
If sufficient inter-implant distance cannot be ensured, implants with less microgap at the fixture-abutment junction, a less traumatic second-stage surgical approach, and an adequate surface topography would be appropriate options to minimize inter-implant crestal bone resorption.
Keywords: implant design, crestal bone loss, inter-implant distance, 10-year retrospective study
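The calibration step converts pixel measurements to millimetres using a feature of known physical size in the same radiograph. A sketch of that conversion; the 0.6 mm pitch and pixel counts below are placeholders, not values from the study:

```python
def to_mm(measured_px, reference_px, reference_mm):
    """Scale a pixel measurement by a known-size reference in the same image,
    e.g. the implant thread pitch for vertical distances or the fixture
    diameter for horizontal distances."""
    if reference_px <= 0:
        raise ValueError("reference length must be positive")
    return measured_px * (reference_mm / reference_px)

# A 0.6 mm thread pitch spanning 30 px calibrates a 90 px crestal measurement
bone_loss_mm = to_mm(measured_px=90, reference_px=30, reference_mm=0.6)
```

Calibrating against a feature in the same image cancels the unknown radiographic magnification, which varies between exposures and would otherwise bias year-to-year comparisons.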
Procedia PDF Downloads 166
7237 Enhancing Robustness in Federated Learning through Decentralized Oracle Consensus and Adaptive Evaluation
Authors: Peiming Li
Abstract:
This paper presents an innovative blockchain-based approach to enhance the reliability and efficiency of federated learning systems. By integrating a decentralized oracle consensus mechanism into the federated learning framework, we address key challenges of data and model integrity. Our approach utilizes a network of redundant oracles, functioning as independent validators within an epoch-based training system in the federated learning model. In federated learning, data is decentralized, residing on the various participants' devices. This scenario often leads to concerns about data integrity and model quality. Our solution employs blockchain technology to establish a transparent and tamper-proof environment, ensuring secure data sharing and aggregation. The decentralized oracles, a concept borrowed from blockchain systems, act as unbiased validators. They assess the contributions of each participant using a Hidden Markov Model (HMM), which is crucial for evaluating the consistency of participant inputs and safeguarding against model poisoning and malicious activities. A distinct feature of our methodology is its epoch-based training. An epoch here refers to a specific training phase during which data is updated and assessed for quality and relevance. The redundant oracles work in concert to validate data updates during these epochs, enhancing the system's resilience to security threats and data corruption. The effectiveness of this system was tested using the MNIST dataset, a standard benchmark in machine learning. Results demonstrate that our blockchain-oriented federated learning approach significantly boosts system resilience, addressing the common challenges of federated environments. This paper aims to make these advanced concepts accessible, even to those with a limited background in blockchain or federated learning.
We provide a foundational understanding of how blockchain technology can revolutionize data integrity in decentralized systems and explain the role of oracles in maintaining model accuracy and reliability.
Keywords: federated learning system, blockchain, decentralized oracles, hidden Markov model
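A toy version of the redundant-oracle idea: each oracle independently scores participant updates against a robust reference and the oracles vote. Here a cosine-similarity-to-median check stands in for the paper's HMM-based consistency evaluation, which is not reproduced:

```python
import numpy as np

def oracle_consensus(updates, n_oracles=3, sim_threshold=0.0, seed=0):
    """updates: (participants, parameters) model deltas for one epoch.
    Returns a boolean mask of participants accepted by oracle majority vote."""
    rng = np.random.default_rng(seed)
    median_update = np.median(updates, axis=0)       # robust reference
    votes = []
    for _ in range(n_oracles):
        # redundancy: each oracle audits its own random half of the parameters
        idx = rng.choice(updates.shape[1], size=updates.shape[1] // 2,
                         replace=False)
        num = updates[:, idx] @ median_update[idx]
        den = (np.linalg.norm(updates[:, idx], axis=1)
               * np.linalg.norm(median_update[idx]) + 1e-12)
        votes.append(num / den > sim_threshold)
    return np.mean(votes, axis=0) > 0.5

# Four honest participants and one sign-flipping poisoner
honest = np.ones((4, 10))
poisoned = -np.ones((1, 10))
accepted = oracle_consensus(np.vstack([honest, poisoned]))
```

Recording each oracle's vote on-chain is what makes the accept/reject decision auditable after the fact; the statistical test each oracle runs is orthogonal to that and can be swapped out.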
Procedia PDF Downloads 63
7236 The Integration of Geographical Information Systems and Capacitated Vehicle Routing Problem with Simulated Demand for Humanitarian Logistics in Tsunami-Prone Area: A Case Study of Phuket, Thailand
Authors: Kiatkulchai Jitt-Aer, Graham Wall, Dylan Jones
Abstract:
As a result of the Indian Ocean tsunami in 2004, logistics applied to disaster relief operations has received great attention in the humanitarian sector. As learned from that disaster, preparing for and responding to the delivery of essential items from distribution centres to affected locations is important for relief operations, as the nature of disasters is uncertain, especially in casualty figures, which are normally proportional to the quantity of supplies needed. This study therefore proposes a spatial decision support system (SDSS) for humanitarian logistics by integrating Geographical Information Systems (GIS) and the capacitated vehicle routing problem (CVRP). The GIS is utilised for acquiring demands simulated from the tsunami flooding model of the affected area in the first stage, and for visualising the simulation solutions in the last stage. The CVRP in this study encompasses designing relief routes for a set of homogeneous vehicles from a relief centre to a set of geographically distributed evacuation points, whose demands are estimated using both simulation and randomisation techniques. The CVRP is modeled as a multi-objective optimization problem in which both the total travelling distance and the total transport resources used are minimized, while the demand-cost efficiency of each route is maximized in order to determine route priority. As the model is an NP-hard combinatorial optimization problem, the Clarke and Wright savings heuristic is proposed to solve the problem for near-optimal solutions. Real-case instances in the coastal area of Phuket, Thailand are studied to demonstrate the SDSS, which allows a decision maker to visually analyse the simulation scenarios through different decision factors.
Keywords: demand simulation, humanitarian logistics, geographical information systems, relief operations, capacitated vehicle routing problem
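For the routing stage, the Clarke and Wright savings heuristic merges single-customer routes in descending order of the saving s(i, j) = d(0, i) + d(0, j) - d(i, j), subject to vehicle capacity. A single-objective sketch of the classic heuristic; the paper's multi-objective version with demand-cost route priority is not reproduced here:

```python
import math

def clarke_wright(dist, demands, capacity):
    """Basic Clarke-Wright savings heuristic. dist is an (n+1)x(n+1) matrix
    with node 0 as the depot; demands[i-1] is customer i's demand."""
    n = len(demands)
    routes = {i: [i] for i in range(1, n + 1)}       # one route per customer
    route_of = {i: i for i in range(1, n + 1)}
    load = {i: demands[i - 1] for i in range(1, n + 1)}
    savings = sorted(
        ((dist[0][i] + dist[0][j] - dist[i][j], i, j)
         for i in range(1, n + 1) for j in range(i + 1, n + 1)),
        reverse=True)
    for s, i, j in savings:
        ri, rj = route_of[i], route_of[j]
        if s <= 0 or ri == rj or load[ri] + load[rj] > capacity:
            continue
        a, b = routes[ri], routes[rj]
        # merge only at route endpoints so interior customers keep their order
        if a[-1] == i and b[0] == j:    merged = a + b
        elif b[-1] == j and a[0] == i:  merged = b + a
        elif a[0] == i and b[0] == j:   merged = a[::-1] + b
        elif a[-1] == i and b[-1] == j: merged = a + b[::-1]
        else:
            continue
        routes[ri] = merged
        load[ri] += load[rj]
        for node in b:
            route_of[node] = ri
        del routes[rj], load[rj]
    return list(routes.values())

# Toy instance: depot at (0,0), two customer clusters, vehicle capacity 2
pts = [(0, 0), (0, 1), (0, 2), (5, 0), (5, 1)]
dist = [[math.dist(p, q) for q in pts] for p in pts]
routes = clarke_wright(dist, demands=[1, 1, 1, 1], capacity=2)
```

The heuristic runs in O(n² log n) for the savings sort, which is what makes it practical inside an interactive SDSS where routes are recomputed per simulated demand scenario.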
Procedia PDF Downloads 248
7235 Micro-Droplet Formation in a Microchannel under the Effect of an Electric Field: Experiment
Authors: Sercan Altundemir, Pinar Eribol, A. Kerem Uguz
Abstract:
Microfluidics systems allow many-large scale laboratory applications to be miniaturized on a single device in order to reduce cost and advance fluid control. Moreover, such systems enable to generate and control droplets which have a significant role on improved analysis for many chemical and biological applications. For example, they can be employed as the model for cells in microfluidic systems. In this work, the interfacial instability of two immiscible Newtonian liquids flowing in a microchannel is investigated. When two immiscible liquids are in laminar regime, a flat interface is formed between them. If a direct current electric field is applied, the interface may deform, i.e. may become unstable and it may be ruptured and form micro-droplets. First, the effect of thickness ratio, total flow rate, viscosity ratio of the silicone oil and ethylene glycol liquid couple on the critical voltage at which the interface starts to destabilize is investigated. Then the droplet sizes are measured under the effect of these parameters at various voltages. Moreover, the effect of total flow rate on the time elapsed for the interface to be ruptured to form droplets by hitting the wall of the channel is analyzed. It is observed that an increase in the viscosity or the thickness ratio of the silicone oil to the ethylene glycol has a stabilizing effect, i.e. a higher voltage is needed while the total flow rate has no effect on it. However, it is observed that an increase in the total flow rate results in shortening of the elapsed time for the interface to hit the wall. Moreover, the droplet size decreases down to 0.1 μL with an increase in the applied voltage, the viscosity ratio or the total flow rate or a decrease in the thickness ratio. 
In addition to these observations, two empirical models are established: one for determining the critical electric number, i.e. the dimensionless voltage, and one for determining the droplet size; a third model, a combination of the two, determines the droplet size at the critical voltage.
Keywords: droplet formation, electrohydrodynamics, microfluidics, two-phase flow
Procedia PDF Downloads 176
7234 An Integral Sustainable Design Evaluation of the 15-Minute City and the Processes of Transferability to Cities of the Global South
Authors: Chitsanzo Isaac
Abstract:
Across the world, the ongoing Covid-19 pandemic has challenged urban systems and policy frameworks, highlighting societal vulnerabilities and systemic inequities among many communities. Measures of confinement and social distancing to contain the Covid-19 virus have fragmented the physical and social fabric of cities. This has caused urban dwellers to reassess how they engage with their urban surroundings and maintain social ties. Urbanists have presented strategies that would allow communities to survive and even thrive in extraordinary times of crisis like the pandemic. Tactical Urbanism, particularly the 15-Minute City, has gained popularity. It is considered a resilient approach in the global north; however, its transferability to the global south has been called into question. To this end, this paper poses the question: to what extent is the 15-Minute City framework integral sustainable design, and are there processes that make it adoptable by cities in the global south? This paper explores four issues using secondary quantitative data analysis and convergence analysis in the Paris and Blantyre urban regions. First, it questions how the 15-Minute City has been defined and measured, and how it impacts urban dwellers. Second, it examines the extent to which the 15-Minute City performs under the lens of frameworks such as Wilber's integral theory and Fleming's integral sustainable design theory. Third, this work examines the processes that can be transferred to developing cities to foster community resilience through the perspectives of experience, behaviors, cultures, and systems. Finally, it reviews the principal ways in which a multi-perspective reality can be the basis for resilient community design and sustainable urban development.
This work will shed light on the importance of a multi-perspective reality as a means of achieving sustainable urban design goals in developing urban areas.
Keywords: 15-minute city, developing cities, global south, community resilience, integral sustainable design, systems thinking, complexity, tactical urbanism
Procedia PDF Downloads 150
7233 Study and Calibration of Autonomous UAV Systems With Thermal Sensing With Multi-purpose Roles
Authors: Raahil Sheikh, Prathamesh Minde, Priya Gujjar, Himanshu Dwivedi, Abhishek Maurya
Abstract:
UAVs have been part of our environment since they were first used by Austria in its assault on Venice. At that stage, they were just pilotless balloons equipped with bombs to be dropped on enemy territory. Over time, technological advancements allowed UAVs to be controlled remotely or autonomously. This study focuses on upgrading pre-existing manually operated drones: equipping them with a variety of sensors, making them autonomous and capable, and purposing them for a variety of roles, including thermal sensing, data collection, tracking of creatures, forest fire and volcano detection, hydrothermal studies, urban heat island measurement, and other environmental research. The system can also be used for reconnaissance, research, 3D mapping, and search and rescue missions. This study mainly focuses on automating tedious tasks and reducing human errors as much as possible, reducing deployment time, and increasing the overall efficiency, efficacy, and reliability of the UAVs. A comprehensive Ground Control System UI (GCS) is created, enabling less-trained professionals to use the UAV to its full potential. With the inclusion of such an autonomous system, intelligent paths can be planned and environmental gusts and other concerns can be avoided.
Keywords: UAV, autonomous systems, drones, geothermal imaging
Procedia PDF Downloads 85
7232 A Review of Material and Methods Used in Liner Layers in Various Landfills
Authors: S. Taghvamanesh
Abstract:
Modern landfills are highly engineered containment systems designed to reduce the environmental and human health impacts of solid waste (trash). In modern landfills, waste is contained by a liner system. The primary goal of the liner system is to isolate the landfill contents from the environment, thereby protecting the soil and groundwater from pollution caused by landfill leachate. Landfill leachate is the most serious threat to groundwater; therefore, it is necessary to design a system that prevents the penetration of this dangerous substance into the environment. Liner layers are made up of two basic elements: clay and geosynthetics. Hydraulic conductivity and flexibility are two desirable properties of these materials. Three different types of liner systems are discussed in this paper. Based on available data, the current article analyzes materials and methods for constructing liner layers for landfills producing distinct leachates, containing various harmful components and heavy metals, from all around the world. This study also attempted to gather leachate data for each of the sites discussed. In conclusion, every landfill requires a specific type of liner, which depends on the type of leachate it produces daily. It should also be emphasized that, based on available data, this article also considers the number of landfills that each country or continent possesses.
Keywords: landfill, liner layer, impervious layer, barrier layer
Procedia PDF Downloads 77
7231 Investigation of Processing Conditions on Rheological Features of Emulsion Gels and Oleogels Stabilized by Biopolymers
Authors: M. Sarraf, J. E. Moros, M. C. Sánchez
Abstract:
Oleogels are self-standing systems that are able to trap edible liquid oil in a tridimensional network and help reduce fat usage through the crystallization of oleogelators. There are different ways to achieve oleogelation and oil structuring, including direct dispersion, structured biphasic systems, oil sorption, and the indirect (emulsion-template) method. The selection of processing conditions, as well as the composition of the oleogel, is essential to obtain a stable oleogel with characteristics suitable for its purpose. In this sense, polysaccharides are among the ingredients widely used in food products to produce oleogels and emulsions. Basil seed gum (BSG), obtained from Ocimum basilicum, is a new native polysaccharide for the food industry with high viscosity and pseudoplastic behavior owing to its high molecular weight. Also, proteins can stabilize oil in water due to the presence of amino and carboxyl moieties that confer surface activity. Whey proteins are widely used in the food industry because they are available, cheap ingredients with nutritional and functional characteristics, acting as emulsifying and gelling agents with thickening and water-binding capacity. In general, the interaction of proteins and polysaccharides has a significant effect on food structures and their stability, such as the texture of dairy products, by controlling the interactions in macromolecular systems. Using edible oleogels for oil structuring enables targeted delivery of a component trapped in the structural network. Therefore, the development of an efficient oleogel is essential in the food industry. A complete understanding of the key factors, such as the oil-phase ratio, processing conditions, and biopolymer concentrations, that affect the formation and stability of the emulsion can provide crucial information for the production of a suitable oleogel.
In this research, the effects of the oil concentration and of the pressure used to prepare the emulsion prior to obtaining the oleogel have been evaluated through the analysis of the droplet size and rheological properties of the obtained emulsions and oleogels. The results show that emulsions prepared in the high-pressure homogenizer (HPH) at higher pressures have smaller droplet sizes and higher uniformity in the size distribution curve. On the other hand, in relation to the rheological characteristics of the emulsions and oleogels obtained, the predominantly elastic character of the systems must be noted, as they present storage modulus values higher than loss modulus values and show an important plateau zone, typical of structured systems. Likewise, steady-state viscous flow tests on both emulsions and oleogels confirm that the pressure used in the homogenizer is an important factor for obtaining emulsions with adequate droplet size and, subsequently, the oleogel. Thus, various routes for trapping oil inside a biopolymer matrix with adjustable mechanical properties could be applied to create the three-dimensional network that absorbs the oil and forms the oleogel.
Keywords: basil seed gum, particle size, viscoelastic properties, whey protein
Procedia PDF Downloads 66
7230 A Monopole Intravascular Antenna with Three Parasitic Elements Optimized for Higher Tesla MRI Systems
Authors: Mohammad Mohammadzadeh, Alireza Ghasempour
Abstract:
In this paper, a new design of monopole antenna is proposed that increases the contrast of intravascular magnetic resonance images by increasing the homogeneity of the intrinsic signal-to-noise ratio (ISNR) distribution around the antenna. The antenna is made of a coaxial cable with three parasitic elements. The lengths and positions of the elements are optimized by an improved genetic algorithm (IGA) for 1.5, 3, 4.7, and 7 Tesla MRI systems based on a defined cost function. Simulations were also conducted to verify the performance of the designed antenna. Our simulation results show that each time the IGA is executed, different values for the parasitic elements are obtained, all yielding high cost-function values; in subsequent executions, the IGA can also find the best values for the parasitic elements with regard to the cost function. Additionally, two-dimensional and one-dimensional maps of the ISNR were drawn for the proposed antenna and compared to a previously published monopole antenna with one parasitic element at a frequency of 64 MHz inside a saline phantom. The results verify that, despite a decrease in ISNR, there is a considerable improvement in the homogeneity of the ISNR distribution of the proposed antenna, so that the product of the two increases.
Keywords: intravascular MR antenna, monopole antenna, parasitic elements, signal-to-noise ratio (SNR), genetic algorithm
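A genetic algorithm of the kind used to tune the parasitic-element lengths and positions can be sketched as below. The quadratic test cost is a hypothetical stand-in: the paper's actual cost function is built from the simulated ISNR distribution, which is not reproduced here, and the operator choices (elitism, uniform crossover, clamped Gaussian mutation) are illustrative assumptions rather than the IGA's exact details.

```python
import random

def genetic_optimize(cost, n_genes, bounds, pop_size=30, generations=100,
                     mutation_rate=0.1, seed=0):
    rng = random.Random(seed)
    lo, hi = bounds
    # Random initial population of candidate element geometries.
    pop = [[rng.uniform(lo, hi) for _ in range(n_genes)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)               # lower cost is better
        elite = pop[:pop_size // 2]      # elitism: keep the best half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            # Uniform crossover: each gene comes from either parent.
            child = [rng.choice((ga, gb)) for ga, gb in zip(a, b)]
            if rng.random() < mutation_rate:
                g = rng.randrange(n_genes)   # mutate one gene, clamped to bounds
                child[g] = min(hi, max(lo, child[g] + rng.gauss(0, 0.1 * (hi - lo))))
            children.append(child)
        pop = elite + children
    return min(pop, key=cost)
```

In the paper's setting, `cost` would evaluate the ISNR homogeneity of an antenna with the candidate element lengths and positions, so each generation requires a batch of electromagnetic simulations.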
Procedia PDF Downloads 299
7229 A Predictive Model for Turbulence Evolution and Mixing Using Machine Learning
Authors: Yuhang Wang, Jorg Schluter, Sergiy Shelyag
Abstract:
The high cost associated with high-resolution computational fluid dynamics (CFD) is one of the main challenges that inhibit the design, development, and optimisation of new combustion systems adapted for renewable fuels. In this study, we propose a physics-guided CNN-based model to predict turbulence evolution and mixing without requiring a traditional CFD solver. The model architecture is built upon U-Net and the inception module, while a physics-guided loss function is designed by introducing two additional physical constraints that enforce the conservation of both mass and pressure over the entire predicted flow fields. The model is then trained on the Large Eddy Simulation (LES) results of a natural turbulent mixing layer for two Reynolds number cases (Re = 3000 and 30000). The model prediction shows excellent agreement with the corresponding CFD solutions in terms of both the spatial distribution and temporal evolution of turbulent mixing. Such promising predictive performance opens up the possibility of performing accurate high-resolution manifold-based combustion simulations at low computational cost, accelerating the iterative design process of new combustion systems.
Keywords: computational fluid dynamics, turbulence, machine learning, combustion modelling
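The abstract does not give the exact form of the two physical constraints, so the following is only a hedged sketch of how such a physics-guided loss might be assembled: a data term plus a mass-conservation penalty on the velocity divergence and a simple mean-pressure penalty as a proxy for pressure conservation. The field shapes, penalty weights, and discretisation are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def divergence(u, v, dx=1.0, dy=1.0):
    # Discrete divergence of a 2-D velocity field on a uniform grid.
    du_dx = np.gradient(u, dx, axis=1)
    dv_dy = np.gradient(v, dy, axis=0)
    return du_dx + dv_dy

def physics_guided_loss(pred, target, lam_mass=0.1, lam_p=0.1):
    u, v, p = pred                 # predicted velocity components and pressure
    u_t, v_t, p_t = target         # reference (e.g. LES) fields
    # Ordinary data-fit term over all predicted fields.
    data = np.mean((u - u_t) ** 2 + (v - v_t) ** 2 + (p - p_t) ** 2)
    # Mass conservation: incompressibility requires div(u) = 0 everywhere.
    mass = np.mean(divergence(u, v) ** 2)
    # Pressure conservation proxy: match the mean pressure of the reference.
    pressure = (np.mean(p) - np.mean(p_t)) ** 2
    return data + lam_mass * mass + lam_p * pressure
```

In training, the same penalties would be expressed in the deep-learning framework's differentiable operations so that gradients flow through the constraints as well as the data term.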
Procedia PDF Downloads 91
7228 Bioinformatic Strategies for the Production of Glycoproteins in Algae
Authors: Fadi Saleh, Çığdem Sezer Zhmurov
Abstract:
Biopharmaceuticals represent one of the fastest-developing fields within biotechnology, and the biological macromolecules produced inside cells have a variety of therapeutic applications. In the past, mammalian cells, especially CHO cells, have been employed in the production of biopharmaceuticals because these cells can achieve human-like post-translational modifications (PTMs). These systems, however, carry apparent disadvantages such as high production costs, vulnerability to contamination, and limitations in scalability. This research is focused on the utilization of microalgae as a bioreactor system for the synthesis of biopharmaceutical glycoproteins with respect to PTMs, particularly N-glycosylation. The research points to a growing interest in microalgae as a potential substitute for more conventional expression systems. A number of advantages exist in the use of microalgae, including rapid growth rates, the lack of common human pathogens, controlled scalability in bioreactors, and the capacity for some PTMs to take place. Thus, the potential of microalgae to produce recombinant proteins with favorable characteristics makes this a promising platform for producing biopharmaceuticals. The study focuses on the examination of the N-glycosylation pathways across different species of microalgae. This investigation is important as N-glycosylation, the process by which carbohydrate groups are linked to proteins, profoundly influences the stability, activity, and general performance of glycoproteins. Additionally, bioinformatics methodologies are employed to elucidate the genetic pathways implicated in N-glycosylation within microalgae, with the intention of modifying these organisms to produce glycoproteins suitable for human use.
In this way, the present comparative analysis of the N-glycosylation pathway in humans and microalgae can be used to bridge both systems in order to produce biopharmaceuticals with humanized glycosylation profiles in microalgal organisms. The results of the research underline microalgae's potential to overcome some of the limitations associated with traditional biopharmaceutical production systems. The study may help in the creation of a cost-effective and scalable means of producing quality biopharmaceuticals by genetically modifying microalgae to produce glycoproteins with human-compatible N-glycosylation. The biopharmaceutical sector stands to benefit from such a novel, green, and efficient expression platform. This thesis, therefore, is thorough research into the viability of microalgae as an efficient platform for producing biopharmaceutical glycoproteins. Based on an in-depth bioinformatic analysis of microalgal N-glycosylation pathways, a platform for engineering them to produce human-compatible glycoproteins is set out in this work. The findings obtained in this research will have significant implications for the biopharmaceutical industry by opening up a new way of developing safer, more efficient, and economically more feasible biopharmaceutical manufacturing platforms.
Keywords: microalgae, glycoproteins, post-translational modification, genome
Procedia PDF Downloads 24
7227 Selecting Answers for Questions with Multiple Answer Choices in Arabic Question Answering Based on Textual Entailment Recognition
Authors: Anes Enakoa, Yawei Liang
Abstract:
Question Answering (QA) is one of the most important and demanding tasks in the field of Natural Language Processing (NLP). In QA systems, the answer generation task produces a list of candidate answers to the user's question, of which only one is correct. Answer selection, one of the main components of QA, is concerned with selecting the best answer choice from the candidate answers suggested by the system. However, the selection process can be very challenging, especially in Arabic, due to its particularities. To address this challenge, an approach is proposed to answer questions with multiple answer choices for Arabic QA systems based on Textual Entailment (TE) recognition. The developed approach employs a Support Vector Machine that considers lexical, semantic, and syntactic features in order to recognize the entailment between the generated hypotheses (H) and the text (T). A set of experiments has been conducted for performance evaluation, and the overall performance of the proposed method reached an accuracy of 67.5% with a C@1 score of 80.46%. The obtained results are promising and demonstrate that the proposed method is effective for the TE recognition task.
Keywords: information retrieval, machine learning, natural language processing, question answering, textual entailment
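Feature-based entailment classification of this kind can be illustrated with a toy example. The paper's system uses an SVM over lexical, semantic, and syntactic features of Arabic text; here, made-up English pairs, simple lexical-overlap features, and a perceptron standing in for the SVM keep the sketch small and dependency-free.

```python
def overlap_features(text, hypothesis):
    # Toy lexical features over whitespace tokens (a real system would add
    # semantic and syntactic features and language-specific preprocessing).
    t, h = set(text.lower().split()), set(hypothesis.lower().split())
    inter = len(t & h)
    return [
        inter / len(h) if h else 0.0,  # fraction of hypothesis covered by text
        inter / len(t) if t else 0.0,  # fraction of text shared with hypothesis
        float(abs(len(t) - len(h))),   # length difference
    ]

def train_perceptron(X, y, epochs=20, lr=0.5):
    w = [0.0] * (len(X[0]) + 1)        # last weight is the bias term
    for _ in range(epochs):
        for x, label in zip(X, y):
            xb = x + [1.0]
            pred = 1 if sum(wi * xi for wi, xi in zip(w, xb)) > 0 else 0
            for i, xi in enumerate(xb):
                w[i] += lr * (label - pred) * xi
    return w

def entails(w, text, hypothesis):
    xb = overlap_features(text, hypothesis) + [1.0]
    return sum(wi * xi for wi, xi in zip(w, xb)) > 0

# Made-up training pairs: (text, hypothesis, 1 = entailed / 0 = not).
pairs = [("the cat sat on the mat", "the cat sat", 1),
         ("the cat sat on the mat", "a dog barked loudly", 0),
         ("water boils at high temperature", "water boils", 1),
         ("water boils at high temperature", "ice melts quickly", 0)]
w = train_perceptron([overlap_features(t, h) for t, h, _ in pairs],
                     [label for _, _, label in pairs])
```

In the multiple-choice setting described by the abstract, each candidate answer is turned into a hypothesis and the choice whose (T, H) pair scores highest for entailment is selected.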
Procedia PDF Downloads 145
7226 Dynamic Mechanical Analysis of Supercooled Water in Nanoporous Confinement and Biological Systems
Authors: Viktor Soprunyuk, Wilfried Schranz, Patrick Huber
Abstract:
In the present work, we show that Dynamic Mechanical Analysis (DMA) with a measurement frequency range f = 0.2 - 100 Hz is a rather powerful technique for the study of phase transitions (freezing and melting) and glass transitions of water in geometrical confinement. Inserting water into nanoporous host matrices, e.g. Gelsil (pore sizes 2.6 nm and 5 nm) or Vycor (pore size 10 nm), allows one to study size effects occurring at the nanoscale conveniently in macroscopic bulk samples. One obtains valuable insight concerning confinement-induced changes of the dynamics by measuring the temperature and frequency dependencies of the complex Young's modulus Y* for various pore sizes. Solid-liquid transitions and glass-liquid transitions both show up as a softening of the real part Y' of the complex Young's modulus, yet with completely different frequency dependencies. Analysing the frequency-dependent imaginary part of the Young's modulus in the glass transition regions for different pore sizes, we find a clear-cut 1/d-dependence of the calculated glass transition temperatures, which extrapolates to Tg(1/d = 0) = 136 K, in agreement with the traditional value for water. The results indicate that the main role of the pore diameter is to set the relative amount of water molecules that are near an interface, within a length scale of the order of the dynamic correlation length ξ. Thus we argue that the observed strong pore size dependence of Tg is an interfacial effect, rather than a finite size effect. We obtained similar signatures of Y* near glass transitions in different biological objects (fruits, vegetables, and bread). The values of the activation energies for these biological materials in the glass transition region are quite similar to those of supercooled water in nanoporous confinement in this region. The present work was supported by the Austrian Science Fund (FWF, project Nr. P 28672 – N36).
Keywords: biological systems, liquids, glasses, amorphous systems, nanoporous materials, phase transition
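The 1/d extrapolation described above amounts to an ordinary least-squares line through Tg versus inverse pore diameter. The data points below are hypothetical, constructed as perfectly linear only so that the intercept matches the reported bulk value Tg(1/d → 0) = 136 K; the slope of 20 K·nm is an arbitrary illustrative choice, not a measured quantity.

```python
# Hypothetical Tg data versus inverse pore diameter 1/d for the pore sizes
# mentioned in the abstract (2.6, 5 and 10 nm).
inv_d = [1 / 2.6, 1 / 5.0, 1 / 10.0]       # 1/nm
tg = [136.0 + 20.0 * x for x in inv_d]     # K (fabricated, exactly linear)

# Ordinary least-squares fit Tg = slope * (1/d) + tg_bulk.
n = len(inv_d)
mean_x = sum(inv_d) / n
mean_y = sum(tg) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(inv_d, tg))
         / sum((x - mean_x) ** 2 for x in inv_d))
tg_bulk = mean_y - slope * mean_x          # extrapolation to 1/d = 0
```

With real measurements the points scatter about the line, but the intercept of the fit still gives the extrapolated bulk glass transition temperature.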
Procedia PDF Downloads 237
7225 Physical Properties and Elastic Studies of Fluoroaluminate Glasses Based on Alkali
Authors: C. Benhamideche
Abstract:
Fluoroaluminate glasses have been reported as the earliest heavy metal fluoride glasses. By comparison with fluorozirconate glasses, they offer a set of similar optical features, but also some differences in their elastic and chemical properties. In practice, they have been less developed because their stability against devitrification is smaller than that of the most stable fluorozirconates. The purpose of this study was to investigate glass formation in the systems AlF3-YF3-PbF2-MgF2-MF2 (M = Li, Na, K). Synthesis was carried out in ambient atmosphere using ammonium fluoride processing. After fining, the liquid was cast into a preheated brass mold, then annealed below the glass transition temperature for several hours. The samples were polished for optical measurements. Glass formation has been investigated in a systematic way, using pseudo-ternary systems in order to allow several parameters to vary at the same time. We have chosen the most stable glass compositions for the determination of the physical properties, including characteristic temperatures, density, and elastic properties. Glass stability increases in multicomponent glasses. Bulk samples have been prepared for physical characterization. These glasses are of potential interest for passive optical fibers because they are less sensitive to water attack than ZBLAN glass and mechanically stronger. It is expected that they could have a larger damage threshold for laser power transmission.
Keywords: fluoride glass, aluminium fluoride, thermal properties, density, elastic properties
Procedia PDF Downloads 241
7224 Socio-Political Crisis in the North West and South West Regions of Cameroon and the Emergence of New Cultures
Authors: Doreen Mekunda
Abstract:
This paper is built on the premise that the current socio-political crisis in the two restive regions of Cameroon, though destructive and devastating in its effects on both property and human lives, is not without its strengths and merits. It is incontestable that many cultures are, to a greater extent, going to be destroyed as people forcibly move from war-stricken habitats to non-violent places. Many cultural potentials, traditional shrines, artifacts, arts, and crafts are knowingly or unknowingly disfigured, and many other ugly things will, by the end of the crisis, have affected the cultures of these two regions under siege and of the receiving population. A plethora of other problems will abound, such as the persecution of Internally Displaced Persons (IDPs), who are blamed for being displaced and for increased crime rates, and the cultural and ethnic differences that produce inter-tribal and interpersonal conflicts as well as conflicts between communities. However, a rapid emergence of literature and other forms of cultural production, whether written or oral, is visible, precipitating a rich cultural diversity as the cultures of the IDPs and the receiving populations come together, along with rapid urbanization, improvement in health-related issues, the rebirth of indigenous cultural practices, the development of social and lingua-cultural competences, and reliance on alternative religions, faith, and spirituality. Even financial and economic dependence, though a burden that IDPs place on others, has its merits, as it improves the living standards of the IDPs. To obtain plausible results, cultural materialism, a literary theory that hinges on the empirical study of socio-cultural systems within a materialist infrastructure-superstructure framework, is employed together with postcolonial theory.
Postcolonial theory is used because the study deals with postcolonial experiences and tenets of migration, hybridity, ethnicity, indigeneity, language, double consciousness, center/margin binaries, and identity, amongst others. The study reveals that the involuntary movement of persons from their habitual homes brings about movement in cultures and, thus, the emergence of new cultures. The movement of people who hold fast to their cultural heritage can only influence new forms of literature, the development of new communication competences, the rise of alternative religion, faith, and spirituality, the re-emergence of customary and traditional legal systems that might have been abandoned for the new judicial systems, and, above all, the revitalization of traditional health care systems.
Keywords: alternative religion, emergence, socio-political crisis, spirituality, lingua-cultural competences
Procedia PDF Downloads 178
7223 Investigation of Optimal Parameter Settings in Super Duplex Stainless Steel Welding
Authors: R. M. Chandima Ratnayake, Daniel Dyakov
Abstract:
Super duplex steel materials play a vital role in the construction and fabrication of structural, piping, and pipeline components. They make it possible to minimize life cycle costs while assuring the integrity of onshore and offshore operating systems. In this context, welding of duplex stainless steel (DSS) in construction and fabrication plays a significant role in maintaining and assuring integrity at an optimal expenditure over the life cycle of production and process systems as well as associated structures. In DSS welding, factors such as gap geometry, shielding gas supply rate, welding current, and type of welding process play a vital role in the final joint performance. Hence, an experimental investigation has been performed using an engineering robust design approach (ERDA) to investigate the optimal settings that generate optimal super DSS (i.e. UNS S32750) joint performance. This manuscript illustrates the mathematical approach and experimental design, the optimal parameter settings, and the results of the verification experiment.
Keywords: duplex stainless steel welding, engineering robust design, mathematical framework, optimal parameter settings
Procedia PDF Downloads 415
7222 Leakage Current Analysis of FinFET Based 7T SRAM at 32nm Technology
Authors: Chhavi Saxena
Abstract:
FinFETs can replace bulk-CMOS transistors in many different designs. Their low leakage/standby power makes FinFETs a desirable option for memory sub-systems. Memory modules are widely used in most digital and computer systems. Leakage power is very important in memory cells since most memory applications access only one or very few memory rows at a given time. As technology scales down, the importance of leakage current and power analysis for memory design is increasing. In this paper, we explore an option for low-power interconnect synthesis at the 32 nm node and beyond, using Fin-type Field-Effect Transistors (FinFETs), which are a promising substitute for bulk CMOS at the considered gate lengths. We consider a mechanism for improving FinFET efficiency, called variable supply voltage schemes. In this paper, we illustrate the design and implementation of a FinFET-based 4x4 SRAM cell array by means of a one-bit 7T SRAM cell. The FinFET-based 7T SRAM has been designed, and analyses have been carried out for leakage current, dynamic power, and delay. For the validation of our design approach, the output of the FinFET SRAM array has been compared with a standard CMOS SRAM, and significant improvements are obtained with the proposed model.
Keywords: FinFET, 7T SRAM cell, leakage current, delay
Procedia PDF Downloads 455