Search results for: control and optimization techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19098

15678 Development of Wave-Dissipating Block Installation Simulation for Inexperienced Worker Training

Authors: Hao Min Chuah, Tatsuya Yamazaki, Ryosui Iwasawa, Tatsumi Suto

Abstract:

In recent years, with the advancement of digital technology, the movement to introduce so-called ICT (Information and Communication Technology), such as computer and network technology, to civil engineering and construction sites is accelerating. As part of this movement, attempts are being made in various situations to reproduce actual sites inside computers and use them for design and construction planning, as well as for training inexperienced engineers. The installation of wave-dissipating blocks on coasts is a type of work that has been carried out by skilled workers based on their years of experience and is one of the tasks that is difficult for inexperienced workers to carry out on site. Wave-dissipating blocks are structures designed to protect coasts and beaches from erosion by reducing the energy of ocean waves. They usually weigh more than 1 t and are installed by being suspended from a crane, so it would be time-consuming and costly for inexperienced workers to train on-site. In this paper, therefore, a block installation simulator is developed based on Unity 3D, a game development engine. The simulator computes porosity, defined here as the ratio of the total volume of the wave-dissipating blocks inside the structure to the volume of the ideal final structure. Using this porosity evaluation, the simulator can determine how well the user is able to install the blocks. A voxelization technique is used to calculate the porosity of the structure, simplifying the calculations, and other techniques, such as raycasting and box overlapping, are employed for accurate simulation. In the near future, the simulator will incorporate an automatic block installation algorithm based on combinatorial optimization and compare the user's block installation with the installation computed by the algorithm.
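
As an illustration of the porosity evaluation described above, the following Python sketch voxelizes a bounding region, marks the voxels occupied by an ideal target shape and by placed blocks, and computes the block-to-ideal volume ratio used as the porosity measure in the abstract. The geometry (a sloped mound and spherical blocks) is a hypothetical stand-in for the Unity-side shapes obtained via raycasting and box overlapping; it is a minimal sketch, not the simulator's actual implementation.

import numpy as np

# Minimal sketch of voxel-based porosity evaluation (hypothetical geometry).
# "Porosity" here follows the abstract's definition: volume of placed blocks
# inside the structure divided by the volume of the ideal final structure.

def sphere_mask(grid, center, radius):
    """Boolean occupancy of a sphere on a regular voxel grid (stand-in for a block)."""
    x, y, z = grid
    return (x - center[0])**2 + (y - center[1])**2 + (z - center[2])**2 <= radius**2

# Regular voxel grid over the region of interest (1 m voxels, 20 m cube).
axes = [np.arange(0.0, 20.0, 1.0)] * 3
grid = np.meshgrid(*axes, indexing="ij")

# Ideal final structure: hypothetical mound below a sloped plane.
x, y, z = grid
ideal = z <= 10.0 - 0.3 * x

# Placed blocks: hypothetical positions; real data would come from the simulator.
blocks = np.zeros_like(ideal, dtype=bool)
for center in [(5, 5, 3), (8, 6, 4), (6, 9, 2)]:
    blocks |= sphere_mask(grid, center, radius=2.5)

block_volume = np.count_nonzero(blocks & ideal)   # voxels filled by blocks inside the ideal shape
ideal_volume = np.count_nonzero(ideal)            # voxels of the ideal structure
print("porosity (block/ideal volume ratio):", block_volume / ideal_volume)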

Keywords: 3D simulator, porosity, user interface, voxelization, wave-dissipating blocks

Procedia PDF Downloads 103
15677 Optimization of Lead Bioremediation by Marine Halomonas sp. ES015 Using Statistical Experimental Methods

Authors: Aliaa M. El-Borai, Ehab A. Beltagy, Eman E. Gadallah, Samy A. ElAssar

Abstract:

Bioremediation technology is now used for treatment instead of traditional metal removal methods. A strain isolated from Marsa Alam, Red Sea, Egypt, showed high resistance to high lead concentrations and was identified by 16S rRNA gene sequencing as Halomonas sp. ES015. Medium optimization was carried out using a Plackett-Burman design, and the most significant factors were yeast extract, casamino acid, and inoculum size. The optimized medium obtained by the statistical design raised the removal efficiency from 84% to 99% at an initial lead concentration of 250 ppm. Moreover, a Box-Behnken experimental design was applied to study the relationship between yeast extract concentration, casamino acid concentration, and inoculum size. The optimized medium increased removal efficiency to 97% at an initial lead concentration of 500 ppm. Halomonas sp. ES015 cells immobilized on sponge cubes, using the optimized medium in a loop bioremediation column, showed relatively constant lead removal efficiency when reused for six successive cycles over the studied time interval. Metal removal efficiency was also unaffected by changes in flow rate. Finally, the results of this research point to the possibility of lead bioremediation by free or immobilized cells of Halomonas sp. ES015, and show that bioremediation can be carried out in batch cultures and in semicontinuous cultures using column technology.
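
For readers unfamiliar with the Box-Behnken design mentioned above, the sketch below generates the standard three-factor design in coded -1/0/+1 units (yeast extract, casamino acid, inoculum size) and fits a quadratic response surface by least squares. The removal-efficiency values are invented for illustration; the study's actual factor levels and responses are not reproduced here.

import itertools
import numpy as np

# Three coded factors: x1 = yeast extract, x2 = casamino acid, x3 = inoculum size.
# Box-Behnken design for k = 3: +/-1 combinations for each factor pair
# (third factor held at 0), plus centre-point replicates.
runs = []
for i, j in itertools.combinations(range(3), 2):
    for a, b in itertools.product([-1, 1], repeat=2):
        run = [0, 0, 0]
        run[i], run[j] = a, b
        runs.append(run)
runs += [[0, 0, 0]] * 3                      # centre points
X = np.array(runs, dtype=float)

# Hypothetical lead-removal responses (%) for each run, for illustration only.
y = np.array([88, 90, 91, 95, 87, 92, 90, 96, 86, 89, 93, 97, 94, 95, 94], dtype=float)

# Full quadratic model: intercept, linear, interaction and squared terms.
def quad_terms(x):
    x1, x2, x3 = x
    return [1, x1, x2, x3, x1*x2, x1*x3, x2*x3, x1*x1, x2*x2, x3*x3]

A = np.array([quad_terms(x) for x in X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("fitted quadratic coefficients:", np.round(coef, 3))
print("predicted efficiency at centre point:", round(float(np.array(quad_terms([0, 0, 0])) @ coef), 2))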

Keywords: bioremediation, lead, Box–Behnken, Halomonas sp. ES015, loop bioremediation, Plackett-Burman

Procedia PDF Downloads 197
15676 Serum Cortisol and Osteocalcin in Response to Eight Weeks of Aerobic Training in Men with Mild to Moderate Asthma

Authors: Eizadi Mojtaba

Abstract:

This study aimed to evaluate the effect of 8 weeks of aerobic training on serum osteocalcin, an osteoblast-derived hormone, and cortisol in adult men with asthma. For this purpose, twenty-four untrained adult men with mild to moderate asthma participated voluntarily and were randomly divided into an exercise group (aerobic training, 8 weeks, 3 sessions per week) and a control group. Serum osteocalcin and cortisol were measured in both groups before and after training. Student's paired t-test was applied to compare pre- and post-training values, and a p-value of less than 0.05 was considered statistically significant. There were no statistically significant differences in anthropometric or biochemical markers between the exercise and control groups at baseline (P > 0.05). Exercise training resulted in a significant increase in serum osteocalcin and a decrease in cortisol (P < 0.05), whereas no significant changes were observed in the control group. Based on these data, we conclude that aerobic training can improve bone formation processes in asthma patients.

Keywords: osteoblasts, osteocalcin, asthma, aerobic exercise, sedentary

Procedia PDF Downloads 287
15675 Dynamic Fault Diagnosis for Semi-Batch Reactor Under Closed-Loop Control via Independent RBFNN

Authors: Abdelkarim M. Ertiame, D. W. Yu, D. L. Yu, J. B. Gomm

Abstract:

In this paper, a new robust fault detection and isolation (FDI) scheme is developed to monitor a multivariable nonlinear chemical process, the Chylla-Haase polymerization reactor, under cascade PI control. The scheme employs a radial basis function neural network (RBFNN) in independent mode to model the process dynamics and uses the weighted sum-squared prediction error as the residual. The recursive orthogonal least squares (ROLS) algorithm is employed to train the model and overcome the training difficulty of the independent mode of the network. Another RBFNN is then used as a fault classifier to isolate faults from the different features contained in the residual vector. Several actuator and sensor faults are simulated in a nonlinear Simulink model of the reactor, and the scheme is used to detect and isolate the faults on-line. The simulation results show the effectiveness and robustness of the scheme even when the process is subjected to disturbances and uncertainties, including significant changes in the monomer feed rate, fouling factor, impurity factor, ambient temperature, and measurement noise.
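
A compact illustration of the residual-based detection idea is sketched below: an independent RBF model of a hypothetical scalar process output is trained, and a fault is flagged when the weighted sum-squared prediction error exceeds a threshold. Ordinary least squares is used here in place of the recursive orthogonal least squares (ROLS) training of the paper, and the Chylla-Haase dynamics are replaced by a toy signal.

import numpy as np

def rbf_design(u, centers, width):
    """Gaussian RBF regressors evaluated at the inputs u."""
    return np.exp(-(u[:, None] - centers[None, :])**2 / (2 * width**2))

rng = np.random.default_rng(0)

# Toy process: output depends nonlinearly on the input (stand-in for reactor data).
u_train = np.linspace(0, 1, 200)
y_train = np.sin(2 * np.pi * u_train) + 0.02 * rng.standard_normal(u_train.size)

centers = np.linspace(0, 1, 15)
width = 0.08
Phi = rbf_design(u_train, centers, width)
w, *_ = np.linalg.lstsq(Phi, y_train, rcond=None)     # OLS used instead of ROLS (simplification)

def residual(u, y):
    """Weighted sum-squared prediction error over a data window."""
    e = y - rbf_design(u, centers, width) @ w
    return float(np.sum(e**2) / e.size)

threshold = 3 * residual(u_train, y_train)             # crude threshold from fault-free data

# New window with an additive sensor fault injected halfway through.
u_new = np.linspace(0, 1, 100)
y_new = np.sin(2 * np.pi * u_new)
y_new[50:] += 0.5                                       # hypothetical sensor bias fault
r_new = residual(u_new, y_new)
print("fault-free residual:", residual(u_train, y_train))
print("faulty-window residual:", r_new, "->", "fault" if r_new > threshold else "normal")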

Keywords: robust fault detection, cascade control, independent RBF model, RBF neural networks, Chylla-Haase reactor, FDI under closed-loop control

Procedia PDF Downloads 498
15674 Heuristic Algorithms for Time Based Weapon-Target Assignment Problem

Authors: Hyun Seop Uhm, Yong Ho Choi, Ji Eun Kim, Young Hoon Lee

Abstract:

Weapon-target assignment (WTA) is the problem of assigning available launchers to appropriate targets in order to defend assets. Various algorithms for WTA have been developed over the past years for both static and dynamic environments (denoted SWTA and DWTA, respectively). Because the problem must be solved within an operationally relevant computation time, WTA has suffered from limited solution efficiency, and SWTA and DWTA have consequently been solved only for limited battlefield situations. In this paper, the general situation under continuous time is considered as the time-based weapon-target assignment (TWTA) problem. TWTA is studied using a mixed integer programming model, and three heuristic algorithms are suggested: decomposed opt-opt, decomposed opt-greedy, and greedy. Although the TWTA optimization model becomes inefficient for large problem sizes, the decomposed opt-opt algorithm, based on linearization and decomposition, produces efficient solutions in a reasonable computation time. Because the computation time of the scheduling part is too long for the optimization model, several greedy-based algorithms are proposed; they yield lower performance values than the decomposed opt-opt algorithm but require very little computation time. Hence, this paper proposes an improved method by applying decomposition to TWTA, so that more practical and effective methods can be developed for using TWTA on the battlefield.
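
To make the greedy idea concrete, the following sketch assigns each weapon to the target with the largest marginal reduction in expected surviving target value, using invented kill probabilities and target values. The time dimension and the mixed integer programming formulation of the paper are omitted; this is an illustration of the greedy principle only.

import numpy as np

# Hypothetical data: p[i, j] = kill probability of weapon i against target j,
# value[j] = importance of target j. The expected surviving value of target j
# is value[j] times the product over assigned weapons of (1 - p[i, j]).
p = np.array([[0.6, 0.3, 0.2],
              [0.4, 0.7, 0.1],
              [0.2, 0.5, 0.8],
              [0.5, 0.4, 0.6]])
value = np.array([10.0, 6.0, 8.0])

surviving = value.copy()
assignment = []
for i in range(p.shape[0]):                       # one greedy pass over the weapons
    gain = surviving * p[i]                       # marginal reduction for each target
    j = int(np.argmax(gain))
    assignment.append((i, j))
    surviving[j] *= (1 - p[i, j])                 # update expected surviving value

print("assignment (weapon -> target):", assignment)
print("expected surviving target value:", round(float(surviving.sum()), 3))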

Keywords: air and missile defense, weapon target assignment, mixed integer programming, piecewise linearization, decomposition algorithm, military operations research

Procedia PDF Downloads 336
15673 Survey on Malware Detection

Authors: Doaa Wael, Naswa Abdelbaky

Abstract:

Malware is malicious software built to carry out destructive actions and damage information systems and networks. Malware infections are increasing rapidly, and malware types have become more sophisticated, which makes the detection process more difficult. At the same time, Internet of Things (IoT) technology is vulnerable to malware attacks: IoT devices are always connected to the internet and often lack security, which makes them easy for attackers to access and an increasingly popular target. New malware detection techniques are therefore needed, and building a blockchain solution that allows IoT devices to download any file from the internet and verify whether it is malicious or not is the need of the hour. In recent years, blockchain technology has been proposed as a solution to many problems due to its features of decentralization, persistence, and anonymity. Moreover, using blockchain technology overcomes some difficulties in malware detection and improves the detection ratio over techniques that do not utilize it. In this paper, we study malware detection models that are based on blockchain technology. Furthermore, we elaborate on the effect of blockchain technology on malware detection, especially in the Android environment.

Keywords: malware analysis, blockchain, malware attacks, malware detection approaches

Procedia PDF Downloads 87
15672 Effect of Treadmill Exercise on Fluid Intelligence in Early Adults: Electroencephalogram Study

Authors: Ladda Leungratanamart, Seree Chadcham

Abstract:

Fluid intelligence declines with age, but it can be developed, so increasing fluid intelligence in young adults is possible. This study examined the effects of a two-month treadmill exercise program on fluid intelligence. The researcher designed a treadmill exercise program to promote cardiorespiratory fitness. Thirty-eight healthy volunteer students from the Boromarajonani College of Nursing, Chon Buri, were randomly assigned to an exercise group (n=18) and a control group (n=20). The experiment consisted of three sessions. The baseline session consisted of measuring VO2max, the electroencephalogram, and behavioral responses during performance of the Raven Progressive Matrices (RPM) test, a measure of fluid intelligence. In the exercise session, the experimental group trained on a treadmill at 60% to 80% of maximum heart rate for 30 minutes, three times per week, whereas the control group did not exercise. In the following sessions, each participant was measured in the same way as at baseline. The data were analyzed using the t-test to examine whether there was a significant difference between the means of the two groups. The results showed that mean VO2max in the experimental group was significantly higher than in the control group (p<.05), suggesting that a two-month treadmill exercise program can improve fluid intelligence. The behavioral data showed that the experimental group performed the RPM test more accurately and faster than the control group. Neuroelectric data indicated a significant increase in the percentage of alpha band ERD (%ERD) at P3 and Pz compared to the pre-exercise condition and the control group. These data suggest that a two-month treadmill exercise program can contribute to the development of cardiorespiratory fitness, which in turn increases fluid intelligence, and that exercise is involved in cortical activation in different brain areas.

Keywords: treadmill exercise, fluid intelligence, raven progressive matrices test, alpha band

Procedia PDF Downloads 350
15671 Stability Optimization of NaBH₄ via pH and H₂O:NaBH₄ Ratios for Large Scale Hydrogen Production

Authors: Parth Mehta, Vedasri Bai Khavala, Prabhu Rajagopal, Tiju Thomas

Abstract:

There is an increasing need for alternative clean fuels, and hydrogen (H₂) has long been considered a promising solution with a high calorific value (142 MJ/kg). However, the storage of H₂ and the expensive processes for its generation have hindered its usage. Sodium borohydride (NaBH₄) can potentially be used as an economically viable means of H₂ storage. Thus far, there have been attempts to optimize the life of NaBH₄ (half-life) in aqueous media by stabilizing it with sodium hydroxide (NaOH) at various pH values. Other reports have shown that H₂ yield and reaction kinetics remain constant for all H₂O:NaBH₄ ratios above 30:1, without any acidic catalysts. Here we highlight the importance of pH and the H₂O:NaBH₄ ratio (80:1, 40:1, 20:1 and 10:1 by weight) for NaBH₄ stabilization (half-life reaction time at room temperature) and for minimizing corrosion of H₂ reactor components. It is interesting to observe that at any particular pH ≥ 10 (e.g., pH = 10, 11 and 12), the H₂O:NaBH₄ ratio does not have the expected linear relationship with stability. On the contrary, high stability was observed at a 10:1 H₂O:NaBH₄ ratio across all these pH values. When the H₂O:NaBH₄ ratio is increased from 10:1 to 20:1 and beyond (up to 80:1), constant stability (% degradation) is observed with respect to time. For practical usage (consumption within 6 hours of making the NaBH₄ solution), 15% degradation at pH 11 and an H₂O:NaBH₄ ratio of 10:1 is recommended. Increasing this ratio demands a higher NaOH concentration at the same pH, thus requiring a higher concentration or volume of acid (e.g., HCl) for H₂ generation. The reactions are done with tap water to render the results useful from an industrial standpoint. The observed stability regimes are rationalized based on the complexes associated with NaBH₄ when solvated in water, which depend sensitively on both pH and the H₂O:NaBH₄ ratio.

Keywords: hydrogen, sodium borohydride, stability optimization, H₂O:NaBH₄ ratio

Procedia PDF Downloads 120
15670 Distributed Coordination of Connected and Automated Vehicles at Multiple Interconnected Intersections

Authors: Zhiyuan Du, Baisravan Hom Chaudhuri, Pierluigi Pisu

Abstract:

In connected vehicle systems, where wireless communication is available among the involved vehicles and the intersection controllers, it is possible to design an intersection coordination strategy that lets connected and automated vehicles (CAVs) travel through road intersections without conventional traffic light control. In this paper, we present a distributed coordination strategy for CAVs at multiple interconnected intersections that aims at improving system fuel efficiency and system mobility. We present a distributed control solution in which, at the higher level, the intersection controllers calculate the desired average road velocity and optimally assign a reference velocity to each vehicle. At the lower level, every vehicle uses model predictive control (MPC) to track the reference velocity obtained from the higher-level controller. The proposed method has been implemented in a simulation case with a network of two interconnected intersections. Additionally, the effects of mixed vehicle types on the coordination strategy have been explored. Simulation results indicate that the proposed method improves vehicle fuel efficiency and traffic mobility.
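
The lower-level velocity-tracking step can be illustrated with a minimal receding-horizon problem, sketched below in Python with cvxpy. The dynamics, limits, horizon, and weights are assumptions chosen for illustration and are not the paper's calibration; the higher-level controller is represented only by the reference velocity v_ref.

import cvxpy as cp
import numpy as np

# Minimal sketch of the lower-level MPC: a single vehicle tracks the reference
# velocity assigned by the (hypothetical) intersection controller.
dt, N = 0.5, 10                       # time step [s], prediction horizon
v_ref = 12.0                          # reference velocity from the higher level [m/s]
v0 = 8.0                              # current velocity [m/s]
a_max = 2.0                           # acceleration limit [m/s^2]

v = cp.Variable(N + 1)                # velocity trajectory
a = cp.Variable(N)                    # acceleration (control input)

cost = cp.sum_squares(v[1:] - v_ref) + 0.1 * cp.sum_squares(a)
constraints = [v[0] == v0,
               v[1:] == v[:-1] + dt * a,
               cp.abs(a) <= a_max,
               v >= 0]
cp.Problem(cp.Minimize(cost), constraints).solve()

print("first acceleration command:", round(float(a.value[0]), 3), "m/s^2")
print("predicted velocities:", np.round(v.value, 2))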

Keywords: connected vehicles, automated vehicles, intersection coordination systems, multiple interconnected intersections, model predictive control

Procedia PDF Downloads 356
15669 Block N LVI from the Northern Side of the Parthenon Frieze: A Case Study of Augmented Reality for Museum Application

Authors: Donato Maniello, Alessandra Cirafici, Valeria Amoretti

Abstract:

This paper presents a new method, based on video mapping techniques, a particular form of augmented reality, that could produce new tools, different from those currently in use, for an interactive museum experience. By 'augmented reality' we mean the addition of more information than the visitor would normally perceive, mediated by a computer and a projector. The proposed application involves the creation of a documentary that depicts and explains the history of the artifact and illustrates its features; this is projected onto the surface of a faithful copy of the frieze (obtained at full scale with a 3D printer). This mode of operation uses different techniques that allow passing from the creation of the model to the creation of contents, through an accurate historical and artistic analysis, and finally to the warping phase, which permits the real and virtual models to be overlapped. The final step, which is still being studied, involves the creation of interactive contents that would be activated by visitors through appropriate motion sensors.

Keywords: augmented reality, multimedia, parthenon frieze, video mapping

Procedia PDF Downloads 387
15668 Chaotic Sequence Noise Reduction and Chaotic Recognition Rate Improvement Based on Improved Local Geometric Projection

Authors: Rubin Dan, Xingcai Wang, Ziyang Chen

Abstract:

A chaotic time series noise reduction method based on the fusion of the local projection method, the wavelet transform, and the particle swarm algorithm (referred to as the LW-PSO method) is proposed to address the problem of false recognition due to noise when recognizing chaotic time series containing noise. The method first uses phase space reconstruction to recover the original dynamical system characteristics and removes the noise subspace by selecting the neighborhood radius; it then uses the wavelet transform to remove the D1-D3 high-frequency components so as to maximize the retention of signal information, while least-squares optimization is performed by the particle swarm algorithm. The Lorenz system containing 30% Gaussian white noise is simulated as verification, and the phase space, SNR, RMSE, and the K value of the 0-1 test before and after noise reduction are compared for the Schreiber method, the local projection method, the wavelet transform method, and the LW-PSO method, showing that the LW-PSO method has a better noise reduction effect than the other three common methods. The method is also applied to a classical system to evaluate the noise reduction effect of the four methods and the identification of the original system, which further verifies the superiority of the LW-PSO method. Finally, it is applied to the Chengdu rainfall chaotic sequence, and the results prove that the LW-PSO method can effectively reduce the noise and improve the chaos recognition rate.
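
The wavelet step of the pipeline can be illustrated as follows: decompose a noisy series to level 3, zero the D1-D3 detail coefficients, and reconstruct. The local projection and particle swarm stages of the LW-PSO method are not reproduced, and the test signal is a simple sine rather than the Lorenz system; this is a sketch of the wavelet component only.

import numpy as np
import pywt

# Remove the D1-D3 high-frequency detail components of a noisy series.
rng = np.random.default_rng(1)
t = np.linspace(0, 10, 2048)
clean = np.sin(2 * np.pi * 0.8 * t)
noisy = clean + 0.3 * rng.standard_normal(t.size)       # roughly 30% noise

coeffs = pywt.wavedec(noisy, "db4", level=3)            # [cA3, cD3, cD2, cD1]
coeffs[1:] = [np.zeros_like(c) for c in coeffs[1:]]     # zero the D1-D3 details
denoised = pywt.waverec(coeffs, "db4")[: t.size]

rmse = np.sqrt(np.mean((denoised - clean) ** 2))
print("RMSE after wavelet denoising:", round(float(rmse), 4))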

Keywords: Schreiber noise reduction, wavelet transform, particle swarm optimization, 0-1 test method, chaotic sequence denoising

Procedia PDF Downloads 199
15667 Cultural References in Jean-François Ménard's French Translation of Harry Potter à l'école des sorciers: An Analysis of the Translated Catchphrases and Spells and Cultural Elements

Authors: Brynn Patrice Fader

Abstract:

The objective of this research project is to assess the ways in which Jean-François Ménard's French translation, Harry Potter à l'école des sorciers, renders the cultural references of the original text, J. K. Rowling's Harry Potter and the Philosopher's Stone. The method of this analysis is to focus on the reasons for, and the ways in which, Ménard translates the spells and catchphrases throughout the novel, and on the effects that these choices have on the reader. While at times Ménard resorts to omission, manipulation, or borrowing, he also contrasts these techniques by transferring the cultural references using a direct translational approach. It appears that the translator resorts to techniques other than direct translation when necessary to ensure that the target audience will understand the events and conversations taking place.

Keywords: cultural elements, direct translation, manipulation, omission

Procedia PDF Downloads 317
15666 Detention Experiences of Asylum Seeking Children in Canada: An Interpretative Phenomenological Analysis

Authors: Zohra Faize

Abstract:

Globalization has expanded the mobility privileges of the Global North population while, simultaneously, people in the Global South, namely poor and racialized minorities, are increasingly criminalized for crossing international borders. As part of this global trend, Canada also engages in tight border control practices, which often result in the marginalization and criminalization of asylum seekers, including children. Using Interpretative Phenomenological Analysis as the theoretical framework and methodology, this research explores the effects of tight border control practices on child asylum seekers, with a specific focus on detention experiences in Canadian prisons and immigration holding centers. The preliminary results of interviews with 8 participants confirm the violations of children's rights that stem from the detention practice. Children also report that they find immigration detention to be a stressful and confusing experience, often resulting in feelings of shame and guilt after their release into the community.

Keywords: border control, crimmigration, Canada, children asylum seekers, immcarceration, interpretative phenomenological analysis (IPA)

Procedia PDF Downloads 298
15665 BodeACD: Buffer Overflow Vulnerabilities Detecting Based on Abstract Syntax Tree, Control Flow Graph, and Data Dependency Graph

Authors: Xinghang Lv, Tao Peng, Jia Chen, Junping Liu, Xinrong Hu, Ruhan He, Minghua Jiang, Wenli Cao

Abstract:

Buffer overflow is one of the most dangerous classes of vulnerability, so its effective detection is extremely necessary. Traditional detection methods are not accurate enough and consume too many resources to cope with today's complex and enormous code environments. In order to resolve these problems, we propose a method for buffer overflow detection based on the abstract syntax tree, control flow graph, and data dependency graph (BodeACD) of C/C++ programs with source code. BodeACD first collects function samples of buffer overflows available on GitHub and represents them as code representation sequences that fuse the control flow, data dependency, and syntax structure of the source code to reduce information loss during code representation. BodeACD then learns vulnerability patterns through deep learning to perform vulnerability detection. Experimental results show that BodeACD increases precision and recall by 6.3% and 8.5%, respectively, compared with the latest methods, and can effectively improve vulnerability detection while reducing the false-positive and false-negative rates.
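
BodeACD parses C/C++ source, but the idea of turning a function into a syntax-structure sequence can be shown with a small analogue using Python's own ast module, sketched below. The example function, the token-sequence representation, and the omission of control-flow and data-dependency edges are all simplifications for illustration, not the paper's actual front end.

import ast

# Illustrative analogue only: represent a function as a sequence of syntax-node
# types. BodeACD additionally fuses control-flow and data-dependency information,
# which is not reproduced here.
source = """
def copy_buf(src, n):
    buf = [0] * 16
    for i in range(n):        # potential out-of-bounds write when n > 16
        buf[i] = src[i]
    return buf
"""

tree = ast.parse(source)
token_sequence = [type(node).__name__ for node in ast.walk(tree)]
print(token_sequence[:12])
# Sequences like this (one per function) could then be embedded and fed to a
# sequence model for vulnerability-pattern learning.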

Keywords: vulnerability detection, abstract syntax tree, control flow graph, data dependency graph, code representation, deep learning

Procedia PDF Downloads 170
15664 Management Competency in the Logistical Function: The Skills That a Logistics Manager Must Master

Authors: Fatima Ibnchahid

Abstract:

The competence approach has been considered, since the early 1980s, one of the major developments in HR policies, and many approaches to managing professional skills have been derived from it; some processes are mature, whereas others have been abandoned. Competence can be defined as the set of knowledge (theoretical and practical), know-how (experience), and life skills (personality traits) mobilized by a person in the company. The skills that a logistics manager must master are divided into two main categories: technical skills on the one hand, and managerial and human skills on the other. The first category breaks down into skills in logistical techniques and general business skills; the second into social skills (oneself with others) and personal skills (with oneself). Logisticians face new challenges and new constraints that are revolutionizing the way the physical movement of goods is handled, as well as the operations related to the information flows that trigger, control, and guide those physical movements. Among these major changes, we can mention the development of information and communication technology and the emergence of strong environmental and security constraints. These changes have important effects on the skill needs of the members of the logistical function and on the training requested by logistics managers to perform better in their changing jobs. In this article, we address two main points: first, a brief overview of management skills, and second, an answer to the question asked in the title, namely which skills a logistics manager must master.

Keywords: skills, competence, management, logistical function

Procedia PDF Downloads 282
15663 Multilayer Neural Network and Fuzzy Logic Based Software Quality Prediction

Authors: Sadaf Sahar, Usman Qamar, Sadaf Ayaz

Abstract:

In the software development lifecycle, quality prediction techniques are of prime importance for minimizing future design errors and expensive maintenance. Many techniques have been proposed by various researchers, but with the increasing complexity of the software lifecycle model, it is crucial to develop a flexible system that can cater for the factors that ultimately impact the quality of the end product. These factors include properties of the software development process and of the product, along with its operating conditions. In this paper, a neural network (perceptron) based software quality prediction technique is proposed. Using this technique, stakeholders can predict the quality of the resulting software during the early phases of the lifecycle, saving the time and resources otherwise spent on eliminating design errors and on costly maintenance. The technique can be brought into practical use after successful training.
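
A minimal perceptron sketch is given below: it predicts a binary quality label from two hypothetical, normalized module metrics using the classical perceptron learning rule. The feature set, labels, and absence of the fuzzy-logic component are simplifications for illustration only.

import numpy as np

# Predict "low quality" (1) vs "acceptable" (0) modules from two hypothetical
# process/product metrics (e.g., normalized complexity and churn).
rng = np.random.default_rng(0)
X = rng.random((200, 2))
y = (0.7 * X[:, 0] + 0.5 * X[:, 1] > 0.7).astype(int)   # synthetic ground truth

w = np.zeros(2)
b = 0.0
lr = 0.1
for _ in range(50):                                     # perceptron learning rule
    for xi, yi in zip(X, y):
        pred = int(w @ xi + b > 0)
        w += lr * (yi - pred) * xi
        b += lr * (yi - pred)

accuracy = np.mean((X @ w + b > 0).astype(int) == y)
print("training accuracy:", round(float(accuracy), 3))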

Keywords: software quality, fuzzy logic, perceptron, prediction

Procedia PDF Downloads 317
15662 Optimizing Energy Efficiency: Leveraging Big Data Analytics and AWS Services for Buildings and Industries

Authors: Gaurav Kumar Sinha

Abstract:

In an era marked by increasing concerns about energy sustainability, this research endeavors to address the pressing challenge of energy consumption in buildings and industries. This study delves into the transformative potential of AWS services in optimizing energy efficiency. The research is founded on the recognition that effective management of energy consumption is imperative for both environmental conservation and economic viability. Buildings and industries account for a substantial portion of global energy use, making it crucial to develop advanced techniques for analysis and reduction. This study sets out to explore the integration of AWS services with big data analytics to provide innovative solutions for energy consumption analysis. Leveraging AWS's cloud computing capabilities, scalable infrastructure, and data analytics tools, the research aims to develop efficient methods for collecting, processing, and analyzing energy data from diverse sources. The core focus is on creating predictive models and real-time monitoring systems that enable proactive energy management. By harnessing AWS's machine learning and data analytics capabilities, the research seeks to identify patterns, anomalies, and optimization opportunities within energy consumption data. Furthermore, this study aims to propose actionable recommendations for reducing energy consumption in buildings and industries. By combining AWS services with metrics-driven insights, the research strives to facilitate the implementation of energy-efficient practices, ultimately leading to reduced carbon emissions and cost savings. The integration of AWS services not only enhances the analytical capabilities but also offers scalable solutions that can be customized for different building and industrial contexts. The research also recognizes the potential for AWS-powered solutions to promote sustainable practices and support environmental stewardship.
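
As one small, self-contained illustration of the kind of pattern and anomaly analysis described above, the sketch below flags anomalous readings in a synthetic hourly energy series using a rolling z-score. It deliberately omits the AWS ingestion, storage, and machine learning services themselves, and the data and threshold are invented for the example.

import numpy as np
import pandas as pd

# Illustrative anomaly check on hourly energy readings using a rolling z-score.
rng = np.random.default_rng(2)
idx = pd.date_range("2024-01-01", periods=24 * 14, freq="h")
kwh = 50 + 10 * np.sin(2 * np.pi * idx.hour / 24) + rng.normal(0, 2, idx.size)
kwh[200] += 40                                     # injected anomalous spike

readings = pd.Series(kwh, index=idx, name="kwh")
rolling = readings.rolling(window=48, min_periods=24)
zscore = (readings - rolling.mean()) / rolling.std()
anomalies = readings[zscore.abs() > 4]
print(anomalies)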

Keywords: energy consumption analysis, big data analytics, AWS services, energy efficiency

Procedia PDF Downloads 64
15661 Use of In-line Data Analytics and Empirical Model for Early Fault Detection

Authors: Hyun-Woo Cho

Abstract:

Automatic process monitoring schemes are designed to give early warnings of unusual process events or abnormalities as soon as possible. To this end, various techniques have been developed and utilized in many industrial processes, including multivariate statistical methods, representations in reduced spaces, and kernel-based nonlinear techniques. This work presents a nonlinear empirical monitoring scheme for batch-type production processes with incomplete process measurement data. While normal operation data are easy to obtain, fault data occur infrequently and are thus difficult to collect. In this work, noise filtering steps are added in order to enhance monitoring performance by eliminating irrelevant information from the data. The performance of the monitoring scheme is demonstrated using batch process data, and the results show that monitoring performance improves significantly in terms of the detection success rate for process faults.
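
One example of a kernel-based nonlinear monitoring statistic is sketched below: kernel PCA is fitted on fault-free data and the reconstruction error (an SPE-like index) is used to flag abnormal samples against a simple percentile control limit. The data are synthetic, and the exact empirical model, noise filtering steps, and batch structure of the paper are not reproduced.

import numpy as np
from sklearn.decomposition import KernelPCA

# Fit kernel PCA on normal data; use reconstruction error to flag abnormal samples.
rng = np.random.default_rng(3)
X_normal = rng.normal(0, 1, (300, 5))                  # fault-free training data
X_test = rng.normal(0, 1, (50, 5))
X_test[25:] += 3.0                                      # simulated fault in later samples

kpca = KernelPCA(n_components=3, kernel="rbf", gamma=0.2, fit_inverse_transform=True)
kpca.fit(X_normal)

def spe(X):
    X_hat = kpca.inverse_transform(kpca.transform(X))
    return np.sum((X - X_hat) ** 2, axis=1)

limit = np.percentile(spe(X_normal), 99)                # simple control limit
print("flagged samples:", np.where(spe(X_test) > limit)[0])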

Keywords: batch process, monitoring, measurement, kernel method

Procedia PDF Downloads 323
15660 Artificial Intelligence for Traffic Signal Control and Data Collection

Authors: Reggie Chandra

Abstract:

Traffic accidents and traffic signal optimization are correlated; however, 70-90% of the traffic signals across the USA are not synchronized, mainly because of insufficient resources to create and implement timing plans. In this work, we discuss the use of a breakthrough Artificial Intelligence (AI) technology to optimize traffic flow and collect accurate traffic data 24/7/365 using a vehicle detection system. We discuss recent advances in Artificial Intelligence technology, how AI works in vehicle, pedestrian, and bike data collection and in creating timing plans, and what the best workflow is. This paper also showcases how Artificial Intelligence makes signal timing affordable. We introduce a technology that uses Convolutional Neural Networks (CNN) and deep learning algorithms to detect, collect data, develop timing plans, and deploy them in the field. Convolutional Neural Networks are a class of deep learning networks inspired by the biological processes in the visual cortex. A neural net is modeled after the human brain and consists of millions of densely connected processing nodes; it is a form of machine learning in which the network learns to recognize vehicles through training, which is called deep learning. The well-trained algorithm overcomes most of the issues faced by other detection methods and provides nearly 100% traffic data accuracy. Through this continuous learning-based method, we can constantly update traffic patterns, generate an unlimited number of timing plans, and thus improve vehicle flow. Convolutional Neural Networks not only outperform other detection algorithms but also, in tasks such as classifying objects into fine-grained categories, outperform humans. Safety is of primary importance to traffic professionals, but they often lack the studies or data to support their decisions; currently, one-third of transportation agencies do not collect pedestrian and bike data. We discuss how the use of Artificial Intelligence for data collection can help reduce pedestrian fatalities and enhance the safety of all vulnerable road users. Moreover, it provides traffic engineers with tools that allow them to unleash their potential, instead of dealing with constant complaints, snapshots of limited handpicked data, and multiple systems requiring additional adaptation work. The methodologies used and proposed in this research include a camera model identification method based on deep Convolutional Neural Networks. The proposed application was evaluated on our data sets, acquired under a variety of daily real-world road conditions, and compared with the performance of commonly used methods that require collecting data by counting, evaluating and adapting it, running it through well-established algorithms, and then deploying it to the field. This work also explores how technologies powered by Artificial Intelligence can benefit a community and how to translate the complex and often overwhelming benefits into language accessible to elected officials, community leaders, and the public. Exploring such topics empowers citizens with insider knowledge about the potential of better traffic technology to save lives and improve communities. The synergies that Artificial Intelligence brings to traffic signal control and data collection are unsurpassed.
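
The kind of convolutional network described above can be sketched as a small PyTorch classifier operating on image crops (e.g., car, pedestrian, bike). The architecture, input size, and class count are illustrative assumptions; the deployed detection system is not shown.

import torch
import torch.nn as nn

# Skeleton of a small convolutional classifier for road-user image crops.
class SmallDetectorCNN(nn.Module):
    def __init__(self, num_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x):                       # x: (batch, 3, 64, 64)
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = SmallDetectorCNN()
dummy = torch.randn(4, 3, 64, 64)               # four fake 64x64 RGB crops
print(model(dummy).shape)                       # -> torch.Size([4, 3])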

Keywords: artificial intelligence, convolutional neural networks, data collection, signal control, traffic signal

Procedia PDF Downloads 169
15659 A Robust Optimization of Chassis Durability/Comfort Compromise Using Chebyshev Polynomial Chaos Expansion Method

Authors: Hanwei Gao, Louis Jezequel, Eric Cabrol, Bernard Vitry

Abstract:

The chassis system is composed of complex elements that take up all the loads from the tire-ground contact area and thus plays an important role in numerous specifications such as durability, comfort, and crash. During the development of new vehicle projects at Renault, durability validation is always the main focus, while comfort is addressed later in the project; therefore, design choices sometimes have to be reconsidered because of the natural incompatibility between these two specifications. Robustness is also an important concern, as it is related to manufacturing costs as well as to performance after the ageing of components such as shock absorbers. In this paper, an approach is proposed that aims to realize a multi-objective optimization between chassis endurance and comfort while taking random factors into consideration. The adaptive-sparse polynomial chaos expansion (PCE) method with Chebyshev polynomial series is applied to predict the uncertainty intervals of a system's responses according to its uncertain-but-bounded parameters. The approach can be divided into three steps. First, an initial design of experiments is realized to build the response surfaces, which statistically represent a black-box system. Second, within several iterations, an optimum set is proposed and validated, forming a Pareto front; at the same time, the robustness of each response, serving as an additional objective, is calculated from the pre-defined parameter intervals and the response surfaces obtained in the first step. Finally, an inverse strategy is carried out to determine the combination of parameter tolerances with a maximally acceptable degradation of the responses in terms of manufacturing costs. A quarter-car model has been tested as an example by applying road excitations from actual road measurements for both endurance and comfort calculations. One indicator based on Basquin's law is defined to compare the global chassis durability of different parameter settings, and another indicator related to comfort is obtained from the vertical acceleration of the sprung mass. An optimum set with the best robustness has finally been obtained, and reference tests confirm the good robustness prediction of the Chebyshev PCE method. This example demonstrates the effectiveness and reliability of the approach, in particular its ability to save computational cost for a complex system.
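
The basic idea of a Chebyshev surrogate over an uncertain-but-bounded parameter can be sketched in one dimension: fit a Chebyshev series to a black-box response sampled on the parameter interval, then read off the implied response interval. The response function and bounds below are invented, and the adaptive-sparse, multivariate PCE of the paper is considerably more general.

import numpy as np
from numpy.polynomial import Chebyshev

# Hypothetical black-box response: a made-up comfort indicator vs. damper coefficient.
def comfort_indicator(c):
    return 1.0 / (1.0 + 0.002 * (c - 1500.0) ** 2 / 1e3) + 0.05 * np.sin(c / 200.0)

c_lo, c_hi = 1000.0, 2000.0                    # uncertain-but-bounded damper coefficient
samples = np.linspace(c_lo, c_hi, 50)          # design of experiments on the interval
surrogate = Chebyshev.fit(samples, comfort_indicator(samples), deg=8)

grid = np.linspace(c_lo, c_hi, 1000)
values = surrogate(grid)
print("response interval over parameter bounds:",
      round(float(values.min()), 4), "to", round(float(values.max()), 4))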

Keywords: chassis durability, Chebyshev polynomials, multi-objective optimization, polynomial chaos expansion, ride comfort, robust design

Procedia PDF Downloads 152
15658 Brazilian Constitution and the Fundamental Right to Sanitation

Authors: Michely Vargas Delpupo, José Geraldo Romanello Bueno

Abstract:

The right to basic sanitation was elevated to the category of fundamental rights by the Brazilian Constitution of 1988 in order to protect the ecologically balanced environment, ensure the social rights to health and adequate housing, and guarantee the dignity of the human person as a principle of the Brazilian Democratic State. Because of its essentiality to the Brazilian population, this article seeks to understand why universal access to basic sanitation is such a difficult goal to achieve in Brazil. To this end, the research uses the deductive and analytical method. Given its bibliographic nature, the research techniques were centered on specialized books on the subject, journals, theses and dissertations, legislation, relevant case law, and social indicators related to the theme. The relevance of the topic stems, among other things, from the fact that sanitation services are essential for a dignified life, i.e., everyone is entitled to have the necessary conditions of existence satisfied. However, the effectiveness of this right is undermined in society, since Brazil has a huge deficit in sanitation services, thus denying a dignified life to a large part of the population. It can be seen that the provision of water and sewage services in Brazil is still characterized by a large imbalance, since municipalities with lower population indexes show greater deficiencies in sanitation services. The truth is that the precariousness of water and sewage services in Brazil is still heavily concentrated in the North and Northeast regions, limiting the effective implementation of Law 11.445/2007 in the country. Therefore, there is an urgent need for positive action by the State in the provision of sanitation services in order to prevent and control disease, improve quality of life and the productivity of individuals, and prevent the contamination of water resources. More than a social and economic necessity, there is an obligation of the government to implement such services. In this sense, given the current scenario, achieving universal access to basic sanitation faces many hurdles. These lie mainly in the field of properly formulated and implemented public policies, which require excellent institutional organization, service management, strategic planning, and social control in order to provide answers to complex challenges.

Keywords: fundamental rights, health, sanitation, universal access

Procedia PDF Downloads 411
15657 Influence of the Drying Method on Alumina Parts Obtained by Rapid Prototyping and Uniaxial Dry Pressing

Authors: N. O. Muniz, F. A. Vechietti, L. Treccani, K. Rezwan, Luis Alberto dos Santos

Abstract:

Developing new technologies for the manufacture of biomaterials is a major challenge for researchers in the tissue engineering area. Many in vitro and in vivo studies have revealed the significance of the porous structure of biomaterials in the promotion of bone ingrowth. The use of rapid prototyping (RP) in the manufacture of ceramics for the biomedical area has increased in recent years, yet few studies have addressed obtaining alumina pieces. The aim of this work was to study alumina pieces obtained by 3D printing and uniaxial dry pressing (DP) in order to evaluate the porosity achieved by these two different techniques, as well as the influence of the powder drying process. The raw alumina powders were dried by freeze-drying and in an oven. Apparent porosity, apparent density, and shrinkage after thermal treatment were evaluated. The porosity values obtained by DP, regardless of the powder drying method, were much lower than those obtained by RP, as expected. For the prototyped samples, the powder drying method significantly influenced porosity, which reached 48% for oven drying versus 65% for freeze-drying. Therefore, 3D printing combined with different powder drying methods allows better control over porosity.

Keywords: rapid prototyping, freeze-drying, porosity, alumina

Procedia PDF Downloads 472
15656 Genome Editing in Sorghum: Advancements and Future Possibilities: A Review

Authors: Micheale Yifter Weldemichael, Hailay Mehari Gebremedhn, Teklehaimanot Hailesslasie

Abstract:

The advancement of target-specific genome editing tools, including clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated protein 9 (Cas9), meganucleases, base editing (BE), prime editing (PE), transcription activator-like effector nucleases (TALENs), and zinc-finger nucleases (ZFNs), has paved the way for a modern era of gene editing. CRISPR/Cas9, as a versatile, simple, cost-effective, and robust system for genome editing, has dominated the genome manipulation field over the last few years. The application of CRISPR/Cas9 to sorghum improvement is particularly vital in the context of ecological, environmental, and agricultural challenges, as well as global climate change. In this context, gene editing using CRISPR/Cas9 can improve nutritional value, yield, resistance to pests and diseases, and tolerance to different abiotic stresses. Moreover, CRISPR/Cas9 can potentially perform complex editing to reshape already available elite varieties and generate new genetic variation. Existing research is therefore targeted at further improving the effectiveness of CRISPR/Cas9 genome editing techniques for fruitfully editing endogenous sorghum genes. These findings suggest that genome editing is a feasible and successful venture in sorghum. Newer improvements and developments of CRISPR/Cas9 techniques have further enabled researchers to modify additional genes in sorghum with improved efficiency. The fruitful application and development of CRISPR techniques for genome editing in sorghum will help not only in gene discovery, the creation of new and improved traits, the regulation of gene expression, and sorghum functional genomics, but also in producing site-specific integration events.

Keywords: CRISPR/Cas9, genome editing, quality, sorghum, stress, yield

Procedia PDF Downloads 60
15655 Collaborative Management Approach for Logistics Flow Management of Cuban Medicine Supply Chain

Authors: Ana Julia Acevedo Urquiaga, Jose A. Acevedo Suarez, Ana Julia Urquiaga Rodriguez, Neyfe Sablon Cossio

Abstract:

Despite the progress made in the logistics and supply chain fields, the development of business models that use information efficiently to facilitate integrated management of logistics flows between partners is unavoidable. Collaborative management is an important tool for materializing cooperation between companies, as a way to achieve supply chain efficiency and effectiveness. The first phase of this research was a comprehensive analysis of collaborative planning in Cuban companies. It is evident that they have difficulties in supply chain planning, where production, supply, and replenishment planning are independent tasks, as are logistics and distribution operations. Large inventories generate serious financial and organizational problems for entities, demanding increasing levels of working capital that cannot be financed. Problems were also found in the efficient application of Information and Communication Technology to business management. The general objective of this work is to develop a methodology that allows the coordinated deployment of a planning and control system for the medicine logistics system in Cuba. To achieve this objective, several supply chain coordination mechanisms, mathematical programming models, and other management techniques were analyzed against the requirements of collaborative logistics management in Cuba. One of the findings is the practical and theoretical inadequacy of the studied models for solving the current situation of Cuban logistics system management. To contribute to the tactical-operative management of logistics, the Collaborative Logistics Flow Management Model (CLFMM) is proposed as a tool for balancing cycles, capacities, and inventories, always meeting final customers' demands in correspondence with the service level they expect. The CLFMM has at its center the supply chain planning and control system as a unique information system, which acts on the process network. The development of the model is based on the empirical methods of analysis-synthesis and on case studies. Another finding is the demonstration that using a single information system to support supply chain logistics management allows the deadlines and quantities required in each process to be determined; this ensures that medications are always available to patients and that there are no failures that put the population's health at risk. The simulation of planning and control with the CLFMM for medicines such as dipyrone and chlordiazepoxide, during 5 months of 2017, made it possible to take measures to adjust the logistics flow, eliminate delayed processes, and avoid shortages of the medicines studied. As a result, logistics cycle efficiency can be increased to 91% and inventory rotation would increase, resulting in a release of financial resources.

Keywords: collaborative management, medicine logistic system, supply chain planning, tactical-operative planning

Procedia PDF Downloads 176
15654 Interpolation Issue in PVNPG-14M Application for Technical Control of Artillery Fire

Authors: Martin Blaha, Ladislav Potužák, Daniel Holesz

Abstract:

This paper focuses on the PVNPG-14M application supporting the technical control of artillery units, especially on the interpolation issue. Artillery units of the Army of the Czech Republic, reflecting the current global security environment, can be deployed outside the Czech Republic. The paper presents the principles, evolution, and calculations involved in the process of complete fire preparation. It presents expert use of the application within the current artillery communication and information system and suggests a prospective future system. The paper also presents problems in the process of complete fire preparation, especially problems with permanent information (firing tables) and calculated values, as well as shortcomings of the current artillery communication and information system, and suggests requirements for the future system.
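
The interpolation issue itself can be illustrated with a toy firing-table fragment and plain linear interpolation of elevation between tabulated ranges, as sketched below. The table values are invented, and the actual PVNPG-14M tables and interpolation method are not reproduced.

import numpy as np

# Hypothetical firing-table fragment: tabulated range [m] vs. elevation [mil].
table_range = np.array([4000.0, 5000.0, 6000.0, 7000.0])
table_elev = np.array([180.0, 232.0, 291.0, 358.0])

target_range = 5600.0
elevation = np.interp(target_range, table_range, table_elev)   # linear interpolation
print("interpolated elevation [mil]:", round(float(elevation), 1))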

Keywords: fire for effect, application, fire control, interpolation method, software development

Procedia PDF Downloads 322
15653 Finite Element Modeling Techniques of Concrete in Steel and Concrete Composite Members

Authors: J. Bartus, J. Odrobinak

Abstract:

The paper presents a nonlinear 3D finite element analysis model of composite steel and concrete beams with web openings using the Finite Element Method (FEM). The core of the study is the introduction of basic modeling techniques, comprising the description of material behavior, the selection of appropriate elements, and recommendations for overcoming convergence problems. Results from various finite element models are compared in the study. The main objective is to observe the concrete failure mechanism and its influence on the structural performance of the numerical beam models at particular load stages. The bearing capacity of the beams and the corresponding deformations, stresses, strains, and fracture patterns were determined. The results show how load-bearing elements consisting of concrete parts can be analyzed using FEM software, with various options for creating the most suitable numerical model. The paper demonstrates the versatility of Ansys software for structural simulations.

Keywords: Ansys, concrete, modeling, steel

Procedia PDF Downloads 121
15652 Nanoparticles Modification by Grafting Strategies for the Development of Hybrid Nanocomposites

Authors: Irati Barandiaran, Xabier Velasco-Iza, Galder Kortaberria

Abstract:

Hybrid inorganic/organic nanostructured materials based on block copolymers are of considerable interest in the field of nanotechnology, as these nanocomposites combine the properties of the polymer matrix with the unique properties of the added nanoparticles. The use of block copolymers as templates offers the opportunity to control the size and distribution of inorganic nanoparticles. This research is focused on the surface modification of inorganic nanoparticles to achieve a good interface between the nanoparticles and the polymer matrix, which hinders nanoparticle aggregation. The aim of this work is to obtain a good and selective dispersion of Fe3O4 magnetic nanoparticles in different types of block copolymers, such as poly(styrene-b-methyl methacrylate) (PS-b-PMMA), poly(styrene-b-ε-caprolactone) (PS-b-PCL), poly(isoprene-b-methyl methacrylate) (PI-b-PMMA), or poly(styrene-b-butadiene-b-methyl methacrylate) (SBM), by using different grafting strategies. Fe3O4 magnetic nanoparticles have been surface-modified with polymer or block copolymer brushes following different grafting methods (grafting to, grafting from, and grafting through) to achieve a selective location of the nanoparticles in the desired domains of the block copolymers. The morphology of the fabricated hybrid nanocomposites was studied by means of atomic force microscopy (AFM), and different annealing methods were used with the aim of reaching well-ordered nanostructured composites. Additionally, the nanoparticle amount was varied in order to investigate the effect of nanoparticle content on the morphology of the block copolymer. Different characterization methods are nowadays used to investigate the magnetic properties of nanometer-scale electronic devices, and two of them were used to characterize the synthesized nanocomposites. First, magnetic force microscopy (MFM) was used to investigate the magnetic properties qualitatively, taking into account that this technique allows magnetic domains on the sample surface to be distinguished. Second, magnetic characterization was carried out with a vibrating sample magnetometer and a superconducting quantum interference device. This characterization demonstrated that the magnetic properties of the nanoparticles are transferred to the nanocomposites, which exhibit superparamagnetic behavior similar to that of the maghemite nanoparticles at room temperature. The advanced nanostructured materials obtained could find applications in the field of dye-sensitized solar cells and electronic nanodevices.

Keywords: atomic force microscopy, block copolymers, grafting techniques, iron oxide nanoparticles

Procedia PDF Downloads 262
15651 Simulation and Analysis of MEMS-Based Flexible Capacitive Pressure Sensors with COMSOL

Authors: Ding Liangxiao

Abstract:

The technological advancements in Micro-Electro-Mechanical Systems (MEMS) have significantly contributed to the development of new, flexible capacitive pressure sensors, which are pivotal in transforming wearable and medical device technologies. This study employs the simulation tools available in COMSOL Multiphysics® to develop and analyze a MEMS-based sensor with a tri-layered design. The sensor comprises top and bottom electrodes made from gold (Au), noted for its excellent conductivity, a middle dielectric layer made from a composite of silver nanowires (AgNWs) embedded in thermoplastic polyurethane (TPU), and a flexible, durable substrate of polydimethylsiloxane (PDMS). The research was directed towards understanding how changes in the physical characteristics of the AgNWs/TPU dielectric layer, specifically its thickness and surface area, impact the sensor's operational efficacy. We assessed several key electrical properties, namely capacitance, electric potential, and membrane displacement, under varied pressure conditions. These investigations are crucial for enhancing the sensor's sensitivity and ensuring its adaptability across diverse applications, including health monitoring systems and dynamic user interface technologies. To ensure the reliability of our simulations, we applied effective medium theory to calculate the dielectric constant of the AgNWs/TPU composite accurately. This approach is essential for predicting how the composite material will perform under different environmental and operational stresses, thus facilitating the optimization of the sensor design for enhanced performance and longevity. Moreover, we explored the potential benefits of innovative three-dimensional structures for the dielectric layer compared to traditional flat designs, hypothesizing that 3D configurations might improve the stress distribution and optimize the electrical field interactions within the sensor, thereby boosting its sensitivity and accuracy. Our simulation protocol includes comprehensive performance testing under simulated environmental conditions, such as temperature fluctuations and mechanical pressures, which mirror actual operating conditions. These tests are crucial for assessing the sensor's robustness and its ability to function reliably over extended periods, ensuring high reliability and accuracy in complex real-world environments. Although a full dynamic simulation analysis of the three-dimensional structures has not yet been conducted, preliminary explorations through three-dimensional modeling have indicated the potential for mechanical and electrical performance improvements over traditional planar designs. These initial observations emphasize the potential advantages and importance of incorporating advanced three-dimensional modeling techniques in the development of MEMS sensors, offering new directions for the design and functional optimization of future sensors. Overall, this study not only highlights the capabilities of COMSOL Multiphysics® for modeling sophisticated electronic devices but also underscores the potential of innovative MEMS technology in advancing more effective, reliable, and adaptable sensor solutions for a broad spectrum of technological applications.
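
To illustrate how an effective-medium estimate feeds a capacitance calculation, the sketch below uses the Maxwell-Garnett mixing rule (valid for dilute spherical inclusions) together with a parallel-plate approximation. Both the mixing rule and all parameter values are assumptions for illustration; they are not necessarily the effective medium theory variant or the dimensions used in this study, and AgNWs are in reality far from spherical.

import numpy as np

EPS0 = 8.854e-12                                # vacuum permittivity [F/m]

def maxwell_garnett(eps_matrix, eps_inclusion, fill):
    """One common effective-medium rule (dilute spherical inclusions),
    shown only as an example of estimating a composite permittivity."""
    num = eps_inclusion + 2 * eps_matrix + 2 * fill * (eps_inclusion - eps_matrix)
    den = eps_inclusion + 2 * eps_matrix - fill * (eps_inclusion - eps_matrix)
    return eps_matrix * num / den

# Hypothetical values: TPU-like matrix, high-permittivity AgNW-like inclusions.
eps_eff = maxwell_garnett(eps_matrix=4.0, eps_inclusion=1000.0, fill=0.05)

# Parallel-plate estimate of the sensor capacitance before loading (assumed dimensions).
area = (2e-3) ** 2                              # 2 mm x 2 mm electrode
gap = 50e-6                                     # 50 um dielectric thickness
capacitance = EPS0 * eps_eff * area / gap
print("effective permittivity:", round(float(eps_eff), 2))
print("capacitance [pF]:", round(capacitance * 1e12, 3))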

Keywords: MEMS, flexible sensors, COMSOL Multiphysics, AgNWs/TPU, PDMS, 3D modeling, sensor durability

Procedia PDF Downloads 45
15650 Effect of Scaling and Root Planing on Improvement of Glycemic Control in Periodontitis Patients with Type-2 Diabetes Mellitus

Authors: Shivalal Sharma, Sanjib K. Sharma, Madhab Lamsal

Abstract:

Background: The aim of this study was to evaluate the clinical and laboratory changes three months after full-mouth scaling and root planing (SRP) in periodontitis patients with type 2 diabetes mellitus (DM). Methods: Forty-seven type 2 DM subjects with moderate to severe periodontitis were randomly divided into two groups. The treatment group (TG), 25 subjects, received full-mouth scaling and root planing; the control group (CG), 22 subjects, received no treatment. At baseline and at the end of three months, glycated hemoglobin (HbA1c) values, fasting glucose, and clinical parameters such as plaque index (PI), gingival index (GI), probing pocket depth (PPD), and clinical attachment level (CAL) were recorded in all patients. Following SRP, the patients were enrolled in a maintenance program at monthly intervals for 3 months. Results: A statistically significant effect could be demonstrated for PI, GI, PPD, and CAL in the treatment group. HbA1c levels in the treatment group decreased significantly, whereas the control group showed a slight but insignificant increase in these parameters. Conclusions: The results of this study show that non-surgical periodontal treatment (SRP) is associated with improved glycemic control in type 2 DM patients and could be undertaken along with the standard measures of diabetic patient care.

Keywords: periodontitis, type 2 diabetes mellitus, non-surgical periodontal therapy, SRP

Procedia PDF Downloads 301
15649 Adsorption Kinetics and Equilibria at an Air-Liquid Interface of Biosurfactant and Synthetic Surfactant

Authors: Sagheer A. Onaizi

Abstract:

The adsorption of an anionic biosurfactant (surfactin) and an anionic synthetic surfactant (sodium dodecylbenzenesulphonate, abbreviated as SDOBS) from phosphate buffer containing high concentrations of co- and counter-ions to the air-buffer interface has been investigated. The self-assembly of the two surfactants at the interface has been monitored through dynamic surface tension measurements. The equilibrium surface pressure-surfactant concentration data in the premicellar region were regressed using the Gibbs adsorption equation. The predicted surface saturations for SDOBS and surfactin are and, respectively. The occupied area per SDOBS molecule at interface saturation is while that occupied by a surfactin molecule is. The surface saturations reported in this work for both surfactants are in very good agreement with those obtained using expensive techniques such as neutron reflectometry, suggesting that surface tension measurements coupled with appropriate theoretical analysis can provide useful information comparable to that obtained using highly sophisticated techniques.
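
The Gibbs-adsorption regression described above can be sketched as follows: fit the premicellar surface tension against ln(concentration), convert the slope to a surface excess, and then to an area per molecule. The data are invented, and the prefactor n = 1 is assumed here on account of the swamping electrolyte in the buffer; for salt-free ionic surfactant solutions n = 2 would be appropriate.

import numpy as np

R, T, N_A = 8.314, 298.15, 6.022e23

# Hypothetical premicellar surface-tension data (illustration only):
conc = np.array([1e-5, 2e-5, 5e-5, 1e-4, 2e-4])          # mol/L
gamma = np.array([68.0, 65.2, 61.5, 58.7, 55.9]) * 1e-3  # N/m

# Gibbs adsorption isotherm: Gamma = -(1/(n R T)) * dgamma/d(ln c).
# n = 1 assumed because of the excess electrolyte (assumption, see lead-in).
slope = np.polyfit(np.log(conc), gamma, 1)[0]            # dgamma/d(ln c)
surface_excess = -slope / (1 * R * T)                    # mol/m^2
area_per_molecule = 1.0 / (surface_excess * N_A)         # m^2 per molecule

print("surface excess [mol/m^2]:", f"{surface_excess:.2e}")
print("area per molecule [nm^2]:", round(area_per_molecule * 1e18, 3))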

Keywords: adsorption, air-liquid interface, biosurfactant, surface tension

Procedia PDF Downloads 713