Search results for: solution validation
4446 The Human Rights Code: Fundamental Rights as the Basis of Human-Robot Coexistence
Authors: Gergely G. Karacsony
Abstract:
Fundamental rights are the result of thousands of years of progress in legislation, adjudication and legal practice. They serve as the framework of peaceful cohabitation of people, protecting the individual from any abuse by the government or violation by other people. Artificial intelligence, however, is a development of the very recent past and one of the most important prospects for the future. Artificial intelligence is now capable of communicating and performing actions the same way as humans; such acts are sometimes impossible to distinguish from actions performed by flesh-and-blood people. In a world where human-robot interactions are more and more common, a new framework of peaceful cohabitation must be found. Artificial intelligence, being able to take part in almost any kind of interaction where personal presence is not necessary without being recognized as a non-human actor, is now able to break the law, violate people’s rights, and disturb social peace in many other ways. Therefore, a code of peaceful coexistence must be found or created. We should consider whether human rights can serve as the code of ethical and rightful conduct in the new era of artificial intelligence and human coexistence. In this paper, we will examine the applicability of fundamental rights to human-robot interactions as well as to the actions of artificial intelligence performed without any human interaction whatsoever. Robot ethics had been a topic of discussion and debate in philosophy, ethics, computing, legal sciences and science fiction writing long before the first functional artificial intelligence was introduced. Legal science and legislation have approached artificial intelligence from different angles, regulating different areas (e.g. data protection, telecommunications, copyright issues), but they are only chipping away at the mountain of legal issues concerning robotics. For a widely acceptable and permanent solution, a more general set of rules would be preferable to the detailed regulation of specific issues. We argue that human rights as recognized worldwide can be adapted to serve as a guideline and a common basis of coexistence of robots and humans. This solution has many virtues: people don’t need to adjust to a completely unknown set of standards, the system has proved itself able to withstand the trials of time, legislation is easier, and the actions of non-human entities are more easily adjudicated within their own framework. In this paper we will examine the system of fundamental rights (as defined in the most widely accepted source, the 1966 UN Convention on Human Rights) and try to adapt each individual right to the actions of artificial intelligence actors; in each case we will examine the possible effects of such an approach on the legal system and on society, and finally we also examine its effect on the IT industry.
Keywords: human rights, robot ethics, artificial intelligence and law, human-robot interaction
Procedia PDF Downloads 244
4445 Artificial Neural Network Based Approach in Prediction of Potential Water Pollution Across Different Land-Use Patterns
Authors: M.Rüştü Karaman, İsmail İşeri, Kadir Saltalı, A.Reşit Brohi, Ayhan Horuz, Mümin Dizman
Abstract:
Considerable attention has recently been given to the environmental hazards caused by agricultural chemicals such as excess fertilizers. In this study, a neural network approach was investigated for the prediction of potential nitrate pollution across different land-use patterns, using a feedforward multilayered computer model of an artificial neural network (ANN) with proper training. Concentrations of some anions, especially nitrate (NO3-), and cations were measured periodically in drainage waters collected from drain pipes placed in an irrigated tomato field, an unirrigated wheat field, fallow land and pasture land. The soil samples were collected from the irrigated tomato field and unirrigated wheat field on a grid system with 20 m x 20 m intervals. Site-specific nitrate concentrations in the soil samples were measured for ANN-based simulation of nitrate leaching potential from the land profiles. In the application of the ANN model, a multilayered feedforward network was evaluated, and data sets for training, validation and testing, containing the measured soil nitrate values, were prepared based on spatial variability. Based on the testing values, an optimal structure of 2-15-1 was obtained for the unirrigated field (R² = 0.96, P < 0.01), while an optimal structure of 2-10-1 was obtained for the irrigated field (R² = 0.96, P < 0.01). The results showed that the ANN model can be successfully used in predicting potential nitrate leaching levels based on different land-use patterns. However, for the most suitable results, the model should be calibrated by training with different network structures depending on site-specific soil parameters and varied agricultural management practices.
Keywords: artificial intelligence, ANN, drainage water, nitrate pollution
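As a point of reference for the 2-15-1 topology reported above, the following is a minimal sketch of such a feedforward regressor in scikit-learn; the two input features (grid coordinates) and the synthetic nitrate values are assumptions for illustration, not the study's data.

```python
# Minimal sketch of a 2-15-1 feedforward ANN regressor, using scikit-learn;
# the two inputs (sampling grid coordinates) and the synthetic soil nitrate
# values are assumptions for illustration.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(0, 200, size=(100, 2))             # 2 inputs: grid x/y coordinates (m)
y = 5 + 0.02 * X[:, 0] + rng.normal(0, 0.5, 100)   # synthetic soil nitrate concentration

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(15,),     # 2-15-1: one hidden layer of 15 neurons
                     activation="tanh", solver="lbfgs", max_iter=5000, random_state=0)
model.fit(X_train, y_train)
print("R^2 on test set:", model.score(X_test, y_test))
```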
Procedia PDF Downloads 310
4444 Complaint Management Mechanism: A Workplace Solution in Development Sector of Bangladesh
Authors: Nusrat Zabeen Islam
Abstract:
Partnerships between local non-government organizations (NGOs) and international development organizations have become an important feature of the development sector in Bangladesh. It is an important challenge for international development organizations to work with local NGOs with proper HR practice. Local NGOs often lack a quality working environment, and this affects employees’ work experiences and overall performance at the individual, partnership and organizational levels. Many local development organizations, due to their size and scope, do not have a human resource (HR) unit. Inadequate human resource policies, skills and leadership, and a lack of effective strategy are now a common scenario in the non-government organization sector of Bangladesh. As a result, corruption, nepotism, fraud, the risk of political contribution in the office/workspace, sexual/gender-based abuse and insecurity take place in the development sector workplace. A Complaint Management Mechanism (CMM) in human resource management could be one way to improve human resource competence in these organizations. The responsibility of the Complaint Management Unit (CMU) of an international development organization is to keep the workplace free of maltreatment and discrimination. Information on the impact of the CMM was collected through a case study of an international organization and some of its partner national organizations in Bangladesh that are engaged in different projects/programs. In this mechanism, international development organizations collect complaints from beneficiaries/staff through the complaint management unit, investigate by segregating the type and nature of the complaint, and find a solution to improve the situation within a very short period. A complaint management committee is formed jointly with HR and management personnel. A concerned focal point collects complaints and shares them with the CM unit. By conducting investigations, reviewing findings, replying back to the CM unit and implementing resolutions through this mechanism, a successful bridge of communication and feedback can be established between beneficiaries, staff and upper management. The overall results of applying the complaint management mechanism indicate that CMM can significantly increase the accountability and transparency of the workplace and workforce in development organizations. Evaluations based on outcomes, measuring indicators such as productivity, satisfaction, retention, gender equity and proper judgment, will guide organizations in building a healthy workforce, and will also clearly articulate the return on investment and justify any need for further funding.
Keywords: human resource management in NGOs, challenges in human resource, workplace environment, complaint management mechanism
Procedia PDF Downloads 322
4443 Validation of the Recovery of House Dust Mites from Fabrics by Means of Vacuum Sampling
Authors: A. Aljohani, D. Burke, D. Clarke, M. Gormally, M. Byrne, G. Fleming
Abstract:
Introduction: House dust mites (HDMs) are a source of allergen particles embedded in textiles and furnishings. Vacuum sampling is commonly used to recover and determine the abundance of HDMs, but the efficiency of this method is less than standardized. Here, the efficiency of recovery of HDMs from home-associated textiles was evaluated using vacuum sampling protocols. Methods/Approach: Live mites (LMs) or dead mites (DMs) of the house dust mite Dermatophagoides pteronyssinus (FERA, UK) were separately seeded onto the surfaces of smooth cotton, denim and fleece (25 mites per 10 x 10 cm² square) and left for 10 minutes before vacuuming. Fabrics were vacuumed (SKC Flite 2 pump) at a flow rate of 14 L/min for 60, 90 or 120 seconds, and the number of mites retained by the filter unit (0.4 μm x 37 mm) was determined. Vacuuming was carried out in a linear direction (Protocol 1) or in a multidirectional pattern (Protocol 2). Additional fabrics with LMs were also frozen and then thawed, thereby euthanizing the live mites (now termed EMs). Results/Findings: While recovery of mites was significantly greater (p=0.000; 76% greater) from fabrics seeded with DMs than with LMs, irrespective of vacuuming protocol or fabric type, the efficiency of recovery of DMs (72%-76%) did not vary significantly between fabrics. For fabrics containing EMs, recovery was greatest for smooth cotton and denim (65-73% recovered) and least for fleece (15% recovered). There was no significant difference (p=0.99) between the recovery of mites across all three mite categories from smooth cotton and denim, but significantly fewer (p=0.000) mites were recovered from fleece. Scanning electron microscopy images of HDM-seeded fabrics showed that live mites burrowed deeply into the fleece weave, which reduced their efficiency of recovery by vacuuming. Research Implications: The results presented here have implications for the recovery of HDMs by vacuuming and for the choice of fabric to ameliorate HDM-dust sensitization.
Keywords: allergy, asthma, dead, fabric, fleece, live mites, sampling
Procedia PDF Downloads 139
4442 Long Short-Term Memory Stream Cruise Control Method for Automated Drift Detection and Adaptation
Authors: Mohammad Abu-Shaira, Weishi Shi
Abstract:
Adaptive learning, a commonly employed solution to drift, involves updating predictive models online during their operation to react to concept drifts, thereby serving as a critical component and natural extension of online learning systems that learn incrementally from each example. This paper introduces LSTM-SCCM (Long Short-Term Memory Stream Cruise Control Method), a drift adaptation-as-a-service framework for online learning. LSTM-SCCM automates drift adaptation through prompt detection, drift magnitude quantification, dynamic hyperparameter tuning, short-term optimization and model recalibration for immediate adjustments, and, when necessary, long-term model recalibration to ensure deeper enhancements in model performance. LSTM-SCCM is incorporated into a suite of cutting-edge online regression models, assessing their performance across various types of concept drift using diverse datasets with varying characteristics. The findings demonstrate that LSTM-SCCM represents a notable advancement in both model performance and efficacy in handling concept drift occurrences. LSTM-SCCM stands out as the sole framework adept at effectively tackling concept drifts within regression scenarios. Its proactive approach to drift adaptation distinguishes it from conventional reactive methods, which typically rely on retraining after significant degradation of model performance caused by drifts. Additionally, LSTM-SCCM employs an in-memory approach combined with the Self-Adjusting Memory (SAM) architecture to enhance real-time processing and adaptability. The framework incorporates variable thresholding techniques and does not assume any particular data distribution, making it an ideal choice for managing high-dimensional datasets and efficiently handling large-scale data. Our experiments, which include abrupt, incremental, and gradual drifts across both low- and high-dimensional datasets with varying noise levels, applied to four state-of-the-art online regression models, demonstrate that LSTM-SCCM is versatile and effective, rendering it a valuable solution for online regression models to address concept drift.
Keywords: automated drift detection and adaptation, concept drift, hyperparameters optimization, online and adaptive learning, regression
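LSTM-SCCM's internals are not given in the abstract; as a minimal sketch of the variable-thresholding idea it mentions, the following flags drift when a new prediction error exceeds an adaptive bound derived from recent errors. The window size and 3-sigma rule are illustrative assumptions, not the framework's actual parameters.

```python
# Minimal sketch of variable-threshold drift detection on a stream of
# prediction errors; the window size and the 3-sigma rule are assumptions,
# not LSTM-SCCM's actual parameters.
from collections import deque
import statistics

class DriftDetector:
    def __init__(self, window=100, k=3.0):
        self.errors = deque(maxlen=window)  # rolling window of recent errors
        self.k = k                          # threshold multiplier

    def update(self, error: float) -> bool:
        """Return True if `error` signals drift relative to recent history."""
        if len(self.errors) >= 30:          # need enough history for stable statistics
            mu = statistics.fmean(self.errors)
            sigma = statistics.stdev(self.errors)
            drifted = error > mu + self.k * sigma  # adaptive (variable) threshold
        else:
            drifted = False
        self.errors.append(error)
        return drifted
```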
Procedia PDF Downloads 13
4441 Web Data Scraping Technology Using Term Frequency Inverse Document Frequency to Enhance the Big Data Quality on Sentiment Analysis
Authors: Sangita Pokhrel, Nalinda Somasiri, Rebecca Jeyavadhanam, Swathi Ganesan
Abstract:
Tourism is a booming industry with huge future potential for global wealth and employment. Countless data are generated over social media sites every day, creating numerous opportunities to bring more insights to decision-makers. The integration of big data technology into the tourism industry will allow companies to learn where their customers have been and what they like. This information can then be used by businesses, such as those in charge of managing visitor centers or hotels, and tourists can get a clear idea of places before visiting. From a technical perspective, natural language is processed by analysing the sentiment features of online reviews from tourists, and we then supply an enhanced long short-term memory (LSTM) framework for sentiment feature extraction of travel reviews. We constructed a web review database using a crawler and web scraping techniques for experimental validation to evaluate the effectiveness of our methodology. The text form of the sentences was first classified with the VADER and RoBERTa models to get the polarity of the reviews. In this paper, we studied feature extraction methods such as Count Vectorization and TF-IDF Vectorization, and implemented a Convolutional Neural Network (CNN) classifier for sentiment analysis to decide whether a tourist’s attitude towards a destination is positive, negative, or simply neutral, based on the review text posted online. The results demonstrated that the CNN algorithm, after pre-processing and cleaning the dataset, achieved an accuracy of 96.12% for positive and negative sentiment analysis.
Keywords: count vectorization, convolutional neural network, crawler, data technology, long short-term memory, web scraping, sentiment analysis
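As a minimal sketch of the TF-IDF feature-extraction step named above, using scikit-learn; the toy reviews are assumptions for illustration, not the study's scraped data.

```python
# Minimal sketch of TF-IDF vectorization of review text; the toy reviews
# are assumptions for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer

reviews = [
    "Beautiful beaches and friendly staff, would visit again",
    "Hotel was dirty and the service was terrible",
]
vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(reviews)          # sparse (n_reviews, n_terms) matrix
print(vectorizer.get_feature_names_out())      # learned vocabulary
print(X.toarray().round(2))                    # TF-IDF weights per review
```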
Procedia PDF Downloads 88
4440 Classifying Affective States in Virtual Reality Environments Using Physiological Signals
Authors: Apostolos Kalatzis, Ashish Teotia, Vishnunarayan Girishan Prabhu, Laura Stanley
Abstract:
Emotions are functional behaviors influenced by thoughts, stimuli, and other factors that induce neurophysiological changes in the human body. Understanding and classifying emotions is challenging, as individuals have varying perceptions of their environments. Therefore, it is crucial that there are publicly available databases and virtual reality (VR) based environments that have been scientifically validated for assessing emotional classification. This study utilized two commercially available VR applications (Guided Meditation VR™ and Richie’s Plank Experience™) to induce an acute stress state and a calm state among participants. Subjective and objective measures were collected to create a validated multimodal dataset and classification scheme for affective state classification. Participants’ subjective measures included the Self-Assessment Manikin, emotion cards, and a 9-point Visual Analogue Scale for perceived stress, collected using a Virtual Reality Assessment Tool developed by our team. Participants’ objective measures included electrocardiogram and respiration data collected from 25 participants (15 M, 10 F, mean = 22.28 ± 4.92). The features extracted from these data included heart rate variability components and respiration rate, both of which were used to train two machine learning models. Subjective responses validated the efficacy of the VR applications in eliciting the two desired affective states; for classifying the affective states, a logistic regression (LR) and a support vector machine (SVM) with a linear kernel were developed. The LR outperformed the SVM and achieved 93.8% accuracy, 96.2% precision and 93.8% recall under leave-one-subject-out cross-validation. The VR assessment tool and the data collected in this study are publicly available to other researchers.
Keywords: affective computing, biosignals, machine learning, stress database
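As a minimal sketch of the leave-one-subject-out evaluation described above, using scikit-learn; the synthetic HRV/respiration features and labels are assumptions, not the study's dataset.

```python
# Minimal sketch of leave-one-subject-out cross-validation of a logistic
# regression classifier; the synthetic features (HRV components plus
# respiration rate) and labels are assumptions for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
n = 250                                      # e.g., 10 windows x 25 subjects
X = rng.normal(size=(n, 3))                  # HRV components + respiration rate
y = rng.integers(0, 2, size=n)               # 0 = calm, 1 = acute stress
subjects = np.repeat(np.arange(25), 10)      # subject ID per sample (the "group")

scores = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                         groups=subjects, cv=LeaveOneGroupOut())
print("LOSO accuracy per held-out subject:", scores.round(2))
```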
Procedia PDF Downloads 142
4439 Strategic Shear Wall Arrangement in Buildings under Seismic Loads
Authors: Akram Khelaifia, Salah Guettala, Nesreddine Djafar Henni, Rachid Chebili
Abstract:
Reinforced concrete shear walls are pivotal in protecting buildings from seismic forces by providing strength and stiffness. This study highlights the importance of strategically placing shear walls and optimizing the shear wall-to-floor area ratio in building design. Nonlinear analyses were conducted on an eight-story building situated in a high seismic zone, exploring various scenarios of shear wall positioning and ratios to floor area. Employing the performance-based seismic design (PBSD) approach, the study aims to meet acceptance criteria such as inter-story drift ratio and damage levels. The results indicate that concentrating shear walls in the middle of the structure during the design phase yields superior performance compared to peripheral distributions. Utilizing shear walls that fully infill the frame and adopting compound shapes (e.g., Box, U, and L) enhances reliability in terms of inter-story drift. Conversely, the absence of complete shear walls within the frame leads to decreased stiffness and degradation of shorter beams. Increasing the shear wall-to-floor area ratio in building design enhances structural rigidity and reliability regarding inter-story drift, facilitating the attainment of desired performance levels. The study suggests that a shear wall ratio of 1.0% is necessary to meet validation criteria for inter-story drift and structural damage, as exceeding this percentage leads to excessive performance levels, proving uneconomical as structural elements operate near the elastic range.
Keywords: nonlinear analyses, pushover analysis, shear wall, plastic hinge, performance level
Procedia PDF Downloads 50
4438 Evaluation of a Method for the Virtual Design of a Software-based Approach for Electronic Fuse Protection in Automotive Applications
Authors: Dominic Huschke, Rudolf Keil
Abstract:
New driving functionalities like highly automated driving have a major impact on the electrics/electronics architecture of future vehicles and inevitably lead to higher safety requirements. Partly due to these increased requirements, the vehicle industry is increasingly looking at semiconductor switches as an alternative to conventional melting fuses. The protective functionality of semiconductor switches can be implemented in hardware as well as in software. A current approach discussed in science and industry is the implementation of a model of the protected low voltage power cable on a microcontroller to calculate its temperature. Here, the information regarding the current is provided by the continuous current measurement of the semiconductor switch. The signal to open the semiconductor switch is provided by the microcontroller when a previously defined limit for the temperature of the low voltage power cable is exceeded. A setup for testing the described principle of electronic fuse protection of a low voltage power cable was built and successfully validated with experiments afterwards. Here, the evaluation criterion is the deviation of the measured temperature of the low voltage power cable from the specified limit temperature when the semiconductor switch is opened. The analysis is carried out with an assumed ambient temperature as well as with a measured ambient temperature. Subsequently, the experimentally performed investigations are simulated in a virtual environment. The explicit focus is on simulating the behavior of the microcontroller with an implemented model of a low voltage power cable in a real-time environment. Subsequently, the generated results are compared with those of the experiments. Based on this, the completely virtual design of the described approach is assumed to be valid.
Keywords: automotive wire harness, electronic fuse protection, low voltage power cable, semiconductor-based fuses, software-based validation
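The paper's cable model is not reproduced in the abstract; as a minimal sketch of the kind of temperature model a microcontroller could run, the following first-order lumped thermal model heats the cable with I²R losses and cools it towards ambient. All parameter values are assumptions for illustration.

```python
# Minimal sketch of a first-order thermal model of a low voltage power
# cable, of the kind described above as running on a microcontroller;
# all parameter values (resistance, thermal constants, limit) are assumed.
R_ELEC = 0.01      # cable resistance (ohm), assumed
C_TH   = 50.0      # thermal capacitance (J/K), assumed
R_TH   = 2.0       # thermal resistance to ambient (K/W), assumed
T_LIMIT = 90.0     # trip temperature (deg C), assumed
DT = 0.1           # time step (s)

def step(temp, current, t_ambient):
    """Advance cable temperature by one time step; return (temp, trip)."""
    p_in = current ** 2 * R_ELEC             # Joule heating (W)
    p_out = (temp - t_ambient) / R_TH        # losses to ambient (W)
    temp += DT * (p_in - p_out) / C_TH
    return temp, temp >= T_LIMIT             # trip -> open the semiconductor switch

temp = 25.0
for _ in range(1200):                        # 120 s of an 80 A overload
    temp, trip = step(temp, current=80.0, t_ambient=25.0)
    if trip:
        print(f"switch opened at {temp:.1f} deg C")
        break
```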
Procedia PDF Downloads 105
4437 Curve Fitting by Cubic Bezier Curves Using Migrating Birds Optimization Algorithm
Authors: Mitat Uysal
Abstract:
A new metaheuristic optimization algorithm called Migrating Birds Optimization is used for curve fitting by rational cubic Bezier curves. This requires solving a complicated multivariate optimization problem. In this study, the solution of this optimization problem is achieved by the Migrating Birds Optimization algorithm, a powerful nature-inspired metaheuristic well suited to optimization. The results of this study show that the proposed method performs very well and is able to fit the data points to cubic Bezier curves with a high degree of accuracy.
Keywords: algorithms, Bezier curves, heuristic optimization, migrating birds optimization
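To make the objective concrete, the following is a minimal sketch of evaluating a (non-rational) cubic Bezier curve and the squared-error cost that a metaheuristic such as Migrating Birds Optimization would minimize over the control points; the sample data points and parameterization are assumptions.

```python
# Minimal sketch of a cubic Bezier curve and the squared-error fitting
# objective; the sample data points are assumptions, and Migrating Birds
# Optimization itself is not shown.
import numpy as np

def cubic_bezier(p0, p1, p2, p3, t):
    """Points on the cubic Bezier curve at parameters t in [0, 1]."""
    t = np.asarray(t)[:, None]
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

def fitting_error(control_points, data, t):
    """Sum of squared distances between data points and curve samples."""
    p0, p1, p2, p3 = control_points
    curve = cubic_bezier(p0, p1, p2, p3, t)
    return float(np.sum((curve - data) ** 2))

data = np.array([[0.0, 0.0], [1.0, 1.5], [2.0, 1.8], [3.0, 0.2]])
t = np.linspace(0, 1, len(data))
ctrl = [np.array([0.0, 0.0]), np.array([1.0, 2.0]),
        np.array([2.0, 2.0]), np.array([3.0, 0.0])]
print("objective to minimize:", fitting_error(ctrl, data, t))
```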
Procedia PDF Downloads 337
4436 The Artificial Intelligence Technologies Used in PhotoMath Application
Authors: Tala Toonsi, Marah Alagha, Lina Alnowaiser, Hala Rajab
Abstract:
This report is about the Photomath app, an AI application that uses image recognition technology, specifically optical character recognition (OCR) algorithms. The OCR algorithm translates images into a mathematical equation, and the app automatically provides a step-by-step solution. The application supports decimals, basic arithmetic, fractions, linear equations, and multiple functions such as logarithms. Testing was conducted to examine the usage of this app, and results were collected by surveying ten participants. The results were then analyzed. This paper seeks to answer the question of how accurate the artificial intelligence features are and how fast the app processes input. It is hoped this study will inform users about the efficiency of AI in Photomath.
Keywords: photomath, image recognition, app, OCR, artificial intelligence, mathematical equations
Procedia PDF Downloads 171
4435 Delay-Independent Closed-Loop Stabilization of Neutral System with Infinite Delays
Authors: Iyai Davies, Olivier L. C. Haas
Abstract:
In this paper, the problem of stability and stabilization for neutral delay-differential systems with infinite delay is investigated. Using the Lyapunov method, a new delay-independent sufficient condition for the stability of neutral systems with infinite delay is obtained in terms of a linear matrix inequality (LMI). Memory-less state feedback controllers are then designed for the stabilization of the system using the feasible solution of the resulting LMI, which is easily solved using standard optimization algorithms. Numerical examples are given to illustrate the results of the proposed methods.
Keywords: infinite delays, Lyapunov method, linear matrix inequality, neutral systems, stability
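The paper's own LMI is not reproduced in the abstract; for orientation, a classical delay-independent condition for the simpler retarded system (not the neutral, infinite-delay case treated in the paper) takes the following form.

```latex
% Classical delay-independent stability condition for the retarded system
% $\dot{x}(t) = A x(t) + A_d x(t-\tau)$, shown for reference only; the
% paper's condition for neutral systems with infinite delay differs.
\[
\exists\, P \succ 0,\; Q \succ 0 \quad \text{such that} \quad
\begin{bmatrix}
A^{\top} P + P A + Q & P A_d \\
A_d^{\top} P & -Q
\end{bmatrix} \prec 0,
\]
% obtained from the Lyapunov--Krasovskii functional
\[
V(x_t) = x(t)^{\top} P\, x(t) + \int_{t-\tau}^{t} x(s)^{\top} Q\, x(s)\, ds .
\]
```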
Procedia PDF Downloads 431
4434 Integrated Watershed Management Practice in Chelchai Hyrcanian Forests in the North of Iran
Authors: Mashad Maramaei, Behrooz Chogan, Reza Ahmadi
Abstract:
Human health and the health of the watershed are inseparable, because a watershed is an interconnected system of "land", "water", "air" and "life". Nowadays, most of the world's watersheds show symptoms of unhealthiness and require a prompt solution. It is believed that a suitable solution is participatory and integrated watershed management (IWM). In recent decades, the Hyrcanian forests in the north of Iran, which date back to the end of the third geological era, have been suffering from many environmental challenges such as land degradation, increasing trends of flood, drought and accelerated soil erosion. These challenges in the main forested area of the country impose many tangible and intangible damages and human losses. This is despite the fact that in past decades, forestry programs, watershed management and other activities in the region have been implemented in a parallel and uncoordinated manner. Therefore, the Natural Resources and Watershed Management Organization has recently resorted to the concept of IWM planning for the Hyrcanian watersheds. The Chelchai watershed, a largely degraded watershed in the eastern part of the Hyrcanian forests, has been selected as a pilot watershed for the implementation of IWM. It has a drainage area of 25,680 hectares and receives an average annual precipitation of 650 mm. In this mountainous region, the average temperature is 17.3 degrees Celsius. About 34% of the watershed is under cultivation, 64% under forest cover, and 2% under built-up areas. In this research, the effectiveness or ineffectiveness of the IWM model implementation of the Natural Resources and Watershed Management Organization has been evaluated based on a questionnaire method and field studies. The results indicated that IWM activities in the study area should be reconsidered and revived. Based on this research and the lessons learned during five years' experience in the Chelchai watershed, the authors believe that seven tasks are necessary for the socially acceptable and successful implementation of IWM projects: 1) establishment of a Local Coordination Committee (LCC) at the watershed level; 2) working towards an IWM law among government organizations to organize watershed management and eliminate parallel and contradictory activities; 3) more investment in the education of local communities, especially women and children; 4) development of trust-building and model projects showcasing the best agricultural and livestock management practices in each of the 26 villages; 5) assigning forest protection to local communities; 6) capacity building of government stakeholders; and 7) helping in the marketing of watershed products.
Keywords: integrated watershed management, Chelchai, Hyrcanian forests, Iran
Procedia PDF Downloads 22
4433 A Segmentation Method for Grayscale Images Based on the Firefly Algorithm and the Gaussian Mixture Model
Authors: Donatella Giuliani
Abstract:
In this research, we propose an unsupervised grayscale image segmentation method based on a combination of the Firefly Algorithm and the Gaussian Mixture Model. Firstly, the Firefly Algorithm is applied in a histogram-based search for cluster means. The Firefly Algorithm is a stochastic global optimization technique centered on the flashing characteristics of fireflies; in this context, it is used to determine the number of clusters and the related cluster means in a histogram-based segmentation approach. These means are then used in the initialization step for the parameter estimation of a Gaussian Mixture Model. The parametric probability density function of a Gaussian Mixture Model is represented as a weighted sum of Gaussian component densities, whose parameters are evaluated by applying the iterative Expectation-Maximization technique. The coefficients of the linear superposition of Gaussians can be thought of as prior probabilities of each component. Applying the Bayes rule, the posterior probabilities of the grayscale intensities are evaluated, and their maxima are used to assign each pixel to a cluster according to its gray-level value. The proposed approach appears fairly solid and reliable even when applied to complex grayscale images. The validation has been performed using different standard measures, more precisely: the Root Mean Square Error (RMSE), the Structural Content (SC), the Normalized Correlation Coefficient (NK) and the Davies-Bouldin (DB) index. The achieved results strongly confirm the robustness of this grayscale segmentation method based on a metaheuristic algorithm. Another noteworthy advantage of this methodology is the use of the maxima of the responsibilities for pixel assignment, which implies a consistent reduction of the computational costs.
Keywords: clustering images, firefly algorithm, Gaussian mixture model, metaheuristic algorithm, image segmentation
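As a minimal sketch of the GMM stage described above: fit a mixture to grayscale intensities and assign each pixel to the component with the highest posterior responsibility. The firefly-based initialization is replaced here by scikit-learn's default (k-means), and the intensities are synthetic.

```python
# Minimal sketch of intensity-based GMM segmentation; the firefly-derived
# initialization is replaced by the library default, and the synthetic
# "image" intensities are assumptions for illustration.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
image = np.concatenate([rng.normal(60, 10, 2000),    # dark region intensities
                        rng.normal(180, 15, 2000)])  # bright region intensities
pixels = image.reshape(-1, 1)

gmm = GaussianMixture(n_components=2, random_state=0).fit(pixels)
labels = gmm.predict(pixels)                 # argmax of posterior responsibilities
print("component means:", gmm.means_.ravel().round(1))
```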
Procedia PDF Downloads 217
4432 Establishment and Validation of Correlation Equations to Estimate Volumetric Oxygen Mass Transfer Coefficient (KLa) from Process Parameters in Stirred-Tank Bioreactors Using Response Surface Methodology
Authors: Jantakan Jullawateelert, Korakod Haonoo, Sutipong Sananseang, Sarun Torpaiboon, Thanunthon Bowornsakulwong, Lalintip Hocharoen
Abstract:
Process scale-up is essential for biological processes to increase production capacity from bench-scale bioreactors to pilot or commercial production. Scale-up based on a constant volumetric oxygen mass transfer coefficient (KLa) is most often used as a scale-up factor, since oxygen supply is one of the key limiting factors for cell growth. However, estimating the KLa of culture vessels operated under different conditions is time-consuming, since KLa is considerably influenced by many factors. To overcome this issue, this study aimed to establish correlation equations between KLa and operating parameters in 0.5 L and 5 L bioreactors equipped with a pitched-blade impeller and a gas sparger. Temperature, gas flow rate, agitation speed, and impeller position were selected as process parameters, and equations were created using response surface methodology (RSM) based on a central composite design (CCD). In addition, the effects of these parameters on KLa were also investigated. Based on RSM, second-order polynomial models for the 0.5 L and 5 L bioreactors were obtained with acceptable determination coefficients (R²) of 0.9736 and 0.9190, respectively. These models were validated, and experimental values showed differences of less than 10% from the predicted values. Moreover, RSM revealed that gas flow rate is the most significant parameter, while temperature and agitation speed were also found to greatly affect KLa in both bioreactors. Nevertheless, impeller position was shown to influence KLa only in the 5 L system. To sum up, these modeled correlations can be used to accurately predict KLa within the specified range of process parameters for two different sizes of bioreactors for further scale-up application.
Keywords: response surface methodology, scale-up, stirred-tank bioreactor, volumetric oxygen mass transfer coefficient
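As a minimal sketch of fitting a second-order (quadratic) response-surface model of the kind described above, using scikit-learn; the synthetic KLa responses, factor ranges and the three factors shown are assumptions for illustration.

```python
# Minimal sketch of a second-order RSM correlation: a quadratic polynomial
# regression of KLa on process factors; the synthetic data and factor
# ranges are assumptions for illustration.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# factors: temperature (C), gas flow rate (vvm), agitation speed (rpm)
X = np.column_stack([rng.uniform(25, 37, 40),
                     rng.uniform(0.5, 2.0, 40),
                     rng.uniform(100, 400, 40)])
kla = 2 + 8 * X[:, 1] + 0.02 * X[:, 2] + rng.normal(0, 0.5, 40)  # synthetic response

model = make_pipeline(PolynomialFeatures(degree=2),  # linear, quadratic, interaction terms
                      LinearRegression())
model.fit(X, kla)
print("R^2:", round(model.score(X, kla), 4))
```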
Procedia PDF Downloads 207
4431 Potential Effects of Climate Change on Streamflow, Based on the Occurrence of Severe Floods in Kelantan, East Coasts of Peninsular Malaysia River Basin
Authors: Muhd. Barzani Gasim, Mohd. Ekhwan Toriman, Mohd. Khairul Amri Kamarudin, Azman Azid, Siti Humaira Haron, Muhammad Hafiz Md. Saad
Abstract:
Malaysia is a country in Southeast Asia that is constantly exposed to flooding and landslides. These disasters cause loss of property, loss of life and hardship for the people involved. The problem occurs as a result of climate change, which increases streamflow rates by disrupting regional hydrological cycles. The aim of the study is to determine the hydrologic processes on the east coast of Peninsular Malaysia, especially in the Kelantan Basin, parameterized to account for the spatial and temporal variability of basin characteristics and their responses to climate variability. For hydrological modeling of the basin, the Soil and Water Assessment Tool (SWAT) model is used with inputs such as relief, soil type and land use, and historical daily time series of climate and river flow rates. Interpretation of Landsat maps/land uses is also applied in this study. Combining SWAT with climate models, the system predicts an increase in precipitation under future climate scenarios, as well as increases in surface runoff, recharge and total water yield. As a result, this model successfully supports the basin analysis, producing visually consistent hydrographs and good estimates of the minimum and maximum flows and the severe floods observed during the calibration and validation periods.
Keywords: east coasts of Peninsular Malaysia, Kelantan river basin, minimum and maximum flows, severe floods, SWAT model
Procedia PDF Downloads 262
4430 Synthesis Using Sintering and Characterisation of FeCrCoNiZn Alloy Using SEM and Nanoindentation
Authors: Steadyman Chikumba, Vasudeva Vereedhi Rao
Abstract:
This paper reports on the synthesis of FeCrCoNiZn and its characterisation using SEM and nanoindentation. The high entropy alloy FeCrCoNiZn was fabricated using spark plasma sintering at a temperature of 1100 °C from powders mixed for 9 hours. The powder mixture was equimolar, and the resultant microstructure had a single crystalline structure when studied under SEM. Several nano-Vickers hardness measurements were taken on a polished surface etched with Nital solution. The hardness ranged from 711 Vickers to a maximum of 1773.2. The alloy FeCrCoNiZn showed a nanohardness of 1070 Vickers and a modulus of elasticity of 460.4 MPa. The process produced a very hard material that can find applications where wear resistance is desired.
Keywords: high entropy alloy, FeCrCoNiZn, nanohardness, SEM
Procedia PDF Downloads 100
4429 Predicting Low Birth Weight Using Machine Learning: A Study on 53,637 Ethiopian Birth Data
Authors: Kehabtimer Shiferaw Kotiso, Getachew Hailemariam, Abiy Seifu Estifanos
Abstract:
Introduction: Although low birth weight (LBW) accounts for the highest share of neonatal mortality and morbidity, predicting births with LBW for better intervention preparation is challenging. This study aims to predict LBW using a dataset encompassing 53,637 birth cohorts collected from 36 primary hospitals across seven regions in Ethiopia from February 2022 to June 2024. Methods: We identified ten explanatory variables related to maternal and neonatal characteristics, including maternal education, age, residence, history of miscarriage or abortion, history of preterm birth, type of pregnancy, number of live births, number of stillbirths, antenatal care frequency, and sex of the fetus, to predict LBW. Using WEKA 3.8.2, we developed and compared seven machine learning algorithms. Data preprocessing included handling missing values, outlier detection, and ensuring data integrity in birth weight records. Model performance was evaluated through metrics such as accuracy, precision, recall, F1-score, and area under the Receiver Operating Characteristic curve (ROC AUC) using 10-fold cross-validation. Results: The results demonstrated that the decision tree, J48, logistic regression, and gradient boosted trees models achieved the highest accuracy (94.5% to 94.6%) with a precision of 93.1% to 93.3%, F1-score of 92.7% to 93.1%, and ROC AUC of 71.8% to 76.6%. Conclusion: This study demonstrates the effectiveness of machine learning models in predicting LBW. The high accuracy and recall rates achieved indicate that these models can serve as valuable tools for healthcare policymakers and providers in identifying at-risk newborns and implementing timely interventions to achieve the Sustainable Development Goal (SDG) related to neonatal mortality.
Keywords: low birth weight, machine learning, classification, neonatal mortality, Ethiopia
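As a minimal sketch of the evaluation pipeline described above (the study itself used WEKA): a decision tree scored with 10-fold cross-validation on accuracy, precision, recall, F1 and ROC AUC; the synthetic maternal/neonatal features are assumptions, not the study's data.

```python
# Minimal sketch of 10-fold cross-validated classification with the metrics
# reported above; the synthetic features and labels are assumptions.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_validate

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))                            # 10 maternal/neonatal predictors
y = (X[:, 0] + rng.normal(0, 1, 1000) > 1.2).astype(int)   # 1 = low birth weight

scores = cross_validate(DecisionTreeClassifier(max_depth=5, random_state=0),
                        X, y, cv=10,
                        scoring=["accuracy", "precision", "recall", "f1", "roc_auc"])
for metric in ["accuracy", "precision", "recall", "f1", "roc_auc"]:
    print(metric, round(scores[f"test_{metric}"].mean(), 3))
```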
Procedia PDF Downloads 22
4428 DMBR-Net: Deep Multiple-Resolution Bilateral Networks for Real-Time and Accurate Semantic Segmentation
Authors: Pengfei Meng, Shuangcheng Jia, Qian Li
Abstract:
We propose a real-time high-precision semantic segmentation network based on a multi-resolution feature fusion module, an auxiliary feature extracting module, an upsampling module, and an atrous spatial pyramid pooling (ASPP) module. We designed a feature fusion structure that integrates sufficient features of different resolutions. We also studied the effect of the side-branch structure on the network; based on these findings, we used a side-branch auxiliary feature extraction layer in the network to improve its effectiveness. We also designed an upsampling module that gives better results than the original one. In addition, we reconsidered the locations and number of atrous spatial pyramid pooling (ASPP) modules and modified the network structure according to the experimental results to further improve the network's effectiveness. The network presented in this paper takes the backbone of BiSeNetV2 as its base network, on which we constructed and improved the structure described above. We name this network Deep Multiple-Resolution Bilateral Network for real-time segmentation, referred to as DMBR-Net. After experimental testing, our proposed DMBR-Net achieved 81.2% mIoU at 119 FPS on the Cityscapes validation dataset, 80.7% mIoU at 109 FPS on the CamVid test dataset, and 29.9% mIoU at 78 FPS on the COCO-Stuff test dataset. Compared with all lightweight real-time semantic segmentation networks, our network achieves the highest accuracy at an appropriate speed.
Keywords: multi-resolution feature fusion, atrous convolution, bilateral networks, pyramid pooling
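As a minimal sketch of how the mIoU metric reported above is computed from predicted and ground-truth label maps; the toy two-class maps are assumptions for illustration.

```python
# Minimal sketch of mean intersection-over-union (mIoU) from predicted and
# ground-truth label maps; the toy 2-class maps are assumptions.
import numpy as np

def mean_iou(pred, gt, num_classes):
    """Mean IoU over classes present in the prediction or the ground truth."""
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, gt == c).sum()
        union = np.logical_or(pred == c, gt == c).sum()
        if union > 0:                     # skip classes absent from both maps
            ious.append(inter / union)
    return float(np.mean(ious))

gt = np.array([[0, 0, 1], [0, 1, 1]])
pred = np.array([[0, 1, 1], [0, 1, 1]])
print("mIoU:", round(mean_iou(pred, gt, num_classes=2), 3))  # 0.708
```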
Procedia PDF Downloads 150
4427 Toughness Factor of Polypropylene Fiber Reinforced Concrete in Aggressive Environment
Authors: R. E. Vasconcelos, K. R. M. da Silva, J. M. B. Pinto
Abstract:
This study determines and presents the results of an experimental study of synthetic (polypropylene) fiber reinforced concrete (SFRC) at fiber contents of 0.33% (3 kg/m³), 0.50% (4.5 kg/m³), and 0.66% (6 kg/m³), using cement CP V-ARI, at ages of 28 and 88 days after specimen molding. After 28 days, the specimens were exposed for 60 days to an aggressive environment (a solution of water with 3% sodium chloride). The bending toughness tests were performed on prismatic specimens of 150 x 150 x 500 mm. The toughness factor values of the specimens in the aggressive environment were the same as those obtained in a normal environment (in air).
Keywords: concrete reinforced with polypropylene fibers, toughness in bending, synthetic fibers, reinforced concrete
Procedia PDF Downloads 344
4426 SAFECARE: Integrated Cyber-Physical Security Solution for Healthcare Critical Infrastructure
Authors: Francesco Lubrano, Fabrizio Bertone, Federico Stirano
Abstract:
Modern societies strongly depend on critical infrastructures (CIs). Hospitals, power supplies, water supplies and telecommunications are just a few examples of CIs that provide vital functions to societies. CIs like hospitals are very complex environments, characterized by a huge number of cyber and physical systems that are becoming increasingly integrated. Ensuring a high level of security within such critical infrastructure requires a deep knowledge of vulnerabilities, threats, and potential attacks that may occur, as well as defence, prevention and mitigation strategies. The possibility to remotely monitor and control almost everything is pushing the adoption of network-connected devices. This implicitly introduces new threats and potential vulnerabilities, posing a risk especially to those devices connected to the Internet. Modern medical devices used in hospitals are no exception and are more and more being connected to enhance their functionalities and ease their management. Moreover, hospitals are environments with high flows of people that are difficult to monitor and that can fairly easily access the same places used by the staff, potentially causing damage. It is therefore clear that physical and cyber threats should be considered, analysed, and treated together as cyber-physical threats. This means that an integrated approach is required. SAFECARE, an integrated cyber-physical security solution, tries to respond to these issues within healthcare infrastructures. The challenge is to bring together the most advanced technologies from the physical and cyber security spheres to achieve a global optimum for systemic security and for the management of combined cyber and physical threats and incidents and their interconnections. Moreover, potential impacts and cascading effects are evaluated through impact propagation models that rely on modular ontologies and a rule-based engine. Indeed, the SAFECARE architecture foresees: i) a macroblock related to the cyber security field, where innovative tools are deployed to monitor network traffic, systems and medical devices; ii) a physical security macroblock, where video management systems are coupled with access control management, building management systems and innovative AI algorithms to detect behavior anomalies; and iii) an integration system that collects all incoming incidents, simulates their potential cascading effects, and provides alerts and updated information regarding asset availability.
Keywords: cyber security, defence strategies, impact propagation, integrated security, physical security
Procedia PDF Downloads 165
4425 Assessing Denitrification-Disintegration Model’s Efficacy in Simulating Greenhouse Gas Emissions, Crop Growth, Yield, and Soil Biochemical Processes in Moroccan Context
Authors: Mohamed Boullouz, Mohamed Louay Metougui
Abstract:
Accurate modeling of greenhouse gas (GHG) emissions, crop growth, soil productivity, and biochemical processes is crucial considering escalating global concerns about climate change and the urgent need to improve agricultural sustainability. This study thoroughly investigates the application of the denitrification-disintegration (DNDC) model in the context of Morocco's unique agro-climate. Our main research hypothesis is that the DNDC model offers an effective and powerful tool for precisely simulating a wide range of significant parameters, including greenhouse gas emissions, crop growth, yield potential, and complex soil biogeochemical processes, consistent with the intricate features of Morocco's agricultural environment. To verify this hypothesis, a vast amount of field data covering Morocco's various agricultural regions, encompassing a range of soil types, climatic factors, and crop varieties, had to be gathered. These experimental data sets will serve as the foundation for careful model calibration and subsequent validation, ensuring the accuracy of the simulation results. In conclusion, the prospective research findings add to the global conversation on climate-resilient agricultural practices while encouraging the promotion of sustainable agricultural models in Morocco. The prospective recognition of the DNDC model as a potent simulation tool tailored to Moroccan conditions may strengthen the ability of policy architects and agricultural actors to make informed decisions that advance not only food security but also environmental stability.
Keywords: greenhouse gas emissions, DNDC model, sustainable agriculture, Moroccan cropping systems
Procedia PDF Downloads 65
4424 Roulette Wheel Selection Mechanism for Solving Travelling Salesman Problem in Ant Colony Optimization
Authors: Sourabh Joshi, Geetinder Kaur, Sarabjit Kaur, Gulwatanpreet Singh, Geetika Mannan
Abstract:
In this paper, we use an algorithm that is able to quickly obtain an optimal solution to the travelling salesman problem from a huge search space. This algorithm is based upon the ant colony optimization technique and employs a roulette wheel selection mechanism. To illustrate it more clearly, a program based upon this algorithm has been implemented, presenting the changing route across iterations in a more intuitive way. In the experiment, we found the optimal path between one hundred cities and also calculated the distances between cities.
Keywords: ant colony, optimization, travelling salesman problem, roulette wheel selection
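As a minimal sketch of the roulette wheel selection step in ant colony optimization: the next city is drawn with probability proportional to pheromone^alpha * (1/distance)^beta. The alpha and beta values and the toy distance table are assumptions for illustration.

```python
# Minimal sketch of roulette wheel selection of the next city in ACO;
# alpha, beta and the toy distance/pheromone tables are assumptions.
import random

def roulette_next_city(current, unvisited, pheromone, dist, alpha=1.0, beta=2.0):
    """Pick the next city with probability proportional to its weight."""
    weights = [pheromone[current][j] ** alpha * (1.0 / dist[current][j]) ** beta
               for j in unvisited]
    total = sum(weights)
    r = random.uniform(0, total)            # spin the wheel
    cumulative = 0.0
    for city, w in zip(unvisited, weights):
        cumulative += w
        if r <= cumulative:
            return city
    return unvisited[-1]                    # guard against floating-point rounding

dist = {0: {1: 2.0, 2: 5.0, 3: 9.0}}
pheromone = {0: {1: 1.0, 2: 1.0, 3: 1.0}}
print(roulette_next_city(0, [1, 2, 3], pheromone, dist))
```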
Procedia PDF Downloads 441
4423 Ground Surface Temperature History Prediction Using Long-Short Term Memory Neural Network Architecture
Authors: Venkat S. Somayajula
Abstract:
A ground surface temperature history prediction model plays a vital role in determining standards for international nuclear waste management. International standards for borehole-based nuclear waste disposal require paleoclimate cycle predictions on the scale of a million years forward for the site of waste disposal. This research focuses on developing a paleoclimate cycle prediction model using a Bayesian long short-term memory (LSTM) neural architecture operating on accumulated borehole temperature history data. Bayesian models have previously been used for paleoclimate cycle prediction based on the Monte Carlo weight method, but due to limitations in coupling the model with certain other prediction networks, Bayesian models could not in the past accommodate prediction cycles of over 1000 years. LSTM provides a frontier to couple the developed models with other prediction networks with ease. The paleoclimate cycle model developed with this process will be trained on existing borehole data and then coupled to surface temperature history prediction networks, which give endpoints for backpropagation of the LSTM network and optimize the prediction cycle for larger prediction time scales. The trained LSTM will be tested on past data for validation and then propagated for forward prediction of temperatures at borehole locations. This research will be beneficial for studies pertaining to nuclear waste management, anthropological cycle prediction and geophysical features.
Keywords: Bayesian long-short term memory neural network, borehole temperature, ground surface temperature history, paleoclimate cycle
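As a minimal sketch of an LSTM regressor for a temperature time series in the spirit of the architecture described above, using Keras; the window length, layer sizes and synthetic series are assumptions, and the Bayesian treatment of the weights is omitted.

```python
# Minimal sketch of next-step LSTM prediction on a temperature series;
# window length, layer sizes and the synthetic series are assumptions,
# and the Bayesian weight treatment is omitted.
import numpy as np
from tensorflow import keras

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 20, 500)) + rng.normal(0, 0.1, 500)  # synthetic temperatures
window = 30
X = np.stack([series[i:i + window] for i in range(len(series) - window)])[..., None]
y = series[window:]

model = keras.Sequential([
    keras.layers.Input(shape=(window, 1)),
    keras.layers.LSTM(32),       # recurrent layer summarizing the window
    keras.layers.Dense(1),       # next-step temperature
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
print("next-step prediction:", float(model.predict(X[-1:], verbose=0)[0, 0]))
```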
Procedia PDF Downloads 128
4422 Solving Stochastic Eigenvalue Problem of Wick Type
Authors: Hassan Manouzi, Taous-Meriem Laleg-Kirati
Abstract:
In this paper, we study mathematically the eigenvalue problem for a stochastic elliptic partial differential equation of Wick type. Using the Wick product and the Wiener-Ito chaos expansion, the stochastic eigenvalue problem is reformulated, by means of the Fredholm alternative, as a system consisting of an eigenvalue problem for a deterministic partial differential equation together with elliptic partial differential equations. To reduce the computational complexity of this system, we use a decomposition-coordination method. Once this approximation is performed, the statistics of the numerical solution can be easily evaluated.
Keywords: eigenvalue problem, Wick product, SPDEs, finite element, Wiener-Ito chaos expansion
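For reference, the standard form of the Wiener-Ito chaos expansion and of the Wick product it diagonalizes is shown below; the notation follows the usual white-noise-analysis conventions and is not necessarily the paper's.

```latex
% Standard Wiener-Ito chaos expansion and Wick product, shown for
% reference; $\mathcal{J}$ is the set of multi-indices and $H_\alpha$
% the Hermite polynomial functionals.
\[
u(x,\omega) \;=\; \sum_{\alpha \in \mathcal{J}} u_{\alpha}(x)\, H_{\alpha}(\omega),
\qquad
(u \diamond v)(x,\omega) \;=\; \sum_{\gamma \in \mathcal{J}}
\Bigl( \sum_{\alpha + \beta = \gamma} u_{\alpha}(x)\, v_{\beta}(x) \Bigr) H_{\gamma}(\omega),
\]
% so a Wick-type equation decouples into deterministic PDEs for the
% coefficients $u_{\alpha}$, one per multi-index $\alpha$.
```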
Procedia PDF Downloads 359
4421 A User-Directed Approach to Optimization via Metaprogramming
Authors: Eashan Hatti
Abstract:
In software development, programmers often must make a choice between high-level programming and high-performance programs. High-level programming encourages the use of complex, pervasive abstractions. However, the use of these abstractions degrades performance; high performance demands that programs be low-level. In a compiler, the optimizer attempts to let the user have both. The optimizer takes high-level, abstract code as input and produces low-level, performant code as output. However, there is a problem with having the optimizer be a built-in part of the compiler. Domain-specific abstractions implemented as libraries are common in high-level languages. As a language’s library ecosystem grows, so does the number of abstractions that programmers will use. If these abstractions are to be performant, the optimizer must be extended with new optimizations to target them, or these abstractions must rely on existing general-purpose optimizations. The latter is often not as effective as needed. The former presents too significant an effort for the compiler developers, as they are the only ones who can extend the language with new optimizations. Thus, the language becomes more high-level, yet the optimizer – and, in turn, program performance – falls behind. Programmers are again confronted with a choice between high-level programming and high-performance programs. To investigate a potential solution to this problem, we developed Peridot, a prototype programming language. Peridot’s main contribution is that it enables library developers to easily extend the language with new optimizations themselves. This allows the optimization workload to be taken off the compiler developers’ hands and given to a much larger set of people who can specialize in each problem domain. Because of this, optimizations can be much more effective while also being much more numerous. To enable this, Peridot supports metaprogramming designed for implementing program transformations. The language is split into two fragments or “levels”, one for metaprogramming, the other for high-level general-purpose programming. The metaprogramming level supports logic programming. Peridot’s key idea is that optimizations are simply implemented as metaprograms. The meta level supports several specific features which make it particularly suited to implementing optimizers. For instance, metaprograms can automatically deduce equalities between the programs they are optimizing via unification, deal with variable binding declaratively via higher-order abstract syntax, and avoid the phase-ordering problem via non-determinism. We have found that this design centered around logic programming makes optimizers concise and easy to write compared to their equivalents in functional or imperative languages. Overall, implementing Peridot has shown that its design is a viable solution to the problem of writing code which is both high-level and performant.
Keywords: optimization, metaprogramming, logic programming, abstraction
Procedia PDF Downloads 88
4420 On-Line Super Critical Fluid Extraction, Supercritical Fluid Chromatography, Mass Spectrometry, a Technique in Pharmaceutical Analysis
Authors: Narayana Murthy Akurathi, Vijaya Lakshmi Marella
Abstract:
The literature is reviewed with regard to online supercritical fluid extraction (SFE) coupled directly with supercritical fluid chromatography (SFC)-mass spectrometry, which is typically more sensitive than conventional LC-MS/MS and GC-MS/MS. It is becoming increasingly interesting to use on-line techniques that combine sample preparation, separation and detection in one analytical setup. This requires less human intervention, uses small amounts of sample and organic solvent, and yields enhanced analyte enrichment in a shorter time. The sample extraction is performed under light shielding and anaerobic conditions, preventing the degradation of thermolabile analytes. The technique may be able to analyze compounds over a wide polarity range, as SFC generally uses carbon dioxide, which is collected as a by-product of other chemical reactions or from the atmosphere and thus contributes no new chemicals to the environment. The diffusion of solutes in supercritical fluids is about ten times greater than in liquids and about three times less than in gases, which decreases the resistance to mass transfer in the column and allows fast, high-resolution separations. The drawback of SFC when using carbon dioxide as the mobile phase is that the direct introduction of water samples poses a series of problems; water must therefore be eliminated before it reaches the analytical column. Hundreds of compounds can be analysed simultaneously simply by enclosing the sample in an extraction vessel. This is mainly applicable to the pharmaceutical industry, where it can analyse fatty acids and phospholipids that have many analogues with very similar UV spectra, determine trace additives in polymers, support cleaning validation by placing a swab sample in an extraction vessel, and analyse hundreds of pesticides with good resolution.
Keywords: supercritical fluid extraction (SFE), supercritical fluid chromatography (SFC), LC-MS/MS, GC-MS/MS
Procedia PDF Downloads 391
4419 Radical Scavenging Activity of Protein Extracts from Pulse and Oleaginous Seeds
Authors: Silvia Gastaldello, Maria Grillo, Luca Tassoni, Claudio Maran, Stefano Balbo
Abstract:
Antioxidants are nowadays attractive not only for their countless benefits to human and animal health, but also for the prospect of their use as food preservatives instead of synthetic chemical molecules. In this study, the radical scavenging activity of six protein extracts from pulse and oleaginous seeds was evaluated. The selected matrices are Pisum sativum (yellow pea from two different origins), Carthamus tinctorius (safflower), Helianthus annuus (sunflower), Lupinus luteus cv Mister (lupin) and Glycine max (soybean), since they are economically interesting for both human and animal nutrition. The seeds were ground, and proteins were extracted from 20 mg of powder with a specific vegetal-extraction kit. Proteins were quantified using the Bradford protocol, and scavenging activity was revealed using the DPPH assay, based on the decrease in absorbance of the DPPH radical (2,2-diphenyl-1-picrylhydrazyl) in the presence of antioxidant molecules. Different concentrations of the protein extract (1, 5, 10, 50, 100, 500 µg/ml) were mixed with DPPH solution (DPPH 0.004% in ethanol 70% v/v). Ascorbic acid was used as a scavenging activity standard reference at the same six concentrations as the protein extracts, while DPPH solution was used as control. Samples and standard were prepared in triplicate and incubated for 30 minutes in the dark at room temperature; the absorbance was read at 517 nm (ABS30). The average and standard deviation of the absorbance values were calculated for each concentration of samples and standard. Statistical analysis using Student's t-test and p-values was performed to assess the statistical significance of the difference in scavenging activity between the samples (or standard) and the control (ABSctrl). The percentage of antioxidant activity was calculated using the formula [(ABSctrl-ABS30)/ABSctrl]*100. The obtained results demonstrate that all matrices showed antioxidant activity. Ascorbic acid, used as standard, exhibits 96% scavenging activity at a concentration of 500 µg/ml. Under the same conditions, sunflower, safflower and yellow peas revealed the highest antioxidant performance among the matrices analyzed, with activities of 74%, 68% and 70%, respectively (p < 0.005). Although lupin and soybean exhibit lower antioxidant activity compared to the other matrices, they showed percentages of 46% and 36%, respectively. All these data suggest the possibility of using undervalued edible matrices as antioxidant sources. However, further studies are necessary to investigate a possible synergic effect of several matrices as well as the impact of industrial processes for a large-scale approach.
Keywords: antioxidants, DPPH assay, natural matrices, vegetal proteins
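A worked example of the scavenging-activity formula quoted above, [(ABSctrl-ABS30)/ABSctrl]*100; the absorbance readings used here are assumptions for illustration, not the study's measurements.

```python
# Worked example of the DPPH scavenging-activity formula quoted above;
# the absorbance readings are assumed values for illustration.
def scavenging_activity(abs_ctrl: float, abs_30: float) -> float:
    """Percent DPPH radical scavenging from control and 30-min absorbances."""
    return (abs_ctrl - abs_30) / abs_ctrl * 100

abs_ctrl = 0.85            # DPPH solution alone at 517 nm (assumed)
abs_30 = 0.22              # after 30 min with extract (assumed)
print(f"scavenging activity: {scavenging_activity(abs_ctrl, abs_30):.1f}%")  # 74.1%
```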
Procedia PDF Downloads 433
4418 Studies on Effect of Nano Size and Surface Coating on Enhancement of Bioavailability and Toxicity of Berberine Chloride; A p-gp Substrate
Authors: Sanjay Singh, Parameswara Rao Vuddanda
Abstract:
The aim of the present study is to examine the actual benefit of nano size and of surface coating with a P-gp efflux inhibitor on the enhancement of the bioavailability of berberine chloride (BBR), a P-gp substrate. In addition, a 28-day subacute oral toxicity study was conducted to assess the toxicity of the formulation on chronic administration. BBR-loaded polymeric nanoparticles (BBR-NP) were prepared by the nanoprecipitation method. BBR-NP were surface coated (BBR-SCNP) with 1% w/v vitamin E TPGS. For the bioavailability study, five groups (n=6) of rats were treated as follows: first, pure BBR; second, a physical mixture of BBR, carrier and vitamin E TPGS; third, BBR-NP; fourth, BBR-SCNP; and fifth, BBR with verapamil (a widely used P-gp inhibitor). Blood was withdrawn at pre-set time points over a 24 h study, and the drug was quantified by an HPLC method. In the chronic oral toxicity study, four groups (n=6) were treated as follows: first (control), water; second, pure BBR; third, BBR surface coated nanoparticles; and fourth, placebo BBR surface coated nanoparticles. Biochemical markers of liver (AST, ALP and ALT) and kidney (serum urea and creatinine) function, along with histopathological studies, were also examined (0-28 days). The AUC of BBR-SCNP was significantly higher, 3.5-fold, compared to all other groups. The AUC of BBR-NP was 3.23- and 1.52-fold higher compared to the BBR solution and the BBR with verapamil groups, respectively. The physical mixture treated group showed a slightly higher AUC than the BBR solution treated group, but significantly lower than the other groups. This indicates that encapsulation of BBR in nanosized form can circumvent the P-gp efflux effect. BBR-NP showed pharmacokinetic parameters (Cmax and AUC) close to those of BBR-SCNP. However, the differences in the values of T1/2 and clearance indicate that surface coating with vitamin E TPGS avoids P-gp efflux not only at the absorption site (intestine) but also at the organs responsible for metabolism and excretion (kidney and liver). This may be the reason for the observed decrease in clearance of BBR-SCNP. No toxicity signs were observed in either the biochemical or the histopathological examination of liver and kidney during the toxicity studies. The results indicate that administration of BBR in a surface coated nanoformulation would be beneficial for the enhancement of its bioavailability and for longer retention in the systemic circulation. Further, the 28-day subacute oral dose toxicity studies, including evaluation of intestine, liver and kidney histopathology and biochemical estimations, indicated that the developed BBR-SCNP was safe for long-term use.
Keywords: bioavailability, berberine nanoparticles, p-gp efflux inhibitor, nanoprecipitation method
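As a minimal sketch of how the pharmacokinetic AUC values compared above are computed, using the trapezoidal rule on a concentration-time profile; the sample time points and concentrations are assumptions for illustration.

```python
# Minimal sketch of trapezoidal-rule AUC from a concentration-time
# profile; the time points and concentrations are assumed values.
import numpy as np

t = np.array([0, 0.5, 1, 2, 4, 8, 12, 24])        # sampling times (h)
conc = np.array([0, 40, 85, 70, 50, 28, 15, 4])   # plasma concentration (ng/mL, assumed)

# trapezoidal rule: sum of segment-average concentration times segment width
auc = float(np.sum((conc[1:] + conc[:-1]) / 2 * np.diff(t)))
print(f"AUC(0-24 h) = {auc:.0f} ng*h/mL")
```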
Procedia PDF Downloads 390
4417 Better Knowledge and Understanding of the Behavior of Masonry Buildings in Earthquake
Authors: A. R. Mirzaee, M. Khajehpour
Abstract:
Due to their simple design, reasonable cost and easy implementation, masonry buildings are what most people tend to build. However, the performance of masonry structures in earthquakes is quite limited. From most earthquakes that have occurred in Iran and other countries, we can easily see that most of the damage is to masonry constructions, and this is evidence that we lack a proper understanding of the performance of masonry buildings in earthquakes. In this paper, according to field studies conducted after past earthquakes, the strengths and weaknesses of masonry constructions are evaluated, solutions to prevent such damage are presented, and examples of correct bearing wall and tie-column design according to the applicable regulations (MSJC-08 (ASD)) are explained.
Keywords: masonry constructions, performance in earthquakes, MSJC-08 (ASD), bearing wall, tie-column
Procedia PDF Downloads 431