Search results for: artificial potential function
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17303

15113 Bayes Estimation of Parameters of Binomial Type Rayleigh Class Software Reliability Growth Model using Non-informative Priors

Authors: Rajesh Singh, Kailash Kale

Abstract:

In this paper, the binomial-process type of software failure occurrence is considered, and the failure intensity is characterized by a one-parameter Rayleigh class Software Reliability Growth Model (SRGM). The proposed SRGM is a mathematical function of two parameters: the total number of failures, η0, and the scale parameter, η1. It is assumed that little or no information is available about these parameters; accordingly, non-informative priors are adopted and the Bayes estimators of η0 and η1 are obtained under a squared-error loss function. The proposed Bayes estimators are compared with their corresponding maximum likelihood estimators on the basis of risk efficiencies obtained by the Monte Carlo simulation technique. It is concluded that both proposed Bayes estimators, of the total number of failures and of the scale parameter, perform well for a proper choice of execution time.
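The Bayes-versus-MLE comparison by simulated risk efficiency can be illustrated with a deliberately simpler model than the Rayleigh-class SRGM of the paper: a sketch, assuming a binomial proportion with a uniform (non-informative) prior and squared-error loss, under which the Bayes estimator is the posterior mean (x+1)/(n+2).

```python
import random

def risk_efficiency(p=0.5, n=10, reps=20000, seed=42):
    """Monte Carlo risk efficiency of a Bayes estimator (uniform prior,
    squared-error loss) versus the MLE for a binomial proportion p.
    This is a generic illustration, not the paper's Rayleigh-class model."""
    rng = random.Random(seed)
    se_mle = se_bayes = 0.0
    for _ in range(reps):
        x = sum(rng.random() < p for _ in range(n))   # one binomial draw
        se_mle += (x / n - p) ** 2                    # MLE: x/n
        se_bayes += ((x + 1) / (n + 2) - p) ** 2      # posterior mean
    return se_mle / se_bayes                          # >1 => Bayes wins
```

A risk efficiency above 1 means the Bayes estimator has lower simulated risk than the MLE at that parameter value; the paper performs the analogous comparison for η0 and η1.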

Keywords: binomial process, non-informative prior, maximum likelihood estimator (MLE), Rayleigh class, software reliability growth model (SRGM)

Procedia PDF Downloads 389
15112 Evaluating the Latest Advances in Dry Powder Inhaler Technology

Authors: Leila Asadollahi

Abstract:

Dry powder inhalers (DPIs) have come a long way since their introduction with the Fisons Spinhaler in 1967. For optimal performance, it is important to consider the interplay between formulation, device, and patient. To achieve optimal treatment outcomes for respiratory diseases, a thorough understanding of the various types of DPIs currently available is crucial; these include single-dose, multiple-unit-dose, and multi-dose DPIs. This article examines the administration of drugs via inhalation and its diverse routes of administration, highlights recent advances in inhalation delivery systems, and reviews the latest therapeutic approaches for the treatment of respiratory ailments. It also discusses the historical development of DPIs and the need for improved designs to enhance efficacy and patient adherence. DPIs have shown great potential in treating systemic disorders, as evidenced by their success in clinical practice; ongoing clinical trials and the market availability of DPI products for systemic disease treatment are examined. Furthermore, the COVID-19 pandemic has sparked increased interest in dry powder inhalation as a potential avenue for vaccines and antiviral drugs, prompting further exploration of its applications. Overall, this review provides insights into the advancements, challenges, and future prospects of inhalation drug delivery systems, highlighting their potential for both respiratory and systemic disorders.

Keywords: dry powder inhalers (DPIs), respiratory diseases, systemic disorders, pulmonary drug delivery

Procedia PDF Downloads 70
15111 Topology Optimization of the Interior Structures of Beams under Various Load and Support Conditions with Solid Isotropic Material with Penalization Method

Authors: Omer Oral, Y. Emre Yilmaz

Abstract:

Topology optimization is an approach that optimizes the material distribution within a given design space, for given loads and boundary conditions, so as to meet prescribed performance goals. It uses various restrictions, such as boundary conditions, load sets, and constraints, to maximize the performance of the system. It differs from size and shape optimization methods but retains some features of both. In this study, the interior structures of parts were optimized using the SIMP (Solid Isotropic Material with Penalization) method. The volume of the part was a preassigned parameter, and minimum deflection was the objective function. The basic idea behind the theory was considered, and different methods were discussed. The Rhinoceros 3D design tool was used with the Grasshopper and TopOpt plugins to create and optimize parts. A Grasshopper algorithm was designed and tested for different beams, sets of arbitrarily located forces, and support types such as pinned, fixed, etc. Finally, 2.5D shapes were obtained and verified by observing the changes in the density function.
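The penalization at the heart of SIMP can be sketched in a few lines (a generic illustration, not the authors' Grasshopper/TopOpt implementation): element stiffness is interpolated as E(x) = Emin + x^p (E0 - Emin), so intermediate densities contribute disproportionately little stiffness per unit volume and the optimizer is driven toward crisp solid/void layouts.

```python
def simp_stiffness(x, e0=1.0, emin=1e-9, p=3.0):
    """SIMP interpolation: effective Young's modulus for a density x in [0, 1].
    A penalty exponent p > 1 makes intermediate densities structurally
    inefficient, steering the optimizer toward a 0/1 material distribution.
    e0 is the solid-material modulus; emin avoids a singular stiffness matrix."""
    return emin + x ** p * (e0 - emin)
```

For example, with the common choice p = 3, a half-dense element delivers only one-eighth of the solid stiffness, which is why grey regions are penalized away during the optimization.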

Keywords: Grasshopper, lattice structure, microstructures, Rhinoceros, solid isotropic material with penalization method, TopOpt, topology optimization

Procedia PDF Downloads 136
15110 Ab Initio Study of Co2ZrGe and Co2NbB Full Heusler Compounds

Authors: A. Abada, S. Hiadsi, T. Ouahrani, B. Amrani, K. Amara

Abstract:

Using the first-principles full-potential linearized augmented plane wave plus local orbital (FP-LAPW+lo) method based on density functional theory (DFT), we have investigated the electronic structure and magnetism of some Co2-based full Heusler alloys, namely Co2ZrGe and Co2NbB. The calculations show that these compounds are half-metallic ferromagnets (HMFs) with a total magnetic moment of 2.000 µB per formula unit, consistent with the Slater-Pauling rule. Our calculations show indirect band gaps of 0.58 eV and 0.47 eV in the minority-spin channel of the density of states (DOS) for Co2ZrGe and Co2NbB, respectively. Analysis of the DOS and magnetic moments indicates that their magnetism is mainly related to the d-d hybridization between the Co and Zr (or Nb) atoms. The half-metallicity is found to be robust against volume changes, and the two alloys retain 100% spin polarization at the Fermi level. In addition, the atoms-in-molecules (AIM) formalism and the electron localization function (ELF) were adopted to study the bonding properties of these compounds, building a bridge between their electronic and bonding behavior. As they have good crystallographic compatibility with the lattices of industrially used semiconductors, and negative calculated cohesive energies of considerable absolute value, these two alloys could be promising magnetic materials in the spintronics field.
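The quoted moment of 2.000 µB per formula unit follows directly from the Slater-Pauling rule for full Heusler alloys, Mt = Zt - 24, where Zt is the total number of valence electrons per formula unit. A quick check (the valence counts are the usual group numbers, an assumption of this sketch rather than data from the paper):

```python
# Valence electron counts (group numbers) assumed for the relevant atoms;
# these are standard textbook values, not taken from the paper itself.
VALENCE = {"Co": 9, "Zr": 4, "Nb": 5, "Ge": 4, "B": 3}

def slater_pauling_moment(formula):
    """Predicted total spin moment (Bohr magnetons per formula unit) of a
    full Heusler compound X2YZ from the Slater-Pauling rule M_t = Z_t - 24."""
    z_t = sum(VALENCE[element] * count for element, count in formula)
    return z_t - 24

# Both compounds have Z_t = 26 valence electrons, hence M_t = 2 each:
m_zrge = slater_pauling_moment([("Co", 2), ("Zr", 1), ("Ge", 1)])
m_nbb = slater_pauling_moment([("Co", 2), ("Nb", 1), ("B", 1)])
```

The integer moment is itself a signature of half-metallicity: a gap in one spin channel forces the total moment to an integer value.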

Keywords: half-metallic ferromagnets, full Heusler alloys, magnetic properties, electronic properties

Procedia PDF Downloads 413
15109 Investigation into the Optimum Hydraulic Loading Rate for Selected Filter Media Packed in a Continuous Upflow Filter

Authors: A. Alzeyadi, E. Loffill, R. Alkhaddar

Abstract:

Continuous upflow filters can combine nutrient (nitrogen and phosphate) and suspended-solid removal in one unit process. The contaminant removal can be achieved chemically or biologically; in both processes, the filter removal efficiency depends on the interaction between the packed filter media and the influent. In this paper, a residence time distribution (RTD) study was carried out to understand and compare the transfer behaviour of contaminants through selected filter media packed in a laboratory-scale continuous upflow filter; the selected filter media are limestone and white dolomite. The experimental work was conducted by injecting a tracer (red drain dye, RDD) into the filtration system and then measuring the tracer concentration at the outflow as a function of time; the tracer injection was applied at hydraulic loading rates (HLRs) of 3.8 to 15.2 m h-1. The results were analysed using the cumulative distribution function F(t) to estimate the residence time of the tracer molecules inside the filter media. The mean residence time (MRT) and the variance σ2, two moments of the RTD, were calculated to compare the RTD characteristics of limestone with those of white dolomite. The results showed that the exit-age distribution of the tracer was most favourable at HLRs of 3.8 to 7.6 m h-1 for limestone and at 3.8 m h-1 for white dolomite. At these HLRs, the cumulative distribution function F(t) revealed that the residence time of the tracer inside the limestone was longer than in the white dolomite: all the tracer took 8 minutes to leave the white dolomite at 3.8 m h-1, whereas the same amount of tracer took 10 minutes to leave the limestone at the same HLR. In conclusion, determining the optimal hydraulic loading rate, which achieves the best influent distribution over the filtration system, helps to establish the suitability of a material as filter media.
Further work will examine the efficiency of limestone and white dolomite for phosphate removal by pumping a phosphate solution into the filter at HLRs of 3.8 to 7.6 m h-1.
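The two RTD moments used above (MRT and σ²) can be computed directly from the measured outflow tracer curve C(t); a minimal sketch using trapezoidal integration of the normalized exit-age distribution E(t):

```python
def rtd_moments(times, conc):
    """Mean residence time and variance from a tracer response curve C(t),
    via trapezoidal integration of the exit-age distribution E(t) = C/area.
    `times` and `conc` are equal-length sequences (any monotonic sampling)."""
    def trapz(y):
        # trapezoid rule on the (possibly non-uniform) time grid
        return sum((y[i] + y[i + 1]) / 2 * (times[i + 1] - times[i])
                   for i in range(len(times) - 1))
    area = trapz(conc)
    e = [c / area for c in conc]                      # normalized E(t)
    mrt = trapz([t * ei for t, ei in zip(times, e)])
    var = trapz([(t - mrt) ** 2 * ei for t, ei in zip(times, e)])
    return mrt, var
```

For a symmetric tracer pulse the MRT falls at the pulse centre; comparing MRT and σ² between media at the same HLR is exactly the comparison the abstract describes for limestone versus white dolomite.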

Keywords: filter media, hydraulic loading rate, residence time distribution, tracer

Procedia PDF Downloads 277
15108 Oat Beta Glucan Attenuates the Development of Atherosclerosis and Improves the Intestinal Barrier Function by Reducing Bacterial Endotoxin Translocation in ApoE-/- Mice

Authors: Dalal Alghawas, Jetty Lee, Kaisa Poutanen, Hani El-Nezami

Abstract:

Oat β-glucan, a water-soluble, non-starch linear polysaccharide, has been approved as a cholesterol-lowering agent by various food safety administrations and is commonly used to reduce the risk of heart disease. The molecular weight of oat β-glucan can vary depending on the extraction and fractionation methods, and it is not clear whether molecular weight has a significant impact on slowing the acceleration of atherosclerosis. The aim of this study was to investigate the effects of three different oat β-glucan fractions on the development of atherosclerosis in vivo, with special focus on plaque stability and intestinal barrier function. To test this, ApoE-/- female mice were fed a high-fat diet supplemented with oat bran, a high-molecular-weight (HMW) oat β-glucan fraction, or a low-molecular-weight (LMW) oat β-glucan fraction for 16 weeks. Atherosclerosis risk markers were measured in the plasma, heart, and aortic tree. Plaque size was measured in the aortic root and aortic tree. ICAM-1, VCAM-1, E-selectin, and P-selectin protein levels were assessed from the aortic tree to determine plaque stability at 16 weeks. The expression of p22phox at the aortic root was evaluated to study the NADPH oxidase complex involved in nitric oxide bioavailability and vascular elasticity. The tight junction proteins E-cadherin and β-catenin were analysed by western blot as a test of intestinal barrier function. Plasma LPS, intestinal D-lactate levels, and hepatic FMO gene expression were measured to confirm whether the compromised intestinal barrier led to endotoxemia. The oat bran and HMW oat β-glucan diet groups were more effective than the LMW β-glucan diet group at reducing plaque size and showed marked improvements in plaque stability. The intestinal barrier was compromised in all experimental groups; however, endotoxemia levels were higher in the LMW β-glucan diet group.
The oat bran and HMW oat β-glucan diet groups were more effective at attenuating the development of atherosclerosis. A possible reason is the low viscosity of the LMW oat β-glucan diet in the gut and its inability to block the reabsorption of cholesterol; furthermore, the low viscosity may allow more bacterial endotoxin translocation through the impaired intestinal barrier. In the future, food technologists should carefully consider how to incorporate LMW oat β-glucan as a health-promoting food.

Keywords: atherosclerosis, beta glucan, endotoxemia, intestinal barrier function

Procedia PDF Downloads 420
15107 Monitoring Public Attitudes Towards Tourism Valorisation of the Dinara Nature Park’s Dry Grasslands

Authors: Sven Ratković

Abstract:

The survey of public attitudes and knowledge was conducted as part of the Dinara back to LIFE project during June and July 2020. The aim of the research was to collect public opinions and knowledge on the biodiversity of Dinara, the perception of its tourist potential, sustainable development, and acceptance of the project. The research was conducted using the survey method in the cities of Sinj, Knin, Vrlika, and Trilj, and the municipalities of Hrvace, Otok, Kijevo, and Civljane, where a total of 404 people were surveyed. The respondents perceive the cultural and recreational potential of Dinara and recognize its potential for agriculture and tourism. According to respondents, the biological diversity of Dinara is most affected by fires and human activity. When it comes to nature protection, the majority of respondents do not trust local self-government units or the relevant ministries. The obtained results indicate the need to inform and educate the community; they serve to adjust the project activities and better guide the tourism development of the project area. The survey will be repeated in the last project year (2023).

Keywords: protected area tourism, Dinara Nature Park, dry grasslands, touristic infrastructure

Procedia PDF Downloads 98
15106 Amplitude and Latency of P300 Component from Auditory Stimulus in Different Types of Personality: An Event Related Potential Study

Authors: Nasir Yusoff, Ahmad Adamu Adamu, Tahamina Begum, Faruque Reza

Abstract:

The P300 component of the event-related potential (ERP) reflects psycho-physiological phenomena in the human body. The present study aims to identify differences in the amplitude and latency of the P300 component elicited by auditory stimuli between the ambiversion and extraversion personality types. Ambiverts (N=20) and extraverts (N=20) underwent ERP recording at the Hospital Universiti Sains Malaysia (HUSM) laboratory. Electroencephalogram data were recorded with an oddball paradigm, counting auditory standard and target tones, from nine electrode sites (Fz, Cz, Pz, T3, T4, T5, T6, P3 and P4) using the 128-channel HydroCel Geodesic Sensor Net. Group differences in P300 latency for the target tones were insignificant at all electrodes. Similarly, differences in P300 latency for the standard tones were insignificant except at the Fz and T3 electrodes. Likewise, differences in P300 amplitude for the target and standard tones were insignificant at all electrode sites. Extraverts and ambiverts thus show similar characteristics in the cognitive processing of an auditory task.

Keywords: amplitude, event related potential, P300 component, latency

Procedia PDF Downloads 377
15105 Energy Recovery Potential from Food Waste and Yard Waste in New York and Montréal

Authors: T. Malmir, U. Eicker

Abstract:

Landfilling of organic waste is still the predominant waste management method in the USA and Canada. Strategic plans for waste diversion from landfills are needed to increase material recovery and energy generation from waste. In this paper, we carried out a statistical survey of waste flows in the two cities of New York and Montréal and estimated the energy recovery potential for each case. Data collection and analysis for organic waste (food waste, yard waste, etc.), paper and cardboard, metal, glass, plastic, carton, textile, electronic products, and other materials were based on the reports published by the Department of Sanitation in New York and the Service de l'Environnement in Montréal. To calculate the gas generation potential of organic waste, the Buswell equation was used, in which the molar masses of the elements were calculated from their atomic weights and the amount of organic waste in New York and Montréal. The higher and lower calorific values of the organic waste (solid basis) and of the biogas (gas basis) were also calculated. According to the results, only 19% (598 kt) of New York's and 45% (415 kt) of Montréal's waste was diverted from landfills in 2017. The biogas generation potential of the generated food waste and yard waste amounted to 631 million m3 in New York and 173 million m3 in Montréal. The higher and lower calorific values of food waste were 3482 and 2792 GWh in New York and 441 and 354 GWh in Montréal, respectively; for yard waste, they were 816 and 681 GWh in New York and 636 and 531 GWh in Montréal, respectively. Considering the higher calorific value, this amount would mean a contribution of around 2.5% of the energy in these cities.
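The Buswell (Buswell-Boyle) stoichiometry mentioned above gives the theoretical CH4/CO2 split per mole of a substrate with empirical formula CaHbOcNd; a sketch:

```python
def buswell(a, b, c, d=0):
    """Buswell-Boyle stoichiometry for anaerobic digestion of a substrate
    CaHbOcNd: returns moles of CH4 and CO2 produced per mole of substrate.
    CaHbOcNd + (4a-b-2c+3d)/4 H2O -> (4a+b-2c-3d)/8 CH4
                                   + (4a-b+2c+3d)/8 CO2 + d NH3"""
    ch4 = (4 * a + b - 2 * c - 3 * d) / 8
    co2 = (4 * a - b + 2 * c + 3 * d) / 8
    return ch4, co2

# Sanity check with glucose, C6H12O6: an equimolar CH4/CO2 split.
ch4, co2 = buswell(6, 12, 6)   # -> (3.0, 3.0)
```

Multiplying the molar gas yield by the molar gas volume (about 22.4 L/mol at STP) and the substrate tonnage gives volumetric biogas potentials of the kind reported in the abstract.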

Keywords: energy recovery, organic waste, urban energy modelling with INSEL, waste flow

Procedia PDF Downloads 137
15104 Executive Deficits in Non-Clinical Hoarders

Authors: Thomas Heffernan, Nick Neave, Colin Hamilton, Gill Case

Abstract:

Hoarding is the acquisition of and failure to discard possessions, leading to excessive clutter and significant psychological/emotional distress. From a cognitive-behavioural perspective, excessive hoarding arises from information-processing deficits, as well as from problems with emotional attachment to possessions and beliefs about the nature of possessions. In terms of information processing, hoarders have shown deficits in executive functions, including working memory, planning, inhibitory control, and cognitive flexibility. However, previous research is often confounded by co-morbid factors such as anxiety, depression, or obsessive-compulsive disorder. The current study adopted a cognitive-behavioural approach, specifically assessing executive function and working memory in a non-clinical sample of hoarders, compared with non-hoarders. In this study, a non-clinical sample of 40 hoarders and 73 non-hoarders (defined by the Saving Inventory-Revised) completed the Adult Executive Functioning Inventory, which measures working memory and inhibition; the Dysexecutive Questionnaire-Revised, which measures general executive function; and the Hospital Anxiety and Depression Scale, which measures mood. The participant sample was made up of unpaid young adult volunteers who were undergraduate students and who completed the questionnaires on a university campus. The results revealed that, with no differences observed between hoarders and non-hoarders in age, sex, or mood, hoarders reported significantly more deficits in inhibitory control and general executive function than non-hoarders. There was no between-group difference in general working memory. This suggests that non-clinical hoarders have a specific difficulty with inhibitory control, the ability to resist repeated, unwanted urges, which might explain the hoarder's inability to resist urges to buy and keep items that are no longer of any practical use.
These deficits may be underpinned by general executive function deficiencies.

Keywords: hoarding, memory, executive, deficits

Procedia PDF Downloads 193
15103 Electro-oxidation of Catechol in the Presence of Nicotinamide at Different pH

Authors: M. A. Motin, M. A. Aziz, M. Hafiz Mia, M. A. Hasem

Abstract:

The redox behavior of catechol in the presence of nicotinamide as a nucleophile has been studied in aqueous solution at various pH values and different concentrations of nicotinamide, using cyclic voltammetry and differential pulse voltammetry. Cyclic voltammetry of catechol in buffer solution (3.00 < pH < 9.00) shows one anodic peak and a corresponding cathodic peak, which relate to the transformation of catechol to the corresponding o-benzoquinone and vice versa in a quasi-reversible two-electron transfer process. The cyclic voltammogram of catechol in the presence of nicotinamide in buffer solution of pH 7 shows one anodic peak in the first potential cycle; on the reverse scan, the corresponding cathodic peak slowly decreases and a new peak is observed at a less positive potential. In the second potential cycle, a new anodic peak is observed at a less positive potential. This indicates that nicotinamide attaches to catechol and forms an adduct after the first oxidation cycle. The effect of pH on catechol in the presence of nicotinamide was studied by varying the pH from 3 to 11. The substitution reaction of catechol with nicotinamide is facilitated at pH 7; in buffer solutions of higher pH (>9), the CV shows a different pattern. The effect of the concentration of nicotinamide was studied from 2 mM to 100 mM, and the maximum substitution reaction was found for 50 mM nicotinamide at pH 7. The proportionality of the first-scan anodic and cathodic peak currents to the square root of the scan rate suggests that the peak current of the species at each redox reaction is controlled by a diffusion process. The current function (ip·v-1/2) of the anodic peak decreased with increasing scan rate, demonstrating that the substitution reaction is of the ECE type.
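The diffusion-control diagnostic used above (peak current proportional to the square root of scan rate) is commonly expressed through the Randles-Sevcik equation for a reversible couple at 25 °C; a sketch with illustrative, hypothetical parameter values (not data from the paper):

```python
import math

def randles_sevcik(n, area_cm2, d_cm2_s, conc_mol_cm3, scan_v_s):
    """Randles-Sevcik peak current (in amperes) for a reversible,
    diffusion-controlled process at 25 C:
    i_p = 2.69e5 * n^(3/2) * A * D^(1/2) * C * v^(1/2)
    with A in cm^2, D in cm^2/s, C in mol/cm^3, and v in V/s."""
    return (2.69e5 * n ** 1.5 * area_cm2 * math.sqrt(d_cm2_s)
            * conc_mol_cm3 * math.sqrt(scan_v_s))
```

Quadrupling the scan rate doubles the predicted peak current, which is exactly the √v proportionality the authors use to infer diffusion control.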

Keywords: redox interaction, catechol, nicotinamide, substitution reaction, pH effect

Procedia PDF Downloads 468
15102 MAGNI Dynamics: A Vision-Based Kinematic and Dynamic Upper-Limb Model for Intelligent Robotic Rehabilitation

Authors: Alexandros Lioulemes, Michail Theofanidis, Varun Kanal, Konstantinos Tsiakas, Maher Abujelala, Chris Collander, William B. Townsend, Angie Boisselle, Fillia Makedon

Abstract:

This paper presents a home-based robot-rehabilitation instrument, called "MAGNI Dynamics", that utilizes a vision-based kinematic/dynamic module and an adaptive haptic feedback controller. The system is expected to provide personalized rehabilitation by adjusting its resistive and supportive behavior according to a fuzzy intelligence controller that acts as an inference system correlating the user's performance to different stiffness factors. The vision module uses the Kinect's skeletal tracking to monitor the user's effort in an unobtrusive and safe way, by estimating the torque that affects the user's arm. The system's torque estimations are validated by capturing electromyographic data from primitive hand motions (shoulder abduction and shoulder forward flexion). Moreover, we present and analyze how the Barrett WAM generates a force field with a haptic controller to support or challenge the users. Experiments show that shifting the proportional value, which corresponds to different stiffness factors of the haptic path, can potentially help users improve their motor skills. Finally, potential areas for future research are discussed that address how a rehabilitation robotic framework may include multi-sensing data to improve the user's recovery process.
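The fuzzy mapping from user performance to haptic stiffness can be sketched as a minimal Mamdani-style inference with triangular memberships; the membership breakpoints and stiffness centroids below are hypothetical illustrations, not values from the MAGNI system:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def stiffness_from_performance(score):
    """Hypothetical fuzzy mapping of a normalized performance score in [0, 1]
    to a haptic stiffness factor: poor performance -> more support (stiffer
    guidance), good performance -> more challenge (softer guidance).
    Defuzzification is a centroid-style weighted average of rule outputs."""
    mu = {"low":  tri(score, -0.5, 0.0, 0.5),
          "mid":  tri(score,  0.0, 0.5, 1.0),
          "high": tri(score,  0.5, 1.0, 1.5)}
    centers = {"low": 1.0, "mid": 0.6, "high": 0.2}   # stiffness factors
    num = sum(mu[k] * centers[k] for k in mu)
    den = sum(mu.values()) or 1.0
    return num / den
```

The controller thus interpolates smoothly between "supportive" and "challenging" regimes instead of switching abruptly, which is the behavior the abstract attributes to the fuzzy inference system.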

Keywords: human-robot interaction, kinect, kinematics, dynamics, haptic control, rehabilitation robotics, artificial intelligence

Procedia PDF Downloads 329
15101 Use of Computers and Peripherals in the Archaeological Surveys of Sistan in Eastern Iran

Authors: Mahyar Mehrafarin, Reza Mehrafarin

Abstract:

The Sistan region in eastern Iran is a significant archaeological area in Iran and the Middle East, encompassing 10,000 square kilometers. Previous archaeological field surveys have identified 1662 ancient sites dating from prehistoric periods to the Islamic period. Research Aim: This article aims to explore the utilization of modern technologies and computers in archaeological field surveys in Sistan, Iran, and the benefits derived from their implementation. Methodology: The research employs a descriptive-analytical approach combined with field methods. New technologies and software, such as GPS, drones, magnetometers, equipped cameras, satellite images, and software programs such as GIS, MapSource, and Excel, were utilized to collect information and analyze data. Findings: The use of modern technologies and computers in archaeological field surveys proved to be essential. Traditional archaeological activities, such as excavation and field surveys, are time-consuming and costly. Employing modern technologies helps in preserving ancient sites, accurately recording archaeological data, reducing errors and mistakes, and facilitating correct and accurate analysis. Creating a comprehensive and accessible database, generating statistics, and producing graphic designs and diagrams are additional advantages derived from the use of efficient technologies in archaeology. Theoretical Importance: The integration of computers and modern technologies in archaeology contributes to interdisciplinary collaborations and facilitates the involvement of specialists from various fields, such as geography, history, art history, anthropology, laboratory sciences, and computer engineering. The utilization of computers in archaeology spans diverse areas, including database creation, statistical analysis, graphics, laboratory and engineering applications, and even artificial intelligence, which remains an unexplored area in Iranian archaeology.
Data Collection and Analysis Procedures: Information was collected using modern technologies and software, capturing geographic coordinates, aerial images, archaeogeophysical data, and satellite images. These data were then input into various software programs for analysis, including GIS, MapSource, and Excel. The research employed both descriptive and analytical methods to present findings effectively. Question Addressed: The primary question addressed in this research is how the use of modern technologies and computers in archaeological field surveys in Sistan, Iran, can enhance archaeological data collection, preservation, analysis, and accessibility. Conclusion: The utilization of modern technologies and computers in archaeological field surveys in Sistan, Iran, has proven to be necessary and beneficial. These technologies aid in preserving ancient sites, accurately recording archaeological data, reducing errors, and facilitating comprehensive analysis. The creation of accessible databases, statistics generation, graphic design, and interdisciplinary collaboration are further advantages observed. It is recommended to explore the potential of artificial intelligence in Iranian archaeology as an unexplored area. The research has implications for cultural heritage organizations, archaeology students, and universities involved in archaeological field surveys in Sistan and Baluchistan province. Additionally, it contributes to enhancing the understanding and preservation of Iran's archaeological heritage.

Keywords: archaeological surveys, computer use, Iran, modern technologies, Sistan

Procedia PDF Downloads 78
15100 An Evolutionary Approach for Automated Optimization and Design of Vivaldi Antennas

Authors: Sahithi Yarlagadda

Abstract:

Antenna design is constrained by mathematical and geometrical parameters. Although there are diverse antenna structures with a wide range of feeds, there remain many geometries to be tried that cannot be fitted into predefined computational methods. Antenna design and optimization are well suited to an evolutionary algorithmic approach, since the antenna parameter weights depend directly on geometric characteristics. The evolutionary algorithm can be explained simply for a given quality function to be maximized: we randomly create a set of candidate solutions, elements of the function's domain, and apply the quality function as an abstract fitness measure. Based on this fitness, some of the better candidates are chosen to seed the next generation by applying recombination and mutation to them. In the conventional approach, the quality function is unaltered across iterations, but the antenna parameters and geometries are too varied to fit into a single function. So, weight coefficients are obtained for all possible antenna electrical parameters and geometries, and the variation is learnt by mining the data obtained for an optimized algorithm. The weight and covariance coefficients of the corresponding parameters are logged as datasets for learning and future use. This paper drafts an approach to obtain the requirements to study and methodize the evolutionary approach to automated antenna design, using our past work on the Vivaldi antenna as the test candidate. Antenna parameters such as gain, directivity, etc., are directly governed by geometries, materials, and dimensions. The design equations are noted and evaluated for all possible conditions to obtain maxima and minima for a given frequency band. The boundary conditions are thus obtained prior to implementation, easing the optimization. The implementation mainly aimed to study the practical computational, processing, and design complexities incurred during simulations. HFSS is chosen for simulations and results.
MATLAB is used to generate the computations, combinations, and data logging. MATLAB is also used to apply machine learning algorithms and to plot the data for designing the algorithm. The number of combinations is too large to be tested manually, so the HFSS API is used to call HFSS functions from MATLAB itself, and the MATLAB Parallel Computing Toolbox is used to run multiple simulations in parallel. The aim is to develop an add-in to antenna design software such as HFSS or CST, or a standalone application, to optimize pre-identified common parameters of the wide range of antennas available. In this paper, we have used MATLAB to calculate Vivaldi antenna parameters such as the slot line characteristic impedance, the stripline impedance, the slot line width, the flare aperture size, and the dielectric constant; K-means clustering and a Hamming window are applied to obtain the best test parameters. The HFSS API is used to calculate the radiation, bandwidth, directivity, and efficiency, and the data are logged for applying an evolutionary genetic algorithm in MATLAB. The paper demonstrates the computational weights and the machine learning approach for automated antenna optimization for the Vivaldi antenna.
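The create-evaluate-select-recombine loop described above can be sketched generically (plain Python rather than the paper's MATLAB/HFSS pipeline; the population size, generation count, and mutation scale are arbitrary choices for illustration):

```python
import random

def evolve(fitness, dim=8, pop_size=30, gens=60, seed=1):
    """Minimal genetic algorithm: truncation selection of the better half,
    one-point crossover between two parents, and Gaussian mutation of one
    gene per child, on real-valued genomes. `fitness` is maximized."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]          # keep the better half (elitism)
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, dim)        # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(dim)] += rng.gauss(0, 0.1)   # mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Toy quality function: maximizing -sum(x^2) drives the genome toward zero,
# standing in for a simulated antenna figure of merit from HFSS.
best = evolve(lambda g: -sum(x * x for x in g))
```

In the paper's setting, the fitness evaluation would be an HFSS simulation returning gain, bandwidth, or efficiency for a candidate geometry, with the genome encoding the parameterized antenna dimensions.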

Keywords: machine learning, Vivaldi, evolutionary algorithm, genetic algorithm

Procedia PDF Downloads 110
15099 Role of Zinc Administration in Improvement of Faltering Growth in Egyptian Children at Risk of Environmental Enteric Dysfunction

Authors: Ghada Mahmoud El Kassas, Maged Atta El Wakeel

Abstract:

Background: Environmental enteric dysfunction (EED) is an emerging problem that has flared up in recent decades and become pervasive in infants and children. EED is an asymptomatic villous atrophy of the small bowel that is prevalent in the developing world and is associated with altered intestinal function and integrity. Evidence has suggested that supplementary zinc might ameliorate this damage by reducing gastrointestinal inflammation and may also benefit cognitive development. Objective: We tested whether zinc supplementation improves intestinal integrity, growth, and cognitive function in stunted children predicted to have EED. Methodology: This case-control prospective interventional study was conducted on 120 stunted Egyptian children aged 1-10 years, recruited from the nutrition clinic of the National Research Centre, and 100 age- and gender-matched healthy children as controls. In the primary phase of the study, full history taking, clinical examination, and anthropometric measurements were done, and the standard deviation score (SDS) for all measurements was calculated. Serum markers such as zonulin, endotoxin core antibody (EndoCab), highly sensitive C-reactive protein (hsCRP), alpha-1-acid glycoprotein (AGP), and tumor necrosis factor (TNF), and fecal markers such as myeloperoxidase (MPO), neopterin (NEO), and alpha-1-antitrypsin (AAT), were measured as predictors of EED. Cognitive development was assessed (Bayley or Wechsler scores). Oral zinc at a dosage of 20 mg/d was given to all cases, with follow-up for 6 months, after which the secondary phase of the study repeated the previous clinical, laboratory, and cognitive assessments. Results: Serum and fecal inflammatory markers were significantly higher in cases than in controls. Zonulin (P < 0.01), EndoCab (P < 0.001), and AGP (P < 0.03) decreased markedly in cases by the end of the secondary phase. MPO, NEO, and AAT also showed a significant decline in cases at the end of the study (P < 0.001 for all).
A significant increase in mid-upper arm circumference (MUAC) (P < 0.01), weight-for-age z-score, and skinfold thicknesses (P < 0.05 for both) was detected at the end of the study, while height was not significantly affected. Cases also showed significant improvement in cognitive function in phase 2 of the study. Conclusion: The intestinal inflammatory state related to EED showed marked recovery after zinc supplementation. As a result, anthropometric and cognitive parameters showed obvious improvement with zinc supplementation.

Keywords: stunting, cognitive function, environmental enteric dysfunction, zinc

Procedia PDF Downloads 190
15098 Mechanical Properties of Ternary Metal Nitride Ti1-xTaxN Alloys from First-Principles

Authors: M. Benhamida, Kh. Bouamama, P. Djemia

Abstract:

We investigate by first-principles calculations the composition dependence of the lattice parameter, hardness, and elastic properties of the ternary disordered solid solutions Ti(1-x)Ta(x)N (0 ≤ x ≤ 1) with the B1 rocksalt structure. The calculations use the coherent potential approximation with the exact muffin-tin orbitals (EMTO) method, together with a hardness formula proposed for multicomponent covalent solid solutions. The bulk modulus B shows nearly linear behaviour with composition, whereas C44 and C' = (C11 - C12)/2 are not monotonic. The influence of vacancies on the hardness of the off-stoichiometric transition-metal nitrides TiN(1−x) and TaN(1−x) is also considered.
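For a cubic B1 crystal, the bulk modulus B and the tetragonal shear modulus C' discussed above follow directly from the single-crystal elastic constants. A minimal sketch, using rough TiN-like illustrative values (not the paper's EMTO results):

```python
# Illustrative single-crystal elastic constants (GPa) for a B1 nitride;
# these are literature-style placeholder values, not the paper's results.
C11, C12, C44 = 575.0, 130.0, 163.0

B = (C11 + 2.0 * C12) / 3.0        # bulk modulus for cubic symmetry
C_prime = (C11 - C12) / 2.0        # tetragonal shear modulus C'

# Born mechanical stability criteria for a cubic crystal
stable = (C11 - C12 > 0) and (C11 + 2 * C12 > 0) and (C44 > 0)
print(f"B = {B:.1f} GPa, C' = {C_prime:.1f} GPa, stable = {stable}")
```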

Keywords: transition metal nitride materials, elastic constants, hardness, EMTO

Procedia PDF Downloads 430
15097 Shuffled Structure for 4.225 GHz Antireflective Plates: A Proposal Proven by Numerical Simulation

Authors: Shin-Ku Lee, Ming-Tsu Ho

Abstract:

A newly proposed antireflective selector with a shuffled structure is reported in this paper. The proposed design is made of two different quarter-wavelength (QW) slabs and is numerically shown, through one-dimensional simulation results obtained by the method of characteristics (MOC), to function as an antireflective selector. The two QW slabs are characterized by dielectric constants εᵣA and εᵣB and are uniformly divided into N and N+1 pieces, respectively, which are then shuffled to form an antireflective plate with a B(AB)ᴺ structure, such that there is always one εᵣA piece between two εᵣB pieces. The other is an A(BA)ᴺ structure, where every εᵣB piece is sandwiched between two εᵣA pieces. Both proposed structures are numerically shown to function as QW plates. To allow maximum transmission through the proposed structures, the two dielectric constants are chosen to satisfy (εᵣA)² = εᵣB > 1. The advantages of the proposed structures over traditional anti-reflection coating techniques are that only two materials with two thicknesses are required, and that they can be shuffled to form new QW structures. The design wavelength used to validate the proposed idea is 71 mm, corresponding to a frequency of about 4.225 GHz. The computational results are shown in both the time and frequency domains, revealing that the proposed structures produce minimum reflection around the frequency of interest.
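Frequency-domain reflection curves like those described can be reproduced with the standard transfer-matrix method, a close relative of the MOC simulation used here. The sketch below builds a B(AB)ᴺ stack of quarter-wave slabs in air; the permittivity values and N are illustrative assumptions, and no particular reflection minimum is claimed:

```python
import numpy as np

def reflectance(freq_hz, eps_seq, f0=4.225e9):
    """Normal-incidence power reflectance of a dielectric stack in air,
    each slab a quarter wavelength thick at the design frequency f0."""
    c = 299792458.0
    M = np.eye(2, dtype=complex)
    for eps in eps_seq:
        n = np.sqrt(eps)
        d = c / (4.0 * n * f0)               # quarter-wave thickness
        delta = 2.0 * np.pi * n * d * freq_hz / c
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    n0 = ns = 1.0                            # air on both sides
    r = ((n0 * M[0, 0] + n0 * ns * M[0, 1] - M[1, 0] - ns * M[1, 1]) /
         (n0 * M[0, 0] + n0 * ns * M[0, 1] + M[1, 0] + ns * M[1, 1]))
    return abs(r) ** 2

# B(AB)^N arrangement with eps_B = eps_A**2 (illustrative values)
eps_A, N = 2.0, 3
eps_B = eps_A ** 2
stack = [eps_B] + [eps_A, eps_B] * N
R = reflectance(4.225e9, stack)
print(f"R at 4.225 GHz: {R:.4f}")
```

Sweeping `freq_hz` over a band around 4.225 GHz would reproduce the kind of frequency-domain reflection curve reported in the paper.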

Keywords: method of characteristics, quarter wavelength, anti-reflective plate, propagation of electromagnetic fields

Procedia PDF Downloads 146
15096 The Co-Simulation Interface SystemC/Matlab Applied in JPEG and SDR Application

Authors: Walid Hassairi, Moncef Bousselmi, Mohamed Abid

Abstract:

Functional verification is a major part of today’s system design task. Several approaches are available for verification at a high abstraction level, where designs are often modeled using MATLAB/Simulink. However, differing approaches are a barrier to a unified verification flow. In this paper, we propose a co-simulation interface between SystemC and MATLAB/Simulink to enable functional verification of multi-abstraction-level designs. The resulting verification flow is tested on a JPEG compression algorithm. The required synchronization of both simulation environments, as well as the data type conversion, is solved using the proposed co-simulation flow. We divided the JPEG encoder into two parts: the first, the DCT, is implemented in SystemC and represents the hardware (HW) part; the second, consisting of quantization and entropy encoding, is implemented in MATLAB and represents the software (SW) part. For communication and synchronization between these two parts, we use an S-Function and the MATLAB Engine in Simulink. With this premise, this study introduces a new hardware SystemC implementation of the DCT. We compare our co-simulation results with a pure software (SW/SW) implementation, observing a reduction in simulation time of 88.15% for JPEG, and a design efficiency of 90% for SDR.
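The DCT "hardware" stage of the JPEG encoder split described above can be cross-checked against a reference model in a few lines. A sketch using SciPy's type-II DCT (the variant JPEG uses) on a synthetic 8×8 block; the random block is purely illustrative:

```python
import numpy as np
from scipy.fft import dctn, idctn

# Synthetic 8x8 pixel block, level-shifted as in JPEG (range -128..127)
rng = np.random.default_rng(1)
block = rng.integers(0, 256, size=(8, 8)).astype(float) - 128.0

# 2-D type-II DCT with orthonormal scaling (JPEG's forward transform)
coeffs = dctn(block, type=2, norm="ortho")

# The inverse transform recovers the block, so a SystemC "HW" DCT stage
# and a software reference model can be compared coefficient by coefficient.
recovered = idctn(coeffs, type=2, norm="ortho")
print(np.max(np.abs(recovered - block)))
```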

Keywords: hardware/software, co-design, co-simulation, systemc, matlab, s-function, communication, synchronization

Procedia PDF Downloads 405
15095 Hepatoprotective Effect of Oleuropein against Cisplatin-Induced Liver Damage in Rat

Authors: Salim Cerig, Fatime Geyikoglu, Murat Bakir, Suat Colak, Merve Sonmez, Kubra Koc

Abstract:

Cisplatin (CIS) is one of the most effective anticancer drugs, but it is also toxic to cells through the activation of oxidative stress. Oleuropein (OLE) plays a key role against oxidative stress in mammalian cells, but the role of this antioxidant in the toxicity of CIS remains unknown. The aim of the present study was to investigate the efficacy of OLE against CIS-induced liver damage in male rats. With this aim, male Sprague Dawley rats were randomly assigned to one of eight groups: a control group; a group treated with 7 mg/kg/day CIS; groups treated with 50, 100, and 200 mg/kg/day OLE (i.p.); and groups treated with OLE for three days starting 24 h after CIS injection. After 4 days of injections, serum was collected to assess blood AST, ALT, and LDH values. The liver tissues were removed for histological, biochemical (TAC, TOS, and MDA), and genotoxic evaluations. In the CIS-treated group, the whole liver tissue showed significant histological changes. CIS also significantly increased both the incidence of oxidative stress and the induction of 8-hydroxy-deoxyguanosine (8-OH-dG). Moreover, the rats receiving CIS had abnormal results on liver function tests. However, these parameters returned to the normal range after administration of OLE for 3 days. Overall, OLE demonstrated high protective potential and was effective in attenuating CIS-induced liver injury. In this trial, the 200 mg/kg dose of OLE appeared to induce the optimal protective response.

Keywords: antioxidant response, cisplatin, histology, liver, oleuropein, 8-OhdG

Procedia PDF Downloads 340
15094 Constructing the Joint Mean-Variance Regions for Univariate and Bivariate Normal Distributions: Approach Based on the Measure of Cumulative Distribution Functions

Authors: Valerii Dashuk

Abstract:

The usage of confidence intervals in economics and econometrics is widespread. To investigate a random variable more thoroughly, joint tests are applied; one such example is the joint mean-variance test. A new approach for testing such hypotheses and constructing confidence sets is introduced. Exploring both the value of the random variable and its deviation with the help of this technique allows checking simultaneously the shift and the probability of that shift (i.e., portfolio risks). Another application is based on the normal distribution, which is fully defined by its mean and variance and can therefore be tested using the introduced approach. The method is based on the difference of probability density functions. The starting point is two sets of normal distribution parameters that should be compared (whether they may be considered identical at a given significance level). Then the absolute difference in probabilities at each 'point' of the domain of these distributions is calculated. This measure is transformed into a function of cumulative distribution functions and compared to critical values. The table of critical values was designed from simulations. The approach was compared with other techniques for the univariate case. It differs qualitatively and quantitatively in ease of implementation, computation speed, and accuracy of the critical region (theoretical vs. real significance level). Stable results when working with outliers and non-normal distributions, as well as scaling possibilities, are also strong sides of the method. The main advantage of this approach is the possibility of extending it to the infinite-dimensional case, which was not possible in most previous works. At present, the extension to the two-dimensional case has been completed, allowing up to five parameters to be tested jointly.
The derived technique is therefore equivalent to classic tests in standard situations but gives more efficient alternatives in nonstandard problems and on large amounts of data.
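The core quantity described above, the integrated absolute difference between two normal densities, can be sketched numerically. The critical values themselves came from the authors' simulations and are not reproduced here; the grid width and resolution below are arbitrary choices:

```python
import numpy as np
from scipy import stats

def density_difference(mu1, sd1, mu2, sd2, grid_pts=20001):
    """Integrate |f1 - f2| over a grid covering both normal densities.
    The result is 0 for identical parameters and approaches 2 for
    well-separated distributions (twice the total variation distance)."""
    lo = min(mu1 - 8 * sd1, mu2 - 8 * sd2)
    hi = max(mu1 + 8 * sd1, mu2 + 8 * sd2)
    x = np.linspace(lo, hi, grid_pts)
    f1 = stats.norm.pdf(x, mu1, sd1)
    f2 = stats.norm.pdf(x, mu2, sd2)
    return float(np.abs(f1 - f2).sum() * (x[1] - x[0]))

print(density_difference(0, 1, 0, 1))   # identical parameters
print(density_difference(0, 1, 5, 1))   # well-separated means
```

Comparing this measure against simulated critical values is what turns it into a joint mean-variance test.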

Keywords: confidence set, cumulative distribution function, hypotheses testing, normal distribution, probability density function

Procedia PDF Downloads 174
15093 AI-Driven Solutions for Optimizing Master Data Management

Authors: Srinivas Vangari

Abstract:

In the era of big data, ensuring the accuracy, consistency, and reliability of critical data assets is crucial for data-driven enterprises, and Master Data Management (MDM) plays a central role in this endeavor. This paper investigates the role of Artificial Intelligence (AI) in enhancing MDM, focusing on how AI-driven solutions can automate and optimize various stages of the master data lifecycle. By integrating AI into processes such as data creation, maintenance, enrichment, and usage, organizations can achieve significant improvements in data quality and operational efficiency. Quantitative analysis is employed to measure the impact of AI on key metrics, including data accuracy, processing speed, and error reduction. For instance, our study demonstrates an 18% improvement in data accuracy and a 75% reduction in duplicate records across multiple systems post-AI implementation. Furthermore, AI’s predictive maintenance capabilities reduced data obsolescence by 22%, as indicated by statistical analyses of data usage patterns over a 12-month period. Complementing this, a qualitative analysis delves into the specific AI-driven strategies that enhance MDM practices, such as automating data entry and validation, which resulted in a 28% decrease in manual errors. Insights from case studies highlight how AI-driven data cleansing processes reduced inconsistencies by 25% and how AI-powered enrichment strategies improved data relevance by 24%, thus boosting decision-making accuracy. The findings demonstrate that AI significantly enhances data quality and integrity, leading to improved enterprise performance through cost reduction, increased compliance, and more accurate, real-time decision-making. These insights underscore the value of AI as a critical tool in modern data management strategies, offering a competitive edge to organizations that leverage its capabilities.
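As one concrete illustration of the duplicate-reduction step mentioned above, string-similarity matching can flag candidate duplicate master records before they are merged. A toy sketch; the record fields, weighting, and threshold are hypothetical, and production systems use far richer matching models:

```python
from difflib import SequenceMatcher
from itertools import combinations

# Hypothetical customer master records from two source systems
records = [
    {"id": 1, "name": "Acme Industries Ltd", "city": "Toronto"},
    {"id": 2, "name": "ACME Industries Limited", "city": "Toronto"},
    {"id": 3, "name": "Borealis Foods", "city": "Montreal"},
]

def similarity(a, b):
    """Blend name and city similarity into a single match score."""
    name_sim = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
    city_sim = SequenceMatcher(None, a["city"].lower(), b["city"].lower()).ratio()
    return 0.8 * name_sim + 0.2 * city_sim

THRESHOLD = 0.85  # hypothetical cut-off, tuned on labelled pairs in practice
duplicates = [(a["id"], b["id"])
              for a, b in combinations(records, 2)
              if similarity(a, b) >= THRESHOLD]
print(duplicates)
```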

Keywords: artificial intelligence, master data management, data governance, data quality

Procedia PDF Downloads 17
15092 Characterization of the Corn Cob to Know Its Potential as a Source of Biosilica to Be Used in Sustainable Cementitious Mixtures

Authors: Sandra C. L. Dorea, Joann K. Whalen, Yixin Shao, Oumarou Savadogo

Abstract:

The major challenge for industries that rely on fossil fuels in their manufacturing processes, or to provide goods and services, is to lower their CO2 emissions, as is the case for the manufacture of Portland cement. Feasible materials for this purpose include agro-industrial and agricultural wastes, which are termed 'biosilica' since the silica is contained in a biological matrix (biomass). Corn cob (CC) has characteristics that make it a good candidate as a biosilica source: 1) corn is an abundant grain crop produced around the world; and 2) greater production means more residue is left in the field, available for use. This work evaluates CC collected from different farms in Canada during the corn harvest in order to determine whether material from different sources can be used together as a biosilica source. The raw CC was characterized physically, chemically, and thermally, and the moisture content, granulometry, and morphology were also analyzed. The measured ash content was 2.1%. Thermogravimetric analysis (TGA) and its derivative (DTG) evaluated the weight loss of CC as a function of temperature between 30°C and 800°C in an N2 atmosphere. The chemical composition and the presence of silica revealed that the different sources of CC do not affect its basic chemical composition, which means that this kind of waste can be used together as a biosilica source regardless of origin. This biosilica can then partially replace Portland cement, yielding sustainable cementitious mixtures and contributing to reduced CO2 emissions.

Keywords: biosilica, characterization, corn cob, sustainable cementitious materials

Procedia PDF Downloads 262
15091 Improvement of Process Competitiveness Using Intelligent Reference Models

Authors: Julio Macedo

Abstract:

Several methodologies are now available to conceive the improvements of a process so that it becomes competitive, for example total quality management, process reengineering, Six Sigma, and the define-measure-analyze-improve-control (DMAIC) method. These improvements are of different natures and can be external to the process, which is represented by an optimization model or a discrete simulation model. In addition, the process stakeholders are several and have different desired performances for the process. Hence, the methodologies above lack a tool to aid in the conception of the required improvements. To fill this void, we suggest the use of intelligent reference models. A reference model is a set of qualitative differential equations and an objective function that minimizes the gap between the current and the desired performance indexes of the process. The reference models are intelligent, so when they receive the current state of the problematic process and the desired performance indexes, they generate the required improvements for the problematic process. The reference models are fuzzy cognitive maps augmented with an objective function and trained using the improvements implemented by high-performance firms. Experiments with a group of students show that the reference models allow them to conceive more improvements than students who do not use these models.
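The reference-model core described above, a fuzzy cognitive map iterated to a steady state, can be sketched as follows. The concepts and weight matrix are illustrative inventions, not a trained model from the paper:

```python
import numpy as np

def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

# Illustrative concepts: [training, defects, rework cost, delivery time]
# W[i, j] = causal influence of concept j on concept i (hypothetical weights).
W = np.array([
    [ 0.0, 0.0, 0.0, 0.0],
    [-0.7, 0.0, 0.0, 0.0],   # more training -> fewer defects
    [ 0.0, 0.8, 0.0, 0.0],   # defects drive rework cost
    [ 0.0, 0.4, 0.5, 0.0],   # defects and rework delay delivery
])

state = np.array([0.9, 0.5, 0.5, 0.5])   # candidate improvement: high training
for _ in range(50):
    new_state = sigmoid(W @ state + state)  # common modified-Kosko update
    if np.allclose(new_state, state, atol=1e-6):
        break
    state = new_state
print(np.round(state, 3))
```

In the intelligent reference model, an objective function over the converged activations, fitted to high-performance firms, would then score candidate improvements.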

Keywords: continuous improvement, fuzzy cognitive maps, process competitiveness, qualitative simulation, system dynamics

Procedia PDF Downloads 87
15090 Comparative Analysis of Islamic Bank in Indonesia and Malaysia with Risk Profile, Good Corporate Governance, Earnings, and Capital Method: Performance of Business Function and Social Function Perspective

Authors: Achsania Hendratmi, Nisful Laila, Fatin Fadhilah Hasib, Puji Sucia Sukmaningrum

Abstract:

This study aims to compare Islamic banks in Indonesia with Islamic banks in Malaysia using the RGEC method (Risk Profile, Good Corporate Governance, Earnings, and Capital). It examines the business and social performance of eleven Islamic banks in Indonesia and fifteen Islamic banks in Malaysia. The research used a quantitative approach; the data consist of all annual reports of the sampled banks over the period 2011-2015. The results of the independent-samples t-test and Mann-Whitney test showed differences in the business performance of Islamic banks in Indonesia and Malaysia as seen from the risk profile (FDR), GCG, and earnings (ROA) aspects. There were also differences in business and social performance as seen from the earnings (ROE), capital (CAR), and sharia conformity indicator (PSR and ZR) aspects.
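The two-sample comparisons behind these findings can be sketched directly with SciPy. The ROA figures below are synthetic placeholders, not the study's data:

```python
import numpy as np
from scipy import stats

# Synthetic return-on-assets (%) for two groups of Islamic banks
roa_indonesia = np.array([1.8, 2.1, 1.5, 2.4, 1.9, 2.2, 1.7, 2.0, 2.3, 1.6, 2.5])
roa_malaysia = np.array([1.0, 1.2, 0.9, 1.3, 1.1, 0.8, 1.4, 1.0, 1.2, 0.9,
                         1.1, 1.3, 0.7, 1.2, 1.0])

# Independent-samples t-test (parametric) ...
t_stat, t_p = stats.ttest_ind(roa_indonesia, roa_malaysia, equal_var=False)
# ... and Mann-Whitney U test (nonparametric), as used in the study
u_stat, u_p = stats.mannwhitneyu(roa_indonesia, roa_malaysia,
                                 alternative="two-sided")
print(f"t-test p = {t_p:.4g}, Mann-Whitney p = {u_p:.4g}")
```

A p-value below the chosen significance level on either test indicates a difference between the two groups on that indicator.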

Keywords: business performance, Islamic banks, RGEC, social performance

Procedia PDF Downloads 294
15089 Local Spectrum Feature Extraction for Face Recognition

Authors: Muhammad Imran Ahmad, Ruzelita Ngadiran, Mohd Nazrin Md Isa, Nor Ashidi Mat Isa, Mohd ZaizuIlyas, Raja Abdullah Raja Ahmad, Said Amirul Anwar Ab Hamid, Muzammil Jusoh

Abstract:

This paper presents two techniques, local feature extraction using the image spectrum and low-frequency spectrum modelling using a GMM, to capture the underlying statistical information and improve the performance of a face recognition system. Local spectrum features are extracted using an overlapping sub-block window mapped onto the face image. For each block, the spatial domain is transformed to the frequency domain using the DFT. Low-frequency coefficients are preserved by applying a rectangular mask to the spectrum of the facial image, discarding the high-frequency coefficients. Low-frequency information is non-Gaussian in the feature space; by using a combination of several Gaussian functions with different statistical properties, the best feature representation can be modelled as a probability density function. The recognition process is performed using the maximum likelihood value computed from pre-calculated GMM components. The method is tested on the FERET data sets and achieves a 92% recognition rate.
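The feature-extraction stage described above can be sketched with NumPy: slide an overlapping window, take each block's 2-D DFT, and keep a rectangular low-frequency region. Block size, step, and mask size below are assumptions, not the paper's settings:

```python
import numpy as np

def local_spectrum_features(image, block=16, step=8, keep=4):
    """Slide an overlapping window over the image, take the 2-D DFT of
    each block, and keep only a keep x keep corner of low-frequency
    magnitudes (a rectangular mask), as in the described pipeline."""
    h, w = image.shape
    feats = []
    for r in range(0, h - block + 1, step):
        for c in range(0, w - block + 1, step):
            spec = np.fft.fft2(image[r:r + block, c:c + block])
            low = np.abs(spec[:keep, :keep])   # low-frequency corner incl. DC
            feats.append(low.ravel())
    return np.array(feats)

rng = np.random.default_rng(2)
face = rng.random((64, 64))                    # stand-in for a face image
X = local_spectrum_features(face)
print(X.shape)   # one low-frequency feature vector per block
```

A Gaussian mixture model (e.g. scikit-learn's `GaussianMixture`) would then be fitted per subject on these vectors, with recognition by maximum likelihood.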

Keywords: local features modelling, face recognition system, Gaussian mixture models, Feret

Procedia PDF Downloads 667
15088 Deepnic, A Method to Transform Each Variable into Image for Deep Learning

Authors: Nguyen J. M., Lucas G., Brunner M., Ruan S., Antonioli D.

Abstract:

Deep learning based on convolutional neural networks (CNNs) is a very powerful technique for classifying information from an image. We propose a new method, DeepNic, to transform each variable of a tabular dataset into an image in which each pixel represents a set of conditions that allow the variable to make an error-free prediction. The contrast of each pixel is proportional to its prediction performance, and the color of each pixel corresponds to a sub-family of NICs. NICs are probabilities that depend on the number of inputs to each neuron and the range of coefficients of the inputs. Each variable can therefore be expressed as a function of a matrix of two vectors corresponding to an image whose pixels express predictive capabilities. Our objective is to transform each variable of tabular data into an image that can be analysed by CNNs, unlike other methods, which use all the variables to construct a single image. We analyse the NIC information of each variable and express it as a function of the number of neurons and the range of coefficients used. The predictive value and the category of the NIC are expressed by the contrast and the color of the pixel. We have developed a pipeline to implement this technology and have successfully applied it to genomic expressions on an Affymetrix chip.

Keywords: tabular data, deep learning, perfect trees, NICS

Procedia PDF Downloads 90
15087 Biofuel Potential and Invasive Species Control: Exploring Prosopis Juliflora Pod Mash for Sustainable Energy Production

Authors: Mebrahtu Haile

Abstract:

Fuels obtained from renewable resources have garnered significant enthusiasm in recent decades due to concerns about fossil fuel depletion and climate change. This study aimed to investigate the potential of Prosopis juliflora pod mash for bio-ethanol production and of its hydrolysis solid waste as a solid fuel. Various parameters, such as acid concentration, hydrolysis time, fermentation time, fermentation temperature, and pH, were evaluated for their impact on bio-ethanol production using Saccharomyces cerevisiae yeast. The results showed that increasing the acid concentration (up to 1 molar H₂SO₄) led to an increase in sugar content, reaching a maximum of 96.13% v/v. Optimal conditions for bio-ethanol production were found at 1 molar H₂SO₄ concentration (4.2% v/v), 48 hours fermentation time (5.1% v/v), 20 minutes hydrolysis time (5.57% v/v), 30°C fermentation temperature (5.57% v/v), and pH 5 (6.01% v/v), resulting in a maximum bio-ethanol yield of 6.01% v/v. The solid waste remaining after bio-ethanol production exhibited potential for use as a solid fuel, with a calorific value of 18.22 MJ/kg. These findings demonstrate the promising potential of Prosopis juliflora pod mash for bio-ethanol production and suggest a viable solution to the disposal challenges associated with the solid waste, contributing to the exploration of renewable fuel sources in the face of fossil fuel depletion and climate change.

Keywords: prosopis juliflora, pods mash, invasive species, bio-ethanol, fermentation, Saccharomyces cerevisiae, solid fuel

Procedia PDF Downloads 33
15086 Generalized Extreme Value Regression with Binary Dependent Variable: An Application for Predicting Meteorological Drought Probabilities

Authors: Retius Chifurira

Abstract:

The logistic regression model is the most widely used regression model for predicting meteorological drought probabilities. When the dependent variable is extreme, the logistic model fails to adequately capture drought probabilities. In order to adequately predict drought probabilities, we use the generalized linear model (GLM) with the quantile function of the generalized extreme value distribution (GEVD) as the link function. The method of maximum likelihood estimation is used to estimate the parameters of the generalized extreme value (GEV) regression model. We compare the performance of the logistic and GEV regression models in predicting drought probabilities for Zimbabwe. The performance of the regression models is assessed using goodness-of-fit measures, namely the relative root mean square error (RRMSE) and the relative mean absolute error (RMAE). Results show that the GEV regression model performs better than the logistic model, thereby providing a good alternative for predicting drought probabilities. This paper provides the first application of a GLM derived from extreme value theory to predict drought probabilities for a drought-prone country such as Zimbabwe.
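A GEV-link binary regression of the kind described can be sketched by using the GEV distribution function as the inverse link and maximizing the Bernoulli likelihood numerically. The data, starting values, and shape parameter below are synthetic assumptions, not the Zimbabwe rainfall series:

```python
import numpy as np
from scipy.optimize import minimize

def gev_cdf(z, xi):
    """GEV distribution function, used here as the inverse link."""
    if abs(xi) < 1e-8:                      # Gumbel limit (log-log link)
        return np.exp(-np.exp(-z))
    t = np.clip(1.0 + xi * z, 1e-10, None)  # enforce the GEV support
    return np.exp(-t ** (-1.0 / xi))

def neg_log_lik(params, x, y):
    b0, b1, xi = params
    p = np.clip(gev_cdf(b0 + b1 * x, xi), 1e-10, 1 - 1e-10)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

# Synthetic drought indicator vs. standardized annual rainfall anomaly
rng = np.random.default_rng(3)
x = rng.normal(size=400)
p_true = gev_cdf(-0.5 + 1.5 * x, 0.2)
y = (rng.random(400) < p_true).astype(float)

fit = minimize(neg_log_lik, x0=[0.0, 1.0, 0.1], args=(x, y),
               method="Nelder-Mead")
print(fit.x)   # estimated (intercept, slope, shape)
```

Setting the shape parameter to zero recovers the complementary log-log-style Gumbel link, so the logistic comparison model can share the same fitting code.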

Keywords: generalized extreme value distribution, general linear model, mean annual rainfall, meteorological drought probabilities

Procedia PDF Downloads 200
15085 TiO₂ Nanotube Array Based Selective Vapor Sensors for Breath Analysis

Authors: Arnab Hazra

Abstract:

Breath analysis is a quick, noninvasive, and inexpensive technique for disease diagnosis that can be used on people of all ages without any risk. A limited number of volatile organic compounds (VOCs) can be associated with the occurrence of specific diseases; these VOCs can be considered disease markers or breath markers. Selective detection of a breath marker at a specific concentration in exhaled human breath is required to detect a particular disease. For example, acetone (C₃H₆O), ethanol (C₂H₅OH), and ethane (C₂H₆) are breath markers whose abnormal concentrations in exhaled human breath indicate diseases such as diabetes mellitus, renal failure, and breast cancer, respectively. Nanomaterial-based vapor sensors are inexpensive, small, and potential candidates for the detection of breath markers. In practical measurement, selectivity is the most crucial issue, as a trace amount of a breath marker must be identified accurately in the presence of several interfering vapors and gases. The current article concerns a novel technique for selective, low-ppb-level detection of breath markers at very low temperature based on TiO₂ nanotube array vapor sensor devices. A highly ordered and oriented TiO₂ nanotube array was synthesized by electrochemical anodization of high-purity titanium (Ti) foil. The electrolyte consisted of 0.5 wt% NH₄F in ethylene glycol with 10 vol% H₂O, and anodization was carried out for 90 min at a DC potential of 40 V. An Au/TiO₂ nanotube/Ti sandwich-type sensor device was fabricated for the selective detection of VOCs in the low concentration range. Initially, the sensor was characterized by recording its resistive and capacitive changes within the valid concentration range for individual breath markers (or organic vapors): sensor resistance decreased and sensor capacitance increased with increasing vapor concentration.
The ratio of the resistive slope (mR) to the capacitive slope (mC) then provides a concentration-independent constant term (M) for a particular vapor. For an unknown vapor, the ratio of the resistive change to the capacitive change at any concentration matches the previously calculated constant term (M). After successful identification of the target vapor, the concentration is calculated from the straight-line behavior of resistance as a function of concentration. The technique is suitable for detecting a particular vapor within a mixture of other interfering vapors.
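The two-step identification just described, match the slope ratio, then read the concentration off the resistive calibration line, can be sketched as follows. All calibration numbers are invented for illustration:

```python
# Hypothetical calibration library: per-vapor response slopes. The
# resistive slope mR is negative (resistance falls with concentration);
# the capacitive slope mC is positive (capacitance rises).
library = {
    "acetone": {"mR": -4.0, "mC": 2.0},   # M = mR/mC = -2.0
    "ethanol": {"mR": -3.0, "mC": 6.0},   # M = -0.5
}

def identify(delta_R, delta_C, library, tol=0.1):
    """Match an unknown vapor by the concentration-independent ratio M,
    then estimate concentration from the resistive calibration line."""
    ratio = delta_R / delta_C
    for vapor, cal in library.items():
        M = cal["mR"] / cal["mC"]
        if abs(ratio - M) < tol:
            return vapor, delta_R / cal["mR"]
    return None, None

# Unknown exposure: readings consistent with 30 ppb of "acetone"
vapor, ppb = identify(delta_R=-120.0, delta_C=60.0, library=library)
print(vapor, ppb)
```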

Keywords: breath marker, vapor sensors, selective detection, TiO₂ nanotube array

Procedia PDF Downloads 155
15084 A Fermatean Fuzzy MAIRCA Approach for Maintenance Strategy Selection of Process Plant Gearbox Using Sustainability Criteria

Authors: Soumava Boral, Sanjay K. Chaturvedi, Ian Howard, Kristoffer McKee, V. N. A. Naikan

Abstract:

Due to strict government regulations aimed at enhancing sustainability practices in industry, and noting the advances in sustainable manufacturing, it is necessary that the associated processes are also sustainable. Maintenance of large-scale and complex machines is a pivotal task in maintaining the uninterrupted flow of manufacturing processes. Appropriate maintenance practices can prolong the lifetime of machines and prevent breakdowns, which subsequently reduces several cost heads. Selecting the best maintenance strategy for such machines is a burdensome task, as it requires the consideration of multiple technical criteria, complex mathematical calculations, previous fault data, maintenance records, etc. In the era of the fourth industrial revolution, organizations are rapidly changing their way of doing business, giving utmost importance to sensor technologies, artificial intelligence, data analytics, automation, etc. In this work, the effectiveness of several maintenance strategies (e.g., preventive, failure-based, reliability-centered, condition-based, and total productive maintenance) for a large and complex gearbox operating in a steel processing plant is evaluated in terms of economic, social, environmental, and technical criteria. As it is not possible to describe some criteria by exact numerical values, these criteria are evaluated linguistically by cross-functional experts. Fuzzy sets are a potential soft-computing technique that has been useful for dealing with linguistic data and providing inferences in many complex situations. To prioritize different maintenance practices based on the identified sustainability criteria, multi-criteria decision-making (MCDM) approaches can be considered as potential tools.
Multi-Attributive Ideal Real Comparative Analysis (MAIRCA) is a recent addition to the MCDM family and has proven its superiority over some well-known MCDM approaches, such as TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) and ELECTRE (ELimination Et Choix Traduisant la REalité). It has a simple but robust mathematical basis that is easy to comprehend. On the other hand, due to some inherent drawbacks of intuitionistic fuzzy sets (IFSs) and Pythagorean fuzzy sets (PFSs), the use of Fermatean fuzzy sets (FFSs) has recently been proposed. In this work, we propose the novel concept of FF-MAIRCA. We obtain the weights of the criteria from experts' evaluations and use them to prioritize the different maintenance practices according to their suitability using the FF-MAIRCA approach. Finally, a sensitivity analysis is carried out to highlight the robustness of the approach.
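The crisp core of MAIRCA (before the Fermatean fuzzy extension) is compact enough to sketch: equal a-priori preference across alternatives, theoretical ratings from the weights, real ratings from normalized performance, and ranking by total gap. The decision matrix, weights, and criterion types below are hypothetical:

```python
import numpy as np

# Hypothetical scores of 4 maintenance strategies on 3 criteria
# (cost, downtime, environmental impact); lower is better for all three.
X = np.array([
    [120.0, 30.0, 0.6],
    [ 90.0, 45.0, 0.4],
    [150.0, 20.0, 0.7],
    [100.0, 35.0, 0.3],
])
w = np.array([0.5, 0.3, 0.2])              # criteria weights (sum to 1)
benefit = np.array([False, False, False])  # all cost-type criteria here

m = X.shape[0]
Tp = w / m                                 # theoretical ratings (equal priors)

# Normalize to [0, 1] so that 1 is the best value on each criterion
span = X.max(axis=0) - X.min(axis=0)
norm = np.where(benefit, (X - X.min(axis=0)) / span,
                         (X.max(axis=0) - X) / span)

Tr = Tp * norm                             # real ratings
Q = (Tp - Tr).sum(axis=1)                  # total gap per alternative
ranking = np.argsort(Q)                    # smallest gap = best strategy
print(Q.round(4), ranking)
```

In FF-MAIRCA, the crisp entries would be replaced by Fermatean fuzzy evaluations defuzzified via a score function before the same gap computation.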

Keywords: Fermatean fuzzy sets, Fermatean fuzzy MAIRCA, maintenance strategy selection, sustainable manufacturing, MCDM

Procedia PDF Downloads 138