Search results for: input mode
1565 Determination of MDA by HPLC in Blood of Levofloxacin Treated Rats
Authors: D. S. Mohale, A. P. Dewani, A. S. Tripathi, A. V. Chandewar
Abstract:
The present work demonstrates the applicability of high-performance liquid chromatography (HPLC) with UV-Vis detection for the in-vivo quantification of malondialdehyde as the malondialdehyde-thiobarbituric acid complex (MDA-TBA) in rats. The HPLC method for MDA-TBA was achieved in isocratic mode on a reverse-phase C18 column (250 mm × 4.6 mm) at a flow rate of 1.0 mL/min, followed by detection at 532 nm. The chromatographic conditions were optimized by varying the concentration and pH of the water, followed by changes in the percentage of the organic phase. The optimal mobile phase consisted of a mixture of water (0.2% triethylamine, pH adjusted to 2.3 with ortho-phosphoric acid) and acetonitrile in a ratio of 80:20 (v/v). The retention time of the MDA-TBA complex was 3.7 min. The developed method was sensitive, as the limits of detection and quantification (LOD and LOQ) for the MDA-TBA complex, calculated from the standard deviation and slope of the calibration curve, were 110 ng/mL and 363 ng/mL, respectively. Calibration studies were done by spiking MDA into rat plasma at concentrations ranging from 500 to 1000 ng/mL. The precision of the developed method, measured in terms of relative standard deviations for intra-day and inter-day studies, was 1.6–5.0% and 1.9–3.6%, respectively. The HPLC method was applied for monitoring MDA levels in rats subjected to chronic treatment with levofloxacin (LEV) (5 mg/kg/day) for 21 days. Results were compared with findings in control group rats. Mean peak areas of both study groups were subjected to an unpaired Student's t-test to find p-values. The p-value was <0.001, indicating significant results and suggesting increased MDA levels in rats subjected to chronic treatment with LEV for 21 days.
Keywords: malondialdehyde-thiobarbituric acid complex, levofloxacin, HPLC, oxidative stress
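The LOD and LOQ figures quoted above follow the usual calibration-curve relations (LOD = 3.3·σ/S, LOQ = 10·σ/S). A minimal sketch, with illustrative values of σ and S that are assumptions, not the study's actual regression output:

```python
# ICH-style LOD/LOQ from calibration-curve statistics.
def lod(sigma, slope):
    # Limit of detection: 3.3 * (SD of response) / (slope of calibration curve)
    return 3.3 * sigma / slope

def loq(sigma, slope):
    # Limit of quantification: 10 * (SD of response) / (slope of calibration curve)
    return 10.0 * sigma / slope

sigma, slope = 33.0, 1.0  # illustrative: response SD, response per ng/mL
print(lod(sigma, slope))  # 108.9 ng/mL, same order as the reported 110 ng/mL
print(loq(sigma, slope))  # 330.0 ng/mL
```

Note that the reported LOQ/LOD ratio (363/110 ≈ 3.3) is close to the 10/3.3 ratio these relations imply.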
Procedia PDF Downloads 334
1564 Trend and Causes of Decline in Trifoliate Yam (Dioscorea dumetorum) Production in Enugu State, Nigeria: Implication for Food Security and Biodiversity Conservation
Authors: J. C. Iwuchukwu, K. C. Okwor
Abstract:
In recent times in the study area, yam farmers have been moving into less laborious and more economical crops, and very few yam farmers are growing trifoliate yam. In yam markets, little or no bitter yam is displayed or sold. The work was therefore designed to ascertain the trend and causes of decline in trifoliate yam production in Enugu State. Three agricultural zones, six blocks, eighteen circles and one hundred and eight trifoliate yam farmers that were purposively selected constituted the sample for the study. An interview schedule was used to collect data, while percentage, mean score and standard deviation were used for data analysis. Findings of the study revealed that the respondents had no extension contact. The majority (90.7%) sourced information on trifoliate yam from neighbours/friends/relatives and produced mainly for consumption (67.6%) during the rainy season (70.4%). Trifoliate yam was produced manually (71.3%) and organically (58.3%) in a mixture with other crops (87%) using indigenous/local varieties (73.1%). The mean size of land allocated to trifoliate yam production was relatively steady, while the mean cost of input and income were increasing and output was decreasing within the years under consideration (before 2001 to 2014). Poor/lack of finance (M=1.8) and the drudgery associated with trifoliate yam production (M=1.72) were some of the causes of decline in trifoliate yam production in the area. The study recommended that more research and public enlightenment campaigns on the importance of trifoliate yam should be carried out to encourage and consolidate farmers' and the public's efforts in production and consumption of the crop, so that it will not go extinct and can continue to contribute to food security.
Keywords: causes, decline, trend, trifoliate yam
Procedia PDF Downloads 402
1563 Foresight in Food Supply System in Bogota
Authors: Suarez-Puello Alejandro, Baquero-Ruiz Andrés F, Suarez-Puello Rodrigo
Abstract:
This paper discusses the results of a foresight exercise which analyzes Bogota's fruit, vegetable, and tuber supply chain strategy, described in the Food Supply and Security Master Plan (FSSMP), to provide the inhabitants of Bogotá, Colombia, with basic food products at a fair price. The methodology consisted of using quantitative and qualitative foresight tools, such as system dynamics and variable selection methods, to better represent interactions among stakeholders and obtain more integral results that could shed light on this complex situation. At first, the Master Plan is an input to establish the objectives and scope of the exercise. Then, stakeholders and their relationships are identified. Later, system dynamics is used to model product, information, and money flow along the fruit, vegetable, and tuber supply chain. Two scenarios are presented, discussing actions by the public sector and the reactions that could be expected from the whole food supply system. Finally, these impacts are compared to the Food Supply and Security Master Plan's objectives, suggesting recommendations that could improve its execution. This foresight exercise, performed at a governmental level, is intended to widen the use of foresight as an anticipatory, decision-making tool that offers solutions to complex problems.
Keywords: decision making, foresight, public policies, supply chain, system dynamics
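The system-dynamics modeling the abstract mentions can be sketched as a single Euler-integrated stock fed by an inflow and drained by an outflow; the structure and numbers below are assumptions for illustration, not the exercise's actual model:

```python
# Minimal stock-and-flow sketch: an "inventory" stock with constant
# supply inflow and demand outflow, integrated with Euler steps.
def simulate(supply, demand, inventory0, steps, dt=1.0):
    inventory = inventory0
    trajectory = [inventory]
    for _ in range(steps):
        inventory += (supply - demand) * dt  # net flow into the stock
        trajectory.append(inventory)
    return trajectory

# Net flow of +20 units per step starting from 50 units in stock.
print(simulate(supply=120.0, demand=100.0, inventory0=50.0, steps=3))
# [50.0, 70.0, 90.0, 110.0]
```

Real system-dynamics models chain many such stocks with nonlinear, feedback-dependent flows; this only shows the integration mechanic.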
Procedia PDF Downloads 439
1562 Automated End-to-End Pipeline Processing Solution for Autonomous Driving
Authors: Ashish Kumar, Munesh Raghuraj Varma, Nisarg Joshi, Gujjula Vishwa Teja, Srikanth Sambi, Arpit Awasthi
Abstract:
Autonomous driving vehicles are revolutionizing the transportation system of the 21st century. This has been possible due to intensive research put into making a robust, reliable, and intelligent program that can perceive and understand its environment and make decisions based on that understanding. It is a very data-intensive task, with data coming from multiple sensors, and the amount of data directly affects the performance of the system. Researchers have to design the preprocessing pipeline for different datasets with different sensor orientations and alignments before the dataset can be fed to the model. This paper proposes a solution that provides a method to unify all the data from different sources into a uniform format using the intrinsic and extrinsic parameters of the sensor used to capture the data, allowing the same pipeline to use data from multiple sources at a time. This also means easy adoption of new datasets or in-house generated datasets. The solution also automates the complete deep learning pipeline from preprocessing to post-processing for various tasks, allowing researchers to design multiple custom end-to-end pipelines. Thus, the solution takes care of the input and output data handling, saving the time and effort spent on it and allowing more time for model improvement.
Keywords: augmentation, autonomous driving, camera, custom end-to-end pipeline, data unification, lidar, post-processing, preprocessing
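The unification step described above amounts to mapping each sensor's measurements into a common vehicle frame using that sensor's extrinsic parameters (rotation R and translation t). A minimal sketch with an assumed calibration (a 90-degree yaw plus a 1 m forward offset), not any dataset's real parameters:

```python
# Transform a 3D point from a sensor frame into a common vehicle frame:
# p_vehicle = R * p_sensor + t
def transform(point, R, t):
    x, y, z = point
    return tuple(R[i][0] * x + R[i][1] * y + R[i][2] * z + t[i]
                 for i in range(3))

# Assumed extrinsics: 90-degree rotation about z, 1 m offset along x.
R = [[0, -1, 0],
     [1,  0, 0],
     [0,  0, 1]]
t = [1.0, 0.0, 0.0]

print(transform((2.0, 0.0, 0.5), R, t))  # (1.0, 2.0, 0.5)
```

Once every sensor's points live in the same frame, one preprocessing pipeline can consume them regardless of origin; camera data additionally needs the intrinsic matrix for projection, which this sketch omits.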
Procedia PDF Downloads 123
1561 A.T.O.M. - Artificial Intelligent Omnipresent Machine
Authors: R. Kanthavel, R. Yogesh Kumar, T. Narendrakumar, B. Santhosh, S. Surya Prakash
Abstract:
This paper primarily focuses on developing an affordable personal assistant and implementing it in the field of Artificial Intelligence (AI) to create a virtual assistant/friend. The problem with existing home automation techniques is that they require the usage of the exact command words present in the database to execute the corresponding task. Our proposed work is ATOM, a.k.a. 'Artificial intelligence Talking Omnipresent Machine'. Our inspiration came from an unlikely source: the movie 'Iron Man', in which a character called J.A.R.V.I.S. has omnipresence and device-controlling capability. This device can control household devices in real time and send live information to the user. This device does not require the user to utter the exact commands specified in the database, as it can capture the keywords from the uttered commands, correlate the obtained keywords, and perform the specified task. This ability to compare and correlate keywords gives the user the liberty to give commands which are not necessarily the exact words provided in the database. The proposed work has higher flexibility (due to its ability to extract keywords from the user input) compared to the existing work, Intelligent Home Automation System (IHAS), is more accurate, and is much more affordable, as it makes use of a Wi-Fi module and a Raspberry Pi 2 instead of ZigBee and a computer, respectively.
Keywords: home automation, speech recognition, voice control, personal assistant, artificial intelligence
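The keyword-capture idea can be sketched as scoring each task by how many of its keywords appear in the utterance; the keyword table and task names below are assumptions for illustration, not the system's actual database:

```python
# Map a free-form command to a task by keyword overlap rather than
# exact command matching.
TASKS = {
    frozenset({"turn", "on", "light"}): "light_on",
    frozenset({"turn", "off", "light"}): "light_off",
    frozenset({"fan", "on"}): "fan_on",
}

def resolve(command):
    words = set(command.lower().split())
    # Pick the task whose keyword set overlaps the utterance the most.
    best = max(TASKS, key=lambda keys: len(keys & words))
    if not (best & words):
        return None  # no keyword matched at all
    return TASKS[best]

print(resolve("could you please turn the light on"))  # light_on
print(resolve("switch the fan on please"))            # fan_on
```

This is why the user need not utter the exact database phrase: any utterance containing enough of a task's keywords resolves to that task.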
Procedia PDF Downloads 336
1560 1-g Shake Table Tests to Study the Impact of PGA on Foundation Settlement in Liquefiable Soil
Authors: Md. Kausar Alam, Mohammad Yazdi, Peiman Zogh, Ramin Motamed
Abstract:
Liquefaction-induced ground settlement has caused severe damage to structures in past decades. The amount of building settlement caused by liquefaction is directly proportional to the intensity of the ground shaking. To reduce this soil liquefaction effect, it is essential to examine the influence of peak ground acceleration (PGA). Unfortunately, limited studies have been carried out on this issue. In this study, a series of moderate-scale 1-g shake table experiments were conducted at the University of Nevada, Reno to evaluate the influence of PGA, for the same shaking duration, on liquefiable soil layers. The model was prepared based on a large-scale shake table test, with a scaling factor of N = 5, that had been conducted at the University of California, San Diego. The model ground has three soil layers, with relative densities of 50% for the crust, 30% for the liquefiable layer, and 90% for the dense layer, respectively. In addition, a shallow foundation is seated over the unsaturated crust layer. After preparing the model, input motions with various peak ground accelerations (0.16g, 0.25g, and 0.37g) and the same duration (10 s) were applied. Based on the experimental results, when the PGA increased from 0.16g to 0.37g, the foundation settlement increased from 20 mm to 100 mm. In addition, the expected foundation settlement based on the scaling factor was 25 mm, while the actual settlement for a PGA of 0.25g over 10 seconds was 50 mm.
Keywords: foundation settlement, liquefaction, peak ground acceleration, shake table test
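With a length-scale factor of N = 5, prototype settlements map to model settlements by division by N, assuming simple geometric similitude (the study's full 1-g scaling-law set is not stated in the abstract):

```python
# Geometric-similitude sketch: model length = prototype length / N.
def model_settlement(prototype_settlement_mm, N):
    return prototype_settlement_mm / N

# A 125 mm prototype settlement with N = 5 maps to the 25 mm "expected"
# value quoted in the abstract (prototype value assumed for illustration).
print(model_settlement(125.0, 5))  # 25.0
```

The observed 50 mm at PGA 0.25g, against the 25 mm similitude expectation, illustrates the deviation the authors report.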
Procedia PDF Downloads 77
1559 Study of the Complexes (CO)3Ti(PHND) and CpV(PHND) (PHND = Phenanthridine)
Authors: Akila Tayeb-Benmachiche, Saber-Mustapha Zendaoui, Salah-Eddine Bouaoud, Bachir Zouchoune
Abstract:
The variation of the metal coordination site in π-coordinated polycyclic aromatic hydrocarbons (PAHs) corresponds to haptotropic rearrangement or haptotropic migration, in which the metal fragment MLn is considered the moveable moiety that is shifted between two rings of polycyclic or heteropolycyclic ligands. These structural characteristics and dynamical properties give this category of transition metal complexes considerable interest. We have investigated the coordination and the haptotropic shifts of the (CO)3Ti and CpV moieties over the phenanthridine aromatic system, according to the nature of the metal atom. The optimization of (CO)3Ti(PHND) and CpV(PHND), using the Amsterdam Density Functional (ADF) program without symmetry restrictions on the geometry, gives an η6 coordination mode of the C6 and C5N rings, which in turn gives rise to six low-lying electron-deficient 16-MVE structures for each of (CO)3Ti(PHND) and CpV(PHND) (three singlet and three triplet state structures for the Ti complexes, and three triplet and three quintet state structures for the V complexes). Thus, the η6–η6 haptotropic migration of the metal fragment MLn from the terminal C6 ring to the central C5N ring is achieved with a loss of energy, whereas its η6–η6 haptotropic migration from the central C5N ring to the terminal C6 rings is accomplished with a gain of energy. These results show the capability of the phenanthridine ligand to adapt itself to the electronic demand of the metal, in agreement with the nature of the metal–ligand bonding, and demonstrate that this theoretical study can also be applied to large fused π-systems.
Keywords: electronic structure, bonding analysis, density functional theory, coordination chemistry, haptotropic migration
Procedia PDF Downloads 301
1558 Thermal Performance of Plate-Fin Heat Sink with Lateral Perforation
Authors: Sakkarin Chingulpitak, Somchai Wongwises
Abstract:
Over the past several decades, the development of electronic devices has led to higher performance. Therefore, an electronic cooling system is important for the electronic device. The heat sink, which is part of the electronic cooling system, is continuously studied in the research field to enhance heat transfer. To the authors' best knowledge, there have been only a few articles reporting the thermal performance of plate-fin heat sinks with perforation. This research aims to study the flow and heat transfer characteristics of a solid-fin heat sink (SFHS) and laterally perforated plate-fin heat sinks (LAP-PFHS). The SFHS and LAP-PFHSs are investigated with the same fin dimensions. The LAP-PFHSs have 27 perforations and two different diameters of circular perforation (3 mm and 5 mm). The experimental study is conducted under various Reynolds numbers from 900 to 2,000 and a heat input of 50 W. The experimental results show that the LAP-PFHS with a perforation diameter of 5 mm gives a minimum thermal resistance about 25% lower than the SFHS. The thermal performance factor, which takes into account the ratio of Nusselt numbers and the ratio of friction factors, is used to find suitable design parameters. The experimental results show that the LAP-PFHS with a perforation diameter of 3 mm provides a thermal performance 15% greater than the SFHS. In addition, a simulation study is presented to investigate the effect of the air flow behavior inside the perforations on the thermal performance of the LAP-PFHS.
Keywords: heat sink, parallel flow, circular perforation, non-bypass flow
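The two figures of merit quoted above can be written down directly; the numbers below are illustrative, not the study's measurements, and the cube-root exponent in the performance factor is the common convention for comparison at equal pumping power, assumed here:

```python
# Thermal resistance of a heat sink: R_th = (T_base - T_ambient) / Q
def thermal_resistance(t_base_c, t_ambient_c, q_watts):
    return (t_base_c - t_ambient_c) / q_watts

# Thermal performance factor: (Nu/Nu0) / (f/f0)**(1/3), comparing a
# modified design against a baseline at equal pumping power.
def performance_factor(nu_ratio, f_ratio):
    return nu_ratio / f_ratio ** (1.0 / 3.0)

print(thermal_resistance(75.0, 25.0, 50.0))          # 1.0 K/W at 50 W input
print(round(performance_factor(1.25, 1.30), 3))      # 1.145
```

A performance factor above 1 means the heat-transfer gain outweighs the extra friction penalty, which is the sense in which the 3 mm perforation outperforms the SFHS.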
Procedia PDF Downloads 148
1557 Penetrating Neck Injury: No Zone Approach
Authors: Abhishek Sharma, Amit Gupta, Manish Singhal
Abstract:
Background: The management of patients with penetrating neck injuries in the prehospital setting and in the emergency department has evolved with regard to the use of multidetector computed tomographic (MDCT) imaging. Hence, there has been a shift in the management of neck injuries from mandatory exploration in certain anatomic areas to a more conservative approach using imaging, the so-called "no zone approach". Objective: To study the no zone approach in the management of penetrating neck injury using routine imaging in all stable patients. Methods: 137 patients with penetrating neck injury attending the emergency department of the level 1 trauma centre at AIIMS between 2008 and 2014 were retrospectively analysed. All hemodynamically stable patients were evaluated using CT scanning. Results: Stab injury was the most common (55.91%) mode of penetrating neck injury in the civilian population, followed by gunshot (18.33%). The majority of patients could be managed with imaging and close observation. 39 patients (28.46%) required operative intervention. The most common indication for operative intervention was vascular injury, followed by airway injury manifesting as hemodynamic destabilisation. There was no statistical difference between the zonal distribution of injuries in patients managed conservatively and those taken to the OR. Conclusions: The study shows that patients with penetrating neck trauma who are haemodynamically stable and exhibit no "hard signs" of vascular or airway injury may be evaluated initially by MDCT imaging, even when platysma violation is present. A "no zone" policy may be superior to traditional zone-wise management.
Keywords: penetrating neck injury, zone approach, CT scanning, multidetector computed tomographic (MDCT)
Procedia PDF Downloads 402
1556 Digital Curriculum Preservation Planning, Actions, and Challenges
Authors: Misook Ahn
Abstract:
This study examined the Digital Curriculum Repository (DCR) project initiated at the Defense Language Institute Foreign Language Center (DLIFLC). The purpose of the DCR is to build a centralized curriculum infrastructure, preserve all curriculum materials, and provide academic service to users (faculty, students, or other agencies). The DCR collection includes core language curriculum materials developed by each language school: foreign language textbooks, language survival kits, and audio files currently in use or not in use at the schools. All core curriculum materials with audio and video files have been coded, collected, and preserved in the DCR. The DCR website was designed with MS SharePoint for easy accessibility by the DLIFLC's faculty and students. All metadata for the collected curriculum materials have been entered by language, code, year, book type, level, user, version, and current status (in use/not in use). The study documents digital curriculum preservation planning, actions, and challenges, including collecting, coding, collaborating, designing the DCR SharePoint site, and policymaking. DCR survey data were also collected and analyzed for this research. Based on the findings, the study concludes that a mandatory policy for the DCR system and collaboration with school leadership are critical elements of a successful repository system. The sample collected items, metadata, and DCR SharePoint site are presented in the evaluation section.
Keywords: MS SharePoint, digital preservation, repository, policy
Procedia PDF Downloads 159
1555 Simulation IDM for Schedule Generation of Slip-Form Operations
Authors: Hesham A. Khalek, Shafik S. Khoury, Remon F. Aziz, Mohamed A. Hakam
Abstract:
The linearity of slipforming operations is a source of planning complications, and the operation is usually subject to bottlenecks at any point, so careful planning is required in order to achieve success. On the other hand, discrete-event simulation (DES) concepts can be applied to simulate and analyze construction operations and to efficiently support construction scheduling. Nevertheless, the preparation of input data for construction simulation is very challenging, time-consuming, and error-prone. Therefore, to enhance the benefits of using DES in construction scheduling, this study proposes an integrated module to establish a framework for automating the generation of time schedules and decision support for slipform construction projects, particularly through the project feasibility study phase, by using data exchange between project data stored in an intermediate database, DES, and scheduling software. Using the stored information, the proposed system creates construction task attributes (e.g., activity durations, material quantities, and resource amounts); then DES uses all the given information to create a proposal for the construction schedule automatically. This research is considered a demonstration of a flexible slipform project modeling, rapid scenario-based planning, and schedule generation approach that may be of interest to both practitioners and researchers.
Keywords: discrete-event simulation, modeling, construction planning, data exchange, scheduling generation, EZstrobe
Procedia PDF Downloads 376
1554 Vehicular Emission Estimation of Islamabad by Using Copert-5 Model
Authors: Muhammad Jahanzaib, Muhammad Z. A. Khan, Junaid Khayyam
Abstract:
Islamabad is the capital of Pakistan, with a population of 1.365 million people and a vehicular fleet size of 0.75 million. The vehicular fleet is growing annually at a rate of 11%. Vehicular emissions are a major source of black carbon (BC). In developing countries like Pakistan, most vehicles consume conventional fuels like petrol, diesel, and CNG. These fuels are the major emitters of pollutants like CO, CO2, NOx, CH4, VOCs, and particulate matter (PM10). Carbon dioxide and methane are the leading contributors to global warming, with global shares of 9-26% and 4-9%, respectively. NOx is the precursor of nitrates, which ultimately form aerosols that are noxious to human health. In this study, COPERT (COmputer Programme to calculate Emissions from Road Transport) was used for vehicular emission estimation in Islamabad. COPERT is a Windows-based program developed for the calculation of emissions from the road transport sector. The emissions were calculated for the year 2016 and include pollutants like CO, NOx, VOC, and PM, as well as energy consumption. Different variables were input to the model for emission estimation, including meteorological parameters, average vehicular trip length and respective time duration, fleet configuration, activity data, degradation factor, and fuel effect. The estimated emissions of CO, CH4, CO2, NOx, and PM10 were found to be 9814.2, 44.9, 279196.7, 3744.2, and 304.5 tons, respectively.
Keywords: COPERT Model, emission estimation, PM10, vehicular emission
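At its core, a COPERT-style inventory multiplies activity data by emission factors. A highly simplified sketch with made-up fleet data and emission factors (COPERT's actual factors are speed- and technology-dependent functions, not constants):

```python
# Bottom-up emission estimate: E = sum over classes of
# (number of vehicles) * (annual mileage, km) * (emission factor, g/km)
def emissions_tons(fleet):
    total_g = 0.0
    for n_vehicles, km_per_year, ef_g_per_km in fleet:
        total_g += n_vehicles * km_per_year * ef_g_per_km
    return total_g / 1e6  # grams -> tonnes

# Illustrative vehicle classes (all numbers assumed, pollutant: CO).
fleet = [
    (500_000, 12_000, 2.5),  # petrol vehicles
    (150_000, 15_000, 1.0),  # diesel vehicles
]
print(emissions_tons(fleet))  # 17250.0 tonnes (illustrative only)
```

COPERT layers the same activity-times-factor logic with per-technology factors, cold-start corrections, and degradation effects, which is why the model needs the fleet configuration and meteorological inputs listed above.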
Procedia PDF Downloads 261
1553 Land Suitability Approach as an Effort to Design a Sustainable Tourism Area in Pacet Mojokerto
Authors: Erina Wulansari, Bambang Soemardiono, Ispurwono Soemarno
Abstract:
Designing a sustainable tourism area is defined here as an attempt to design an area that brings together the available natural environmental conditions with the area's social conditions and the conservation of natural and cultural heritage. The understanding of a tourism area in this study is not limited to the location of the tourist objects themselves, but extends to the attractions around the area, such as residential areas (settlements), commercial areas, public service areas, and the natural environment. The principle of success in designing a sustainable tourism area is the ability to integrate and balance the limited space with the variety of activities that continuously grow. The limited space in this tourism area needs to be managed properly to minimize the environmental damage that results from tourism activities. This research aims to identify space in this tourism area through a land suitability approach as an effort to create a sustainable design, especially in ecological terms. The study uses several analytical techniques to achieve the research objectives, namely superimposing analysis with GIS 9.3 software and the Analytic Hierarchy Process. Expected outcomes are in the form of a classification and criteria of usable space for the design of the tourism area. In addition, this study can provide input on the settlement patterns that form part of the environment in the sustainable tourism area.
Keywords: sustainable tourism area, land suitability, limited space, environment, criteria
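The Analytic Hierarchy Process step can be sketched as extracting priority weights from a pairwise-comparison matrix by power iteration; the criteria and matrix below are an illustrative example, not the study's expert judgments:

```python
# AHP priority weights: the principal eigenvector of the pairwise
# comparison matrix, approximated by power iteration.
def ahp_weights(M, iters=50):
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]  # renormalize so weights sum to 1
    return w

# Assumed criteria: slope vs. land cover vs. water proximity.
# M[i][j] = relative importance of criterion i over j (Saaty 1-9 scale).
M = [[1.0,   3.0,   5.0],
     [1/3.0, 1.0,   3.0],
     [1/5.0, 1/3.0, 1.0]]

print([round(x, 2) for x in ahp_weights(M)])
```

The resulting weights then multiply the superimposed GIS layers to score each cell's suitability.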
Procedia PDF Downloads 503
1552 Cleaning Performance of High-Frequency, High-Intensity 360 kHz Frequency Operating in Thickness Mode Transducers
Authors: R. Vetrimurugan, Terry Lim, M. J. Goodson, R. Nagarajan
Abstract:
This study investigates the cleaning performance of a high-intensity 360 kHz frequency on the removal of nano-dimensional and sub-micron particles from various surfaces, the uniformity of the cleaning tank, and the run-to-run variation of the cleaning process. The uniformity of the cleaning tank was measured by two different methods: (1) a ppb™ meter and (2) the liquid particle counting (LPC) technique. In the second method, aluminium metal spacer components were placed at various locations in the cleaning tank (centre, top left corner, bottom left corner, top right corner, bottom right corner) and the resulting particle removal by the 360 kHz frequency was measured. The results indicate that the energy was distributed more uniformly throughout the entire cleaning vessel, even at the corners and edges of the tank, when megasonic sweeping technology was applied. The results also show that rinsing the parts with the 360 kHz frequency at the final rinse gives lower particle counts, and hence higher cleaning efficiency, compared to other frequencies. When megasonic sweeping technology is applied, each piezoelectric transducer operates at its optimum resonant frequency and generates a stronger acoustic cavitational force and a higher acoustic streaming velocity. These combined forces help to enhance particle removal and at the same time improve the overall cleaning performance. A multiple-extraction study was also carried out for various frequencies to measure the cleaning potential and asymptote value.
Keywords: power distribution, megasonic sweeping, cavitation intensity, particle removal, laser particle counting, nano, submicron
Procedia PDF Downloads 418
1551 Comparison of Volume of Fluid Model: Experimental and Empirical Results for Flows over Stacked Drop Manholes
Authors: Ramin Mansouri
Abstract:
The manhole is a type of structure installed at changes of direction or pipe diameter in sewage lines, as well as in steep-slope areas, to reduce the flow velocity. In this study, the flow characteristics of a manhole structure have been investigated with a numerical model. Coarse, medium, and fine computational grids were used for the simulation. In order to simulate the flow, the k-ε model (standard, RNG, Realizable) and the k-ω model (standard, SST) were used. Also, in order to find the best wall treatment, two types of wall functions, standard and non-equilibrium, were investigated. The k-ε turbulence model had the highest correlation with the experimental results of all the models. In terms of boundary conditions, a constant velocity was set at the flow inlet boundary, the outlet pressure was set at the boundaries in contact with air, and the standard wall function was used for the wall effect. In the numerical model, the depth at the outlet of the second manhole is estimated to be less than that of the laboratory, as is the outlet jet from the span. In the second regime, the jet flow collides with the manhole wall and divides into two parts, so the hydraulic characteristics are the same as those of a large vertical shaft. In this situation, the turbulence is in a high range, since more energy loss can be seen in it. According to the results, the energy loss in the numerical model is estimated at 9.359%, which is more than the experimental data.
Keywords: manhole, energy, depreciation, turbulence model, wall function, flow
Procedia PDF Downloads 82
1550 Parameters Affecting the Elasto-Plastic Behavior of Outrigger Braced Walls to Earthquakes
Authors: T. A. Sakr, Hanaa E. Abd-El-Mottaleb
Abstract:
Outrigger-braced wall systems are commonly used to provide high-rise buildings with the required lateral stiffness for wind and earthquake resistance. The existence of outriggers adds to the stiffness and strength of walls, as reported by several studies. The effects of different parameters on the elasto-plastic dynamic behavior of outrigger-braced wall systems under earthquakes are investigated in this study. Parameters investigated include outrigger stiffness, concrete strength, and reinforcement arrangement as the main design parameters in wall design. In addition to being significant to the wall behavior, such parameters may lead to a change of failure mode and the delay of crack propagation, and consequently failure, as the wall is excited by earthquakes. A bi-linear stress-strain relation for concrete with limited tensile strength, and truss members with a bi-linear stress-strain relation for reinforcement, were used in the finite element analysis of the problem. The famous earthquake record, El-Centro, 1940, is used in the study. Emphasis was given to the lateral drift, normal stresses, and crack pattern as behavior-controlling determinants. Results indicated a significant effect of the studied parameters, such that a stiffer outrigger, higher-grade concrete, and concentrating the reinforcement at the wall edges enhance the behavior of the system. Concrete stresses and cracking behavior are significantly enhanced, while lesser drift improvements are observed.
Keywords: outrigger, shear wall, earthquake, nonlinear
Procedia PDF Downloads 283
1549 Comparing Student Performance on Paper-Based versus Computer-Based Formats of Standardized Tests
Authors: Jin Koo
Abstract:
During the coronavirus pandemic, there has been a further increase in demand for computer-based tests (CBT), and CBT has now become an important test mode. The main purpose of this study is to investigate the comparability of student scores obtained from computer-based and paper-based formats of a standardized test in the two subject areas of reading and mathematics. This study also investigates whether there is an interaction effect between the test modes of CBT and paper-based tests (PBT) and gender/ability level in each subject area. The test used in this study is a multiple-choice standardized test for students in grades 8-11. For this study, data were collected during four test administrations: 2015-16, 2017-18, and 2020-21. This research used a one-factor between-subjects ANOVA to compare the PBT and CBT groups' test means for each subject area (reading and mathematics). Also, two-factor between-subjects ANOVAs were conducted to investigate examinee characteristics: gender (male and female), ethnicity (African-American, Asian, Hispanic, multi-racial, and White), and ability level (low, average, and high ability). The author found that students' test scores in the two subject areas varied across CBT and PBT by gender and ability level, meaning that gender, ethnicity, and ability level were related to the score difference. These results will be discussed in relation to current testing systems. In addition, this study's results will open up to school teachers and test developers the possible influence that gender, ethnicity, and ability level have on a student's score depending on whether they take the CBT or PBT.
Keywords: ability level, computer-based, gender, paper-based, test
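A one-factor between-subjects ANOVA of the kind described reduces to the F ratio of between-group to within-group mean squares. A sketch with fabricated scores (not the study's data):

```python
# One-way between-subjects ANOVA F statistic for k groups.
def one_way_anova_F(groups):
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: group sizes times squared mean offsets.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: squared deviations from each group mean.
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

pbt = [70.0, 75.0, 80.0]  # fabricated paper-based scores
cbt = [60.0, 65.0, 70.0]  # fabricated computer-based scores
print(one_way_anova_F([pbt, cbt]))  # 6.0
```

The F value is then compared against the F distribution with (k-1, n-k) degrees of freedom to decide whether the mode difference is significant.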
Procedia PDF Downloads 100
1548 Multi-Stage Multi-Period Production Planning in Wire and Cable Industry
Authors: Mahnaz Hosseinzadeh, Shaghayegh Rezaee Amiri
Abstract:
This paper presents a methodology for the serial production planning problem in the wire and cable manufacturing process that addresses the problem of input-output imbalance between consecutive stations, with the aim of minimizing the halting of machines in each stage. To this end, a linear Goal Programming (GP) model is developed, in which four main categories of constraints are considered: the number of runs per machine, machine sequences, acceptable machine inventories at the end of each period, and the necessity of fulfilling customers' orders. The model is formulated based on real data obtained from the IKO TAK Company, an important supplier of wire and cable for the oil and gas and automotive industries in Iran. By solving the model in GAMS software, the optimal number of runs, end-of-period inventories, and the possible minimum idle time for each machine are calculated. The application of the numerical results in the target company has shown the efficiency of the proposed model and solution in decreasing the lead time of end-product delivery to the customers by 20%. Accordingly, the developed model could easily be applied in wire and cable companies for the purpose of optimal production planning to reduce the halting of machines in manufacturing stages.
Keywords: goal programming approach, GP, production planning, serial manufacturing process, wire and cable industry
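A goal-programming formulation minimizes weighted deviations from targets via deviation variables. A toy sketch of that idea (a brute-force search over one machine's run count, not the paper's GAMS model):

```python
# Toy goal program: choose a number of runs for one machine to minimize
# total deviation from a production goal, subject to a capacity limit.
def solve(goal, capacity):
    best = None
    for runs in range(capacity + 1):
        over = max(0, runs - goal)   # positive deviation variable d+
        under = max(0, goal - runs)  # negative deviation variable d-
        score = over + under         # weighted deviation sum (weights = 1)
        if best is None or score < best[1]:
            best = (runs, score)
    return best

print(solve(goal=8, capacity=10))   # (8, 0): goal attainable within capacity
print(solve(goal=12, capacity=10))  # (10, 2): capacity binds, 2 units short
```

The real model states this as a linear program with d+ and d- variables per goal and per period, which GAMS solves over all machines simultaneously.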
Procedia PDF Downloads 160
1547 Psychological Factors Affecting Breastfeeding: An Exploratory Study among Breastfeeding Moms
Authors: Marwa Abdussalam
Abstract:
Breastfeeding creates a unique emotional bond between a mother and her offspring. Though breastfeeding may be natural, it is not something mothers are born with; some still struggle to breastfeed their babies. Various factors can influence the breastfeeding experience, such as the mode of delivery, the mother's health condition, proper latching, etc. In addition, psychological factors have been known to influence breastfeeding ability, duration, and milk supply. Some mothers struggle to breastfeed their babies because they perceive that they have a low milk supply or that they lack the ability to breastfeed. Most of these perceptions result either from their own past experience or from the 'comments' of their caregivers. So, it is of the utmost importance to understand such psychological factors affecting breastfeeding so that necessary steps can be taken to educate breastfeeding mothers. The study explored the role of psychological factors that affect breastfeeding. Data were collected from fifteen breastfeeding mothers using a semi-structured interview schedule. A total of 10 questions were included in the interview schedule. Questions were sequenced in a funnel pattern, beginning with open-ended questions and then moving on to close-ended questions. Data were analyzed using Braun and Clarke's Thematic Analysis technique. This technique involves identifying codes, generating themes, naming them, and finally reviewing them. Results indicated that breastfeeding self-efficacy, perceived insufficient milk supply, and lack of knowledge were the psychological factors affecting breastfeeding. The results of this study can be used to help mothers who are struggling with breastfeeding by developing interventions aimed at improving breastfeeding self-efficacy.
Keywords: breastfeeding, breastfeeding self-efficacy, perceived insufficient milk supply, Thematic Analysis
Procedia PDF Downloads 108
1546 Prediction of Oil Recovery Factor Using Artificial Neural Network
Authors: O. P. Oladipo, O. A. Falode
Abstract:
The determination of the recovery factor is of great importance to the reservoir engineer since it relates reserves to the initial oil in place. Reserves are the producible portion of reservoirs and give an indication of the profitability of a field development. The core objective of this project is to develop an artificial neural network model using selected reservoir data to predict recovery factors (RF) of hydrocarbon reservoirs and compare the model with a couple of the existing correlations. The type of artificial neural network model developed was the single-layer feed-forward network. MATLAB was used as the network simulator, and the network was trained using the supervised learning method. Afterwards, the network was tested with input data never seen by the network. The predicted recovery factors from the artificial neural network model, the API correlation for water-drive reservoirs (sands and sandstones), and the Guthrie and Greenberger correlation equation were obtained and compared. The coefficient of correlation of the artificial neural network model was higher than those of the other two correlation equations, making it a more accurate prediction tool. The artificial neural network, because of its accurate prediction ability, is helpful in the correct prediction of hydrocarbon reservoir factors. Artificial neural networks could be applied to the prediction of other petroleum engineering parameters because they are able to recognise complex patterns in a data set and establish relationships between them. Keywords: recovery factor, reservoir, reserves, artificial neural network, hydrocarbon, MATLAB, API, Guthrie, Greenberger
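The single-layer feed-forward idea can be sketched in a few lines. The following is an illustrative stand-in only: it trains on synthetic data rather than the study's reservoir data, and the three input features (standing in for quantities such as porosity, permeability, and water saturation) are assumptions.

```python
import numpy as np

# Sketch of a single-hidden-layer feed-forward network for regression,
# trained by plain gradient descent on synthetic "reservoir" features.
# Feature meanings and the target formula are illustrative assumptions.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 3))   # normalised reservoir features
y = (0.5 * X[:, 0] + 0.3 * X[:, 1] - 0.2 * X[:, 2] + 0.2).reshape(-1, 1)

W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros((1, 1))

def forward(X):
    h = np.tanh(X @ W1 + b1)        # hidden layer
    return h @ W2 + b2, h           # linear output for regression

losses = []
lr = 0.1
for _ in range(500):
    pred, h = forward(X)
    err = pred - y
    losses.append(float(np.mean(err ** 2)))
    # backpropagation of the mean-squared-error loss
    dW2 = h.T @ err / len(X); db2 = err.mean(0, keepdims=True)
    dh = (err @ W2.T) * (1 - h ** 2)
    dW1 = X.T @ dh / len(X); db1 = dh.mean(0, keepdims=True)
    W2 -= lr * dW2; b2 -= lr * db2; W1 -= lr * dW1; b1 -= lr * db1
```

In the study, MATLAB's network simulator played this role; the point here is only the shape of the model, one hidden layer between reservoir inputs and a single recovery-factor output.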
Procedia PDF Downloads 441
1545 Analysis of Solvent Effect on the Mechanical Properties of Poly(Ether Ether Ketone) Using Nano-Indentation
Authors: Tanveer Iqbal, Saima Yasin, Muhammad Zafar, Ahmad Shakeel, Fahad Nazir, Paul F. Luckham
Abstract:
The contact performance of polymeric composites depends on the localized mechanical properties of the materials. This is particularly important for fiber-oriented polymeric materials, where self-lubrication from the top layers has been a basic requirement. The nanoindentation response of fiber-reinforced poly(ether ether ketone), PEEK, composites was evaluated to determine the near-surface mechanical characteristics. Load-displacement compliance, hardness, and elastic modulus data based on continuous stiffness measurement (CSM) indentation of carbon-fiber-oriented and glass-fiber-oriented PEEK composites are reported as a function of indentation contact displacement. The composite surfaces were indented to a maximum penetration depth of 5 µm using a Berkovich tip indenter. A typical multiphase response of the composite surface is evident from analysis of the indentation data for the composites, showing the presence of polymer matrix, fibers, and interphase regions. The experimental results show that although the surface mechanical properties of the carbon-fiber-based PEEK composite were comparatively higher, the properties of the matrix material were seen to increase in the presence of glass fibers. The experimental methodology may provide a convenient means to understand the morphological description of multimodal polymeric composites. Keywords: nanoindentation, PEEK, modulus, hardness, plasticization
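For readers unfamiliar with how hardness and modulus are extracted from indentation data, here is a minimal sketch of the standard Oliver-Pharr analysis for a Berkovich tip. The load, depth, and stiffness values below are invented for illustration and are not taken from the study.

```python
import math

# Oliver-Pharr style reduction of nanoindentation data for a Berkovich
# indenter. Inputs: peak load (N), max depth (m), unloading/contact
# stiffness (N/m). All numeric inputs below are assumed examples.
def oliver_pharr(p_max, h_max, stiffness, epsilon=0.75):
    """Return (hardness in Pa, reduced modulus in Pa)."""
    h_c = h_max - epsilon * p_max / stiffness   # contact depth
    area = 24.5 * h_c ** 2                      # ideal Berkovich area function
    hardness = p_max / area
    reduced_modulus = (math.sqrt(math.pi) / 2.0) * stiffness / math.sqrt(area)
    return hardness, reduced_modulus

# e.g. 10 mN peak load at 1 µm depth with 100 kN/m contact stiffness
H, E_r = oliver_pharr(p_max=10e-3, h_max=1.0e-6, stiffness=1.0e5)
```

CSM indentation applies this reduction continuously with depth, which is what yields the hardness and modulus profiles reported as a function of contact displacement.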
Procedia PDF Downloads 192
1544 Seismic Performance of Concrete Moment Resisting Frames in Western Canada
Authors: Ali Naghshineh, Ashutosh Bagchi
Abstract:
Performance-based seismic design concepts are increasingly being adopted in various jurisdictions. While the National Building Code of Canada (NBCC) is not fully performance-based, it provides some features of a performance-based code, such as displacement control and objective-based solutions. Performance evaluation is an important part of a performance-based design. In this paper, the seismic performance of a set of code-designed 4-, 8- and 12-story moment-resisting concrete frames located in Victoria, BC, in the western part of Canada, has been studied at different hazard levels, namely SLE (Service Level Event), DLE (Design Level Event) and MCE (Maximum Considered Event). The seismic performance of these buildings has been evaluated based on FEMA 356 and ATC 72 procedures, and on nonlinear time history analysis. Pushover analysis has been used to investigate the different performance levels of these buildings and adjust their design based on the corresponding target displacements. Since pushover analysis ignores higher-mode effects, nonlinear dynamic time history analysis using a set of ground motion records has been performed. Different types of ground motion records, such as crustal and subduction earthquake records, have been used for the dynamic analysis to determine their effects. Results obtained from pushover analysis on inter-story drift, displacement, shear and overturning moment are compared to those from the dynamic analysis. Keywords: seismic performance, performance-based design, concrete moment resisting frame, crustal earthquakes, subduction earthquakes
Procedia PDF Downloads 264
1543 3G or 4G: A Predilection for Millennial Generation of Indian Society
Authors: Rishi Prajapati
Abstract:
3G is the abbreviation for the third generation of wireless mobile telecommunication technologies. 3G finds application in wireless voice telephony, mobile internet access, fixed wireless internet access, video calls and mobile TV. It also provides mobile broadband access to smartphones and to mobile modems in laptops and computers. The first 3G networks were introduced in 1998, followed by 4G networks in 2008. 4G is the abbreviation for the fourth generation of wireless mobile telecommunication technologies and is termed the advanced form of 3G. 4G was first introduced in South Korea in 2007. Many studies have examined the differences and similarities between the third and fourth generations of wireless mobile telecommunication technology, whereas this abstract reflects a study that focuses on the preference between 3G and 4G expressed by Indian adolescents of the Millennial Generation, aged 18 to 25 years. The Millennial Generation was chosen for this study as they have the easiest access to the latest technology. A sample size of 200 adolescents was selected, and a structured survey with several closed-ended as well as open-ended questions was carried out to aggregate the results of this study. Care was taken to keep the effect of environmental factors on the subjects as minimal as possible. The data analysis comprised primary data collection, reflecting quantitative research. The rationale behind this research is to give a brief idea of how 3G and 4G are accepted by the Millennial Generation in India. The findings of this research would provide a framework depicting whether the Millennial Generation prefers 4G over 3G or vice versa. Keywords: fourth generation, wireless telecommunication technology, Indian society, millennial generation, market research, third generation
Procedia PDF Downloads 270
1542 Teaching Tools for Web Processing Services
Authors: Rashid Javed, Hardy Lehmkuehler, Franz Josef-Behr
Abstract:
Web Processing Services (WPS) are of growing interest in geoinformation research. However, teaching about them is difficult because of the generally complex circumstances of their use, which limit the possibilities for hands-on exercises on Web Processing Services. To support understanding, a Training Tools Collection was initiated at the University of Applied Sciences Stuttgart (HFT). It is limited to the scope of geostatistical interpolation of sample point data, where different algorithms such as IDW and Nearest Neighbour can be used. The Tools Collection aims to support understanding of the scope, definition and deployment of Web Processing Services. For example, it is necessary to characterize the input of an interpolation by the data set, the parameters for the algorithm and the interpolation results (here a grid of interpolated values is assumed). This paper reports on first experiences using a pilot installation, which was intended to find suitable software interfaces for later full implementations and to draw conclusions on potential user interface characteristics. Experiences were made with the Deegree software, one of several service suites. Being strictly programmed in Java, Deegree offers several OGC-compliant service implementations that also promise to be of benefit for the project. The mentioned parameters for a WPS were formalized following the paradigm that any meaningful component should be defined in terms of suitable standards; for example, the data output can be defined as a GML file. However, the choice of meaningful information pieces and user interactions is not free, but is partially determined by the selected WPS processing suite. Keywords: Deegree, interpolation, IDW, web processing service (WPS)
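As a concrete illustration of the geostatistical interpolation the training tools cover, the following is a minimal IDW sketch. The sample coordinates, values, and power parameter are assumptions for illustration; a real WPS would expose this computation behind an OGC Execute request rather than as a local function.

```python
import math

# Inverse Distance Weighting: the value at (x, y) is the weighted mean of
# sample values, with weights 1/d^power for distance d to each sample.
def idw(sample_points, x, y, power=2.0):
    """Interpolate a value at (x, y) from [(xi, yi, zi), ...] samples."""
    num = den = 0.0
    for xi, yi, zi in sample_points:
        d = math.hypot(x - xi, y - yi)
        if d == 0.0:
            return zi              # exactly on a sample point
        w = 1.0 / d ** power
        num += w * zi
        den += w
    return num / den

samples = [(0.0, 0.0, 10.0), (1.0, 0.0, 20.0), (0.0, 1.0, 30.0)]
value = idw(samples, 0.5, 0.5)     # equidistant from all three samples
```

Computing a whole grid of such values, as the tools assume for the output, is just this function evaluated over a mesh of (x, y) positions.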
Procedia PDF Downloads 355
1541 Stability-Indicating High-Performance Thin-Layer Chromatography Method for Estimation of Naftopidil
Authors: P. S. Jain, K. D. Bobade, S. J. Surana
Abstract:
A simple, selective, precise and stability-indicating high-performance thin-layer chromatographic (HPTLC) method for the analysis of naftopidil, both in bulk and in pharmaceutical formulation, has been developed and validated. The method employed HPTLC aluminium plates precoated with silica gel as the stationary phase. The solvent system consisted of hexane: ethyl acetate: glacial acetic acid (4:4:2 v/v). The system was found to give a compact spot for naftopidil (Rf value of 0.43±0.02). Densitometric analysis of naftopidil was carried out in the absorbance mode at 253 nm. The linear regression analysis data for the calibration plots showed a good linear relationship, with r2 = 0.999±0.0001 with respect to peak area in the concentration range of 200-1200 ng per spot. The method was validated for precision, recovery and robustness. The limits of detection and quantification were 20.35 and 61.68 ng per spot, respectively. Naftopidil was subjected to acid and alkali hydrolysis, oxidation and thermal degradation, and was found to degrade under acidic, basic, oxidative and thermal conditions. The degraded product was well resolved from the pure drug, with a significantly different Rf value. Statistical analysis proves that the method is repeatable, selective and accurate for the estimation of the investigated drug. The proposed HPTLC method can be applied for the identification and quantitative determination of naftopidil in bulk drug and pharmaceutical formulation. Keywords: naftopidil, HPTLC, validation, stability, degradation
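Limits of detection and quantification of this kind are conventionally estimated from the calibration curve using the ICH formulas LOD = 3.3·sigma/S and LOQ = 10·sigma/S, where sigma is the residual standard deviation of the response and S is the slope. A minimal sketch follows; the sigma and slope values are invented for illustration and are not the study's raw data.

```python
# ICH-style LOD/LOQ estimate from a calibration curve's residual standard
# deviation (sigma) and slope (S). The numeric inputs are illustrative.
def lod_loq(sigma, slope):
    """Return (LOD, LOQ) using LOD = 3.3*sigma/S and LOQ = 10*sigma/S."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# assumed example: sigma in peak-area units, slope in area per (ng/spot)
lod, loq = lod_loq(sigma=30.5, slope=4.95)
```

Whatever the units of the response, LOQ always comes out at 10/3.3 (about 3×) the LOD, which is the ratio visible in the reported 20.35 and 61.68 ng per spot figures.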
Procedia PDF Downloads 400
1540 Patients’ Perspective on Early Discharge with Drain in situ after Breast Cancer Surgery
Authors: Laila Al-Balushi, Suad Al-Kharosui
Abstract:
Due to the increasing number of breast cancer cases in Oman and the impact of the novel coronavirus disease 2019 (COVID-19) on bed availability in the hospital, a policy of early discharge (ED) with a drain after breast cancer surgery was initiated at one of the tertiary hospitals in Oman. The uniqueness of this policy is that no home-visit follow-up was conducted after discharge, and the main mode of communication was Instagram. This policy was then evaluated by conducting a quasi-experimental study using a questionnaire with ten open- and closed-ended questions, five of which explored patient experience using a five-point Likert scale. A total of 41 female patients responded to the questionnaire. Almost 96% of the participants stated that they were well informed about drain care pre- and post-surgery at home. 9% of the participants developed early signs of infection and were managed in out-patient clinics. Participants with bilateral drains expressed more pain than those with a single drain. 90% stated that they were satisfied with being discharged with a breast drain, whereas 10% would have preferred to stay in the hospital until the drains were removed. This study found that the policy of ED with a drain after BC surgery is practical and well accepted by most patients. The role of the breast nurse and the presence of family and institutional support enhanced the success of the policy implementation. To optimize patient care, a training program conducted by the breast nurse for nurses at local health centres, covering the care management of patients with a drain, could improve care and enhance patient satisfaction. Keywords: breast cancer, surgery, early discharge, surgical drain
Procedia PDF Downloads 95
1539 Using Machine Learning to Classify Human Fetal Health and Analyze Feature Importance
Authors: Yash Bingi, Yiqiao Yin
Abstract:
Reduction of child mortality is an ongoing struggle and a commonly used factor in determining progress in the medical field. The under-5 mortality number is around 5 million around the world, with many of the deaths being preventable. In light of this issue, cardiotocograms (CTGs) have emerged as a leading tool to determine fetal health. By using ultrasound pulses and reading the responses, CTGs help healthcare professionals assess the overall health of the fetus to determine the risk of child mortality. However, interpreting the results of the CTGs is time-consuming and inefficient, especially in underdeveloped areas where an expert obstetrician is hard to come by. Using a support vector machine (SVM) and oversampling, this paper proposed a model that classifies fetal health with an accuracy of 99.59%. To further explain the CTG measurements, an algorithm based on Randomized Input Sampling for Explanation of Black-box Models (RISE) was created, called Feature Alteration for explanation of Black Box Models (FAB), and the findings were compared to Shapley Additive Explanations (SHAP) and Local Interpretable Model-Agnostic Explanations (LIME). This allows doctors and medical professionals to classify fetal health with high accuracy and determine which features were most influential in the process. Keywords: machine learning, fetal health, gradient boosting, support vector machine, Shapley values, local interpretable model agnostic explanations
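The oversampling-plus-SVM pipeline can be sketched compactly. The following stand-in uses synthetic two-dimensional data in place of CTG features, and a minimal linear SVM trained by hinge-loss sub-gradient descent in place of a library classifier, so it should be read as an illustration of the technique rather than the paper's model.

```python
import numpy as np

# Synthetic stand-in for an imbalanced CTG dataset: many "healthy" samples,
# few "pathological" ones. Class locations/spreads are assumptions.
rng = np.random.default_rng(1)
X_maj = rng.normal([0, 0], 0.5, (90, 2))   # majority class (label -1)
X_min = rng.normal([2, 2], 0.5, (10, 2))   # minority class (label +1)

# Random oversampling: resample the minority class up to the majority size,
# so the classifier is not dominated by the majority class.
idx = rng.integers(0, len(X_min), size=len(X_maj))
X = np.vstack([X_maj, X_min[idx]])
y = np.hstack([-np.ones(len(X_maj)), np.ones(len(X_maj))])

# Minimal linear SVM: stochastic sub-gradient descent on the hinge loss
# with L2 regularization (lam).
w = np.zeros(2); b = 0.0; lam = 0.01; lr = 0.01
for epoch in range(200):
    for i in rng.permutation(len(X)):
        margin = y[i] * (X[i] @ w + b)
        if margin < 1:                      # point inside margin: push it out
            w += lr * (y[i] * X[i] - lam * w)
            b += lr * y[i]
        else:                               # only shrink weights
            w -= lr * lam * w

accuracy = np.mean(np.sign(X @ w + b) == y)
```

In practice one would use a library SVM with a nonlinear kernel on the full 21-odd CTG features; the oversampling step is the same either way.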
Procedia PDF Downloads 144
1538 Spatiotemporal Analysis of Visual Evoked Responses Using Dense EEG
Authors: Rima Hleiss, Elie Bitar, Mahmoud Hassan, Mohamad Khalil
Abstract:
A comprehensive study of object recognition in the human brain requires combining both spatial and temporal analysis of brain activity. Here, we are mainly interested in three issues: the time perception of visual objects, the ability to discriminate between two particular categories (objects vs. animals), and the possibility of identifying a particular spatial representation of visual objects. Our experiment consisted of acquiring dense electroencephalographic (EEG) signals during a picture-naming task comprising a set of object and animal images. These EEG responses were recorded from nine participants. In order to determine the time perception of the presented visual stimulus, we analyzed the event-related potentials (ERPs) derived from the recorded EEG signals. The analysis of these signals showed that the brain perceives animals and objects at different time instants. Concerning the discrimination of the two categories, a support vector machine (SVM) was applied to the instantaneous EEG (excellent temporal resolution: on the order of milliseconds) to categorize the visual stimuli into two different classes. The spatial differences between the evoked responses of the two categories were also investigated. The results showed a variation of the neural activity with the properties of the visual input. The results also showed the existence of a spatial pattern of electrodes over particular regions of the scalp corresponding to their responses to the visual inputs. Keywords: brain activity, categorization, dense EEG, evoked responses, spatio-temporal analysis, SVM, time perception
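The ERP derivation at the heart of this analysis is conceptually simple: epochs time-locked to stimulus onset are averaged, so stimulus-locked activity survives while background activity cancels, and the latency of the surviving peak gives the "time perception" estimate. The sketch below uses an entirely synthetic signal; the sampling rate, component shape, and noise level are assumptions.

```python
import numpy as np

# Sketch of ERP extraction by trial averaging on one simulated channel.
rng = np.random.default_rng(2)
fs = 250                                   # sampling rate in Hz (assumed)
t = np.arange(0, 0.6, 1 / fs)              # 600 ms post-stimulus epoch
true_erp = 2.0 * np.exp(-((t - 0.17) / 0.03) ** 2)   # a ~170 ms component

n_trials = 100
epochs = true_erp + rng.normal(0, 1.0, (n_trials, t.size))  # per-trial noise
erp = epochs.mean(axis=0)                  # averaging boosts SNR by ~sqrt(N)

peak_latency = t[np.argmax(erp)]           # estimated component latency (s)
```

Comparing such peak latencies between the object and animal conditions, channel by channel over a dense montage, is what yields both the timing differences and the scalp patterns the abstract reports.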
Procedia PDF Downloads 422
1537 Time Series Forecasting (TSF) Using Various Deep Learning Models
Authors: Jimeng Shi, Mahek Jain, Giri Narasimhan
Abstract:
Time Series Forecasting (TSF) is used to predict the target variables at a future time point based on learning from previous time points. To keep the problem tractable, learning methods use data from a fixed-length window in the past as an explicit input. In this paper, we study how the performance of predictive models changes as a function of different look-back window sizes and different amounts of time to predict into the future. We also consider the performance of the recent attention-based Transformer models, which have had good success in the image processing and natural language processing domains. In all, we compare four different deep learning methods (RNN, LSTM, GRU, and Transformer) along with a baseline method. The dataset (hourly) we used is the Beijing Air Quality Dataset from the UCI website, which includes a multivariate time series of many factors measured on an hourly basis for a period of 5 years (2010-14). For each model, we also report on the relationship between the performance and the look-back window sizes and the number of predicted time points into the future. Our experiments suggest that Transformer models have the best performance, with the lowest Mean Absolute Errors (MAE = 14.599, 23.273) and Root Mean Square Errors (RMSE = 23.573, 38.131) for most of our single-step and multi-step predictions. The best size for the look-back window to predict 1 hour into the future appears to be one day, while 2 or 4 days perform best for predicting 3 hours into the future. Keywords: air quality prediction, deep learning algorithms, time series forecasting, look-back window
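The fixed-length look-back windowing described above turns a series into supervised (input, target) pairs: each sample is `window` past steps, and the target is the value `horizon` steps after the window ends. A minimal sketch on a synthetic hourly series (the real study uses the multivariate Beijing data):

```python
import numpy as np

# Build supervised samples from a 1-D series: X[i] is a look-back window,
# y[i] is the value `horizon` steps after that window ends.
def make_windows(series, window, horizon=1):
    """Return (X, y) with X[i] = series[i:i+window], y[i] = series[i+window+horizon-1]."""
    X, y = [], []
    for i in range(len(series) - window - horizon + 1):
        X.append(series[i : i + window])
        y.append(series[i + window + horizon - 1])
    return np.array(X), np.array(y)

hourly = np.arange(48, dtype=float)        # 48 synthetic hourly readings
X, y = make_windows(hourly, window=24, horizon=3)   # 1-day look-back, 3 h ahead
```

Sweeping `window` (e.g. 24, 48, 96 hours) and `horizon` over pairs like these is exactly the experiment grid the paper evaluates each model on.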
Procedia PDF Downloads 154
1536 Static Analysis of Security Issues of the Python Packages Ecosystem
Authors: Adam Gorine, Faten Spondon
Abstract:
Python is considered the most popular programming language and offers its own ecosystem for archiving and maintaining open-source software packages. This system is called the Python Package Index (PyPI), the repository of this programming language. Unfortunately, one-third of these software packages have vulnerabilities that allow attackers to execute code automatically when a vulnerable or malicious package is installed. This paper contributes to large-scale empirical studies investigating security issues in the Python ecosystem by evaluating package vulnerabilities. These provide a series of implications that can help the security of software ecosystems by improving the process of discovering, fixing, and managing package vulnerabilities. The vulnerability dataset was generated using the NVD, the National Vulnerability Database, and the Snyk vulnerability dataset. In addition, we evaluated 807 vulnerability reports in the NVD and 3900 publicly known security vulnerabilities in the Python package manager (pip) from the Snyk database, covering 2002 to 2022. As a result, many Python vulnerabilities are of high severity, followed by medium severity. The most problematic areas have been improper input validation and denial-of-service attacks. A hybrid scanning tool that combines the three scanners Bandit, Snyk and Dlint, and provides a clear report of the code vulnerabilities, is also described. Keywords: Python vulnerabilities, bandit, Snyk, Dlint, Python package index, ecosystem, static analysis, malicious attacks
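The "hybrid scanner" idea of combining several tools can be sketched as a merge of their findings keyed by file and line, so that issues flagged by multiple scanners stand out. The report shapes below are hypothetical stand-ins, not the real output formats of Bandit, Snyk, or Dlint.

```python
# Merge findings from several scanners into one report keyed by (file, line).
# Each report is a (tool_name, findings) pair; finding dicts here use an
# assumed shape: {"file": ..., "line": ..., "issue": ...}.
def merge_reports(*reports):
    merged = {}
    for tool, findings in reports:
        for f in findings:
            key = (f["file"], f["line"])
            entry = merged.setdefault(key, {"issues": [], "tools": set()})
            entry["issues"].append(f["issue"])
            entry["tools"].add(tool)
    return merged

# hypothetical findings from two of the scanners on the same location
bandit_findings = [{"file": "app.py", "line": 10, "issue": "use of eval"}]
dlint_findings = [{"file": "app.py", "line": 10, "issue": "eval() usage"}]
report = merge_reports(("bandit", bandit_findings), ("dlint", dlint_findings))
```

Locations confirmed by two or three tools are higher-confidence findings, which is the practical benefit of a hybrid tool over any single scanner.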
Procedia PDF Downloads 139