Search results for: time series data mining
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 38110

32890 An Investigation on Smartphone-Based Machine Vision System for Inspection

Authors: They Shao Peng

Abstract:

Machine vision for inspection is an automated technology normally used to analyze items on the production line for quality control purposes; such a system is also known as an automated visual inspection (AVI) system. By applying automated visual inspection, the existence of items, defects, contaminants, flaws, and other irregularities in manufactured products can be detected quickly and accurately. However, AVI systems are still inflexible and expensive because each is unique to a specific task and consumes a lot of set-up time and space. With the rapid development of mobile devices, smartphones can be an alternative device for the vision system to solve the existing problems of AVI. Since the smartphone-based AVI system is still at a nascent stage, this motivated an investigation of the smartphone-based AVI system. This study aims to provide a low-cost AVI system with high efficiency and flexibility. In this project, two object detection models, the You Only Look Once (YOLO) model and the Single Shot MultiBox Detector (SSD) model, are trained, evaluated, and integrated with smartphone and webcam devices. The performance of the smartphone-based AVI is compared with the webcam-based AVI in terms of precision and inference time. Additionally, a mobile application is developed which allows users to perform real-time object detection as well as object detection on stored images.
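As a rough illustration of the precision/inference-time comparison described above, a minimal timing harness can be sketched as follows; `dummy_detector` stands in for a trained YOLO or SSD model, and all names are illustrative, not from the study:

```python
import time
from statistics import mean

def benchmark_detector(detect, frames, warmup=1):
    """Time a detection callable over a list of frames.

    `detect` is any callable mapping a frame to a list of
    (label, confidence, box) tuples; the first `warmup` calls
    are discarded so one-off initialisation cost is excluded.
    """
    for frame in frames[:warmup]:
        detect(frame)                      # warm-up pass, not timed
    timings, detections = [], []
    for frame in frames[warmup:]:
        t0 = time.perf_counter()
        detections.append(detect(frame))
        timings.append(time.perf_counter() - t0)
    return mean(timings), detections

# Hypothetical stand-in for a YOLO or SSD model.
def dummy_detector(frame):
    return [("defect", 0.9, (0, 0, 10, 10))]

frames = [object() for _ in range(6)]
avg_s, dets = benchmark_detector(dummy_detector, frames)
```

The same harness would be run once with the smartphone feed and once with the webcam feed, keeping the detector fixed, to isolate the device's contribution to inference time.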

Keywords: automated visual inspection, deep learning, machine vision, mobile application

Procedia PDF Downloads 109
32889 Bayesian Parameter Inference for Continuous Time Markov Chains with Intractable Likelihood

Authors: Randa Alharbi, Vladislav Vyshemirsky

Abstract:

Systems biology is an important field of science which focuses on studying the behaviour of biological systems. Modelling is required to produce a detailed description of the elements of a biological system, their function, and their interactions. A well-designed model requires selecting a suitable mechanism which can capture the main features of the system, defining its essential components, and representing an appropriate law for the interactions between those components. Complex biological systems exhibit stochastic behaviour; thus, probabilistic models are suitable for describing and analysing them. The Continuous-Time Markov Chain (CTMC) is one such probabilistic model, describing the system as a set of discrete states with continuous-time transitions between them. The system is then characterised by a set of probability distributions that describe the transition from one state to another at a given time. The evolution of these probabilities through time is governed by the chemical master equation, which is analytically intractable but can be simulated. Uncertain parameters of such a model can be inferred using methods of Bayesian inference. Yet, inference in such a complex system is challenging, as it requires the evaluation of the likelihood, which is intractable in most cases. There are different statistical methods that allow simulating from the model despite the intractability of the likelihood. Approximate Bayesian computation (ABC) is a common approach which relies on simulation of the model to approximate the intractable likelihood. Particle Markov chain Monte Carlo (PMCMC) is another approach, based on using sequential Monte Carlo to estimate the intractable likelihood. However, both methods are computationally expensive. In this paper we discuss the efficiency and possible practical issues of each method, taking their computational time into account. We demonstrate likelihood-free inference by analysing a model of the Repressilator using both methods. A detailed investigation is performed to quantify the difference between these methods in terms of efficiency and computational cost.
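The ABC rejection idea described above can be sketched with a toy CTMC; here a pure birth process simulated with the Gillespie algorithm stands in for a model like the Repressilator, and all rates, priors, and tolerances are illustrative assumptions:

```python
import random

def simulate_birth_process(rate, t_end):
    """Gillespie simulation of a pure birth CTMC (a toy
    stand-in for a richer model such as the Repressilator):
    count how many events occur by time t_end."""
    t, n = 0.0, 0
    while True:
        t += random.expovariate(rate)     # exponential waiting time
        if t > t_end:
            return n
        n += 1

def abc_rejection(observed, prior_sample, distance, eps, n_draws):
    """ABC: draw parameters from the prior, simulate data,
    and keep draws whose simulation lies within eps of the
    observation -- no likelihood evaluation needed."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()
        sim = simulate_birth_process(theta, t_end=1.0)
        if distance(sim, observed) <= eps:
            accepted.append(theta)
    return accepted

random.seed(1)
observed = 10                              # pretend data: 10 events by t = 1
post = abc_rejection(observed,
                     prior_sample=lambda: random.uniform(0.0, 30.0),
                     distance=lambda a, b: abs(a - b),
                     eps=2, n_draws=2000)
```

The accepted draws approximate the posterior; tightening `eps` improves the approximation at the cost of more rejected simulations, which is exactly the efficiency trade-off the paper examines.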

Keywords: approximate Bayesian computation (ABC), continuous-time Markov chains, sequential Monte Carlo, particle Markov chain Monte Carlo (PMCMC)

Procedia PDF Downloads 190
32888 The Incidence of Cardiac Arrhythmias Using Trans-Telephonic, Portable Electrocardiography Recorder, in Out-Patients Faculty of Medicine Ramathibodi Hospital

Authors: Urasri Imsomboon, Sopita Areerob, Kanchaporn Kongchauy, Tuchapong Ngarmukos

Abstract:

Objective: Trans-telephonic electrocardiography (ECG) monitoring is used to diagnose infrequent cardiac arrhythmias and improve outcomes through early detection and treatment in suspected cardiac patients. The objectives of this study were to explore the incidence of cardiac arrhythmia using trans-telephonic monitoring and to explore the time to the first symptomatic episode and the first documented cardiac arrhythmia in outpatients. Methods: A descriptive research study was conducted between February 1, 2016, and December 31, 2016. A total of 117 patients who visited the outpatient clinic were purposively selected. The research instruments were a personal data questionnaire and a record form for the incidence of cardiac arrhythmias using the trans-telephonic ECG recorder. Results: The 117 patients were aged between 15-92 years (mean age 52.7 ± 17.1 years), and the majority were women (64.1%). A total of 387 ECGs (average 2.88 ECGs/person, SD = 3.55, range 0-21) were sent to the Cardiac Monitoring Center at the Coronary Care Unit. Of these, normal sinus rhythm was the most common finding (46%). The top 5 cardiac arrhythmias documented at the time of symptoms were: sinus tachycardia 43.5%, premature atrial contraction 17.7%, premature ventricular contraction 14.3%, sinus bradycardia 11.5%, and atrial fibrillation 8.6%. Presenting symptoms were tachycardia 94%, palpitation 83.8%, dyspnea 51.3%, chest pain 19.6%, and syncope 14.5%. The most common activities during symptoms were no activity 64.8%, sleep 55.6%, and work 25.6%. The first symptomatic episode occurred on average after 6.88 ± 7.72 days (median 3 days). The first documented cardiac arrhythmia occurred on average after 9 ± 7.92 days (median 7 days). After patients knew their actual cardiac arrhythmias, treatments were self-observation 68%, continuing the same medications 15%, further investigations (7 patients), and correction of the causes of cardiac arrhythmias via invasive cardiac procedures (5 patients). Conclusion: The trans-telephonic portable ECG recorder is effective in the diagnosis of suspected symptomatic cardiac arrhythmias in the outpatient clinic.

Keywords: cardiac arrhythmias, diagnosis, outpatient clinic, trans-telephonic: portable ECG recorder

Procedia PDF Downloads 180
32887 Time-Dependent Density Functional Theory of an Oscillating Electron Density around a Nanoparticle

Authors: Nilay K. Doshi

Abstract:

A theoretical probe describing the excited energy states of the electron density surrounding a nanoparticle (NP) is presented. An electromagnetic (EM) wave interacts with a NP much smaller than the incident wavelength. The plasmon that oscillates locally around the NP comprises excited conduction electrons. The system is based on the jellium model of a cluster of metal atoms. The Hohenberg-Kohn (HK) equations and the variational Kohn-Sham (KS) scheme have been used to obtain the NP electron density in the ground state. Furthermore, time-dependent density functional theory (TDDFT) is used to treat the excited states within a density functional theory (DFT) framework. The non-interacting fermionic kinetic energy is shown to be a functional of the electron density. The time-dependent potential is written as the sum of the nuclear potential and the incoming EM field. This view of the quantum oscillation of the electron density is a part of the localized surface plasmon resonance.
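For reference, the time-dependent Kohn-Sham equations at the core of TDDFT take the following standard form (textbook notation, not taken from this abstract):

```latex
i\hbar\,\frac{\partial}{\partial t}\,\varphi_j(\mathbf{r},t)
  = \left[-\frac{\hbar^2}{2m}\nabla^2
          + v_{\mathrm{eff}}[n](\mathbf{r},t)\right]\varphi_j(\mathbf{r},t),
\qquad
n(\mathbf{r},t) = \sum_j \lvert\varphi_j(\mathbf{r},t)\rvert^2 ,
```

where the effective potential $v_{\mathrm{eff}} = v_{\mathrm{ext}} + v_{\mathrm{H}}[n] + v_{\mathrm{xc}}[n]$ and, as in the abstract, the external part $v_{\mathrm{ext}}$ is the sum of the nuclear potential and the incoming EM field.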

Keywords: electron density, energy, electromagnetic, DFT, TDDFT, plasmon, resonance

Procedia PDF Downloads 315
32886 Healthy, Breast Fed Bangladeshi Children Can Regulate Their Food Consumption in Each Meal and Feeding Duration When Offered with Varied Energy Density and Feeding Frequency of Complementary Foods

Authors: M. Munirul Islam, Makhduma Khatun M., Janet M. Peerson, Tahmeed Ahmed, M. Abid Hossain Mollah, Kathryn G. Dewey, Kenneth H. Brown

Abstract:

Information is required on the effects of dietary energy density (ED) and feeding frequency (FF) of complementary foods (CF) on food consumption during individual meals and time expended in child feeding. We evaluated the effects of varied ED and FF of CFs on food intake and time required for child feeding during individual meals. During 9 separate, randomly ordered dietary periods lasting 3-6 days each, we measured self-determined intakes of porridges by 18 healthy, breastfed children 8-11 mo old who were fed coded porridges with energy densities of 0.5, 1.0 or 1.5 kcal/g, during 3, 4, or 5 meals/d. CF intake was measured by weighing the feeding bowl before and after every meal. Children consumed greater amounts of CFs per meal when they received diets with lower ED (p = 0.044) and fewer meals per day (p < 0.001). Food intake was less during the first meal of the day than the other meals. Greater time was expended per meal when fewer meals were offered. Time expended per meal did not vary by ED, but the children ate the lower ED diets faster (p = 0.019). Food intake velocity was also greater when more meals were offered per day (p = 0.005). These results provide further evidence of young children’s ability to regulate their energy intakes, even during infancy; and they convey information on factors that affect the amount of time that caregivers must devote to child feeding.

Keywords: complementary foods, energy density, feeding frequency, young children

Procedia PDF Downloads 451
32885 Surface and Bulk Magnetization Behavior of Isolated Ferromagnetic NiFe Nanowires

Authors: Musaab Salman Sultan

Abstract:

The surface and bulk magnetization behavior of template-released isolated ferromagnetic Ni60Fe40 nanowires of relatively large diameter (~200 nm), deposited from a dilute suspension onto pre-patterned insulating chips, has been investigated experimentally using highly sensitive Magneto-Optical Kerr Effect (MOKE) magnetometry and Magneto-Resistance (MR) measurements, respectively. The MR data were consistent with the theoretical predictions of the anisotropic magneto-resistance (AMR) effect. The MR measurements, at all angles of investigation, showed large features and a series of non-monotonic "continuous small features" in the resistance profiles. The switching fields extracted from these features and from the MOKE loops were compared with each other and with the switching fields reported in the literature for the same analytical techniques applied to nanowires of similar composition and dimensions. A large difference between the MOKE and MR measurements was noticed. This disparity is attributed to the variance between the micro-magnetic structure of the surface and that of the bulk of such ferromagnetic nanowires. This result was ascertained using micro-magnetic simulations on individual NiFe nanowires of cylindrical and rectangular cross sections, with the same diameter/thickness as the experimental wires, using the Object Oriented Micromagnetic Framework (OOMMF) package. The simulated loops showed different switching events, indicating that such wires have different magnetic states in the reversal process and that the micro-magnetic spin structures during switching are complicated. These results further support the difference between surface and bulk magnetization behavior in these nanowires. This work suggests that a combination of MOKE and MR measurements is required to fully understand the magnetization behavior of such relatively thick, isolated, cylindrical ferromagnetic nanowires.

Keywords: MOKE magnetometry, MR measurements, OOMMF package, micromagnetic simulations, ferromagnetic nanowires, surface magnetic properties

Procedia PDF Downloads 239
32884 On-Chip Sensor Ellipse Distribution Method and Equivalent Mapping Technique for Real-Time Hardware Trojan Detection and Location

Authors: Longfei Wang, Selçuk Köse

Abstract:

Hardware Trojans have become a great concern as integrated circuit (IC) technology advances and not all manufacturing steps of an IC are accomplished within one company. Real-time hardware Trojan detection has proven to be a feasible way to detect randomly activated Trojans that cannot be detected at the testing stage. On-chip sensors are a great candidate for implementing real-time hardware Trojan detection; however, the optimization of on-chip sensors has not been thoroughly investigated, and the location of the Trojan has not been carefully explored. An on-chip sensor ellipse distribution method and an equivalent mapping technique are proposed in this paper, based on the characteristics of the on-chip power delivery network, to address the optimization and distribution of on-chip sensors for real-time hardware Trojan detection as well as to estimate the location and current consumption of a hardware Trojan. Simulation results verify that hardware Trojan activation can be effectively detected and the location of a hardware Trojan can be efficiently estimated with less than 5% error for a realistic power grid using the proposed methods. The proposed techniques therefore lay a solid foundation for isolation and even deactivation of hardware Trojans through accurate location of Trojans.

Keywords: hardware trojan, on-chip sensor, power distribution network, power/ground noise

Procedia PDF Downloads 378
32883 Burnback Analysis of Star Grain Using Level-Set Technique

Authors: Ali Yasin, Ali Kamran, Muhammad Safdar

Abstract:

In order to reduce the hefty cost involved in terms of time and project cost, the development and application of advanced numerical tools to address the burn-back analysis problem in solid rocket motor design and development is the need of the hour. Several advanced numerical schemes have been developed in recent times, but their usage in the design of propellant grain for solid rocket motors is very rare. In this paper, an advanced numerical technique, the level-set method, has been utilized for the burn-back analysis of a star grain to study the effect of geometrical parameters on ballistic performance indicators such as solid loading, neutrality, and sliver percentage. In the level-set technique, simple finite difference methods may fail quickly and require more sophisticated non-oscillatory schemes for feasible long-time simulation. For internal ballistic calculations, a simplified equilibrium pressure method is utilized. Preliminary results for the operating conditions, over the whole combustion time, of the star grain burn-back using the level-set technique are compared with published results using a CAD technique to test the developed numerical model.
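A minimal sketch of the level-set evolution underlying such a burn-back analysis, assuming a uniform burn rate and using a circular port instead of a star cross-section for brevity (the equation solved is phi_t + r*|grad phi| = 0 with a first-order Godunov upwind scheme; grid sizes and rates are illustrative):

```python
import numpy as np

def burn_back_level_set(phi, rate, dx, t_end):
    """Evolve phi_t + rate*|grad phi| = 0 with a first-order
    Godunov upwind scheme (outward-moving front, rate > 0)."""
    dt = 0.5 * dx / rate                          # CFL-safe time step
    steps = int(t_end / dt)
    for _ in range(steps):
        dxm = (phi - np.roll(phi, 1, axis=0)) / dx    # backward diff in x
        dxp = (np.roll(phi, -1, axis=0) - phi) / dx   # forward diff in x
        dym = (phi - np.roll(phi, 1, axis=1)) / dx
        dyp = (np.roll(phi, -1, axis=1) - phi) / dx
        grad = np.sqrt(np.maximum(dxm, 0)**2 + np.minimum(dxp, 0)**2 +
                       np.maximum(dym, 0)**2 + np.minimum(dyp, 0)**2)
        phi = phi - dt * rate * grad
    return phi

# Circular port as a stand-in for a star grain cross-section.
n, L = 200, 2.0
x = np.linspace(-L, L, n)
X, Y = np.meshgrid(x, x, indexing="ij")
dx = x[1] - x[0]
phi0 = np.sqrt(X**2 + Y**2) - 0.5                 # signed distance, R0 = 0.5
phi = burn_back_level_set(phi0, rate=1.0, dx=dx, t_end=0.5)
burned_area = np.sum(phi < 0) * dx * dx           # port grows from R=0.5 toward R=1.0
```

In a real burn-back study the zero level set of `phi` at each step gives the burning perimeter, from which solid loading, neutrality, and sliver fraction can be tabulated against web distance.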

Keywords: solid rocket motor, internal ballistic, level-set technique, star grain

Procedia PDF Downloads 110
32882 Barriers to the Implementation of Peace Education in Secondary Schools, South Africa

Authors: Ntokozo Dennis Ndwandwe

Abstract:

The aim of the study was to explore the barriers facing the implementation of peace education as a strategy to combat violence in selected secondary schools in the Western Cape Province of South Africa. The problem that motivated this enquiry was the absence of stable peace and the increase in incidents of violence in schools. A qualitative approach was followed, and a small sample of three case-study secondary schools was used. Data collection methods consisted of semi-structured interviews, focus group interviews, and observation. The participants consisted of the programme manager of the Quaker for Peace Centre (QPC), three principals, nine teachers, and fifteen learners. Data were analysed by transcribing, organising, marking by hand, and coding, which produced labels that allowed key points to be highlighted. Findings revealed that the effective implementation of peace education was constrained by factors such as financial constraints, inadequate time allocation, lack of parental involvement, overworked teachers, negative attitudes, and other societal influences. It is recommended that teachers receive ongoing training for peace education and that the government prioritise and provide funds for it. In addition, parental involvement should be improved in order to enhance the implementation of peace education in the selected secondary schools.

Keywords: barriers, implementation, conflict, peace, peace education, conflict resolution, violence

Procedia PDF Downloads 183
32881 Weapon-Being: Weaponized Design and Object-Oriented Ontology in Hypermodern Times

Authors: John Dimopoulos

Abstract:

This proposal attempts a refabrication of Heidegger’s classic thing-being and object-being analysis in order to provide better ontological tools for understanding contemporary culture, technology, and society. In his work, Heidegger sought to understand and comment on the problem of technology in an era of rampant innovation and increased perils for society and the planet. Today we seem to be at another crossroads in this course, coming after postmodernity, during which dreams and dangers of modernity augmented with critical speculations of the post-war era take shape. The new era which we are now living in, referred to as hypermodernity by researchers in various fields such as architecture and cultural theory, is defined by the horizontal implementation of digital technologies, cybernetic networks, and mixed reality. Technology today is rapidly approaching a turning point, namely the point of no return for humanity’s supervision over its creations. The techno-scientific civilization of the 21st century creates a series of problems, progressively more difficult and complex to solve and impossible to ignore, climate change, data safety, cyber depression, and digital stress being some of the most prevalent. Humans often have no other option than to address technology-induced problems with even more technology, as in the case of neuron networks, machine learning, and AI, thus widening the gap between creating technological artifacts and understanding their broad impact and possible future development. As all technical disciplines and particularly design, become enmeshed in a matrix of digital hyper-objects, a conceptual toolbox that allows us to handle the new reality becomes more and more necessary. Weaponized design, prevalent in many fields, such as social and traditional media, urban planning, industrial design, advertising, and the internet in general, hints towards an increase in conflicts. 
These conflicts between tech companies, stakeholders, and users, with implications for politics, work, education, and production, as apparent in the cases of the Amazon workers' strikes, Donald Trump's 2016 campaign, the Facebook and Microsoft data scandals, and more, are often non-transparent to the wider public, thus consolidating new elites and technocratic classes and making the public scene less and less democratic. The new category proposed, weapon-being, is outlined with respect to the basic function of reducing complexity, subtracting materials, actants, and parameters, not strictly in favor of a humanistic re-orientation but within a more inclusive ontology of objects and subjects. Utilizing insights of Object-Oriented Ontology (OOO) and its schematization of technological objects, an outline for a radical ontology of technology is approached.

Keywords: design, hypermodernity, object-oriented ontology, weapon-being

Procedia PDF Downloads 140
32880 The Introduction of Modern Diagnostic Techniques and Its Impact on Local Garages

Authors: Mustapha Majid

Abstract:

Gone are the days when technicians and mechanics had to spend too much time trying to identify a mechanical fault and rectify the problem. Now the emphasis is on the use of automobile diagnostic equipment through computers and special software. An investigation was conducted in the Tamale Metropolis and Accra, in the Northern and Greater Accra regions of Ghana, respectively. The methods for data gathering were questionnaires, physical observation, interviews, and newspapers. The study revealed that the majority of mechanics lack the computer skills that would enable them to use diagnostic tools such as exhaust gas analyzers, scan tools, electronic wheel balancing machines, etc.

Keywords: diagnosing, local garages and modern garages, lack of knowledge of diagnosing posing an existential threat, training of local mechanics

Procedia PDF Downloads 143
32879 An Empirical Investigation of the Challenges of Secure Edge Computing Adoption in Organizations

Authors: Hailye Tekleselassie

Abstract:

Edge computing is a distributed computing paradigm that brings enterprise applications closer to data sources such as IoT devices or local edge servers, and possible security incidents could hinder the adoption of this new technology. This investigation examined the awareness of information technology and communications workers, and of the computer users who rely on cloud services. Surveys were used to achieve these objectives, and questions about trust were a key component. Problems such as data privacy, integrity, and availability are the factors affecting organizations' acceptance of secure edge computing.

Keywords: IoT, data, security, edge computing

Procedia PDF Downloads 73
32878 Multi Tier Data Collection and Estimation, Utilizing Queue Model in Wireless Sensor Networks

Authors: Amirhossein Mohajerzadeh, Abolghasem Mohajerzadeh

Abstract:

In this paper, a target parameter is estimated with the desired precision in hierarchical wireless sensor networks (WSNs), while the proposed algorithm also tries to prolong network lifetime as much as possible using an efficient data collecting algorithm. The distribution function of the target parameter is considered unknown. Sensor nodes sense the environment and send the data to the base station, called the fusion center (FC), using a hierarchical data collecting algorithm. The FC reconstructs the underlying phenomenon from the collected data. Considering the aggregation level x, the goal is to provide the essential infrastructure to find the best value of the aggregation level in order to prolong network lifetime as much as possible while the desired accuracy is guaranteed (the required sample size is fully dependent on the desired precision). First, the sample size calculation algorithm is discussed; second, the average queue length based on the M/M[x]/1/K queue model is determined and used for the energy consumption calculation. Nodes can decrease transmission cost by aggregating incoming data. Furthermore, the performance of the new algorithm is evaluated in terms of lifetime and estimation accuracy.
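As a simplified illustration of the queue-model step (using the plain M/M/1/K model rather than the batch-service M/M[x]/1/K model of the paper), the average number in the system can be computed directly from the truncated geometric state probabilities:

```python
def mm1k_average_number(rho, K):
    """Average number in an M/M/1/K queue, computed directly
    from the state probabilities p_n = rho**n * p0 for
    n = 0..K (rho = arrival rate / service rate)."""
    weights = [rho**n for n in range(K + 1)]
    p0 = 1.0 / sum(weights)                  # normalization constant
    return sum(n * w * p0 for n, w in enumerate(weights))

# For large K and rho < 1 this approaches the M/M/1 result rho/(1-rho).
avg = mm1k_average_number(rho=0.5, K=50)
```

The batch-service variant used in the paper changes the service discipline (up to x queued packets aggregated per service) but the same idea applies: the stationary queue-length distribution feeds the per-node energy consumption estimate.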

Keywords: aggregation, estimation, queuing, wireless sensor network

Procedia PDF Downloads 171
32877 Risk and Emotion: Measuring the Effect of Emotion and Other Visceral Factors on Decision Making under Risk

Authors: Michael Mihalicz, Aziz Guergachi

Abstract:

Background: The science of modelling choice preferences has evolved over centuries into an interdisciplinary field contributing to several branches of Microeconomics and Mathematical Psychology. Early theories in Decision Science rested on the logic of rationality, but as it and related fields matured, descriptive theories emerged capable of explaining systematic violations of rationality through cognitive mechanisms underlying the thought processes that guide human behaviour. Cognitive limitations are not, however, solely responsible for systematic deviations from rationality, and many researchers are now exploring visceral factors as the more dominant drivers. The current study builds on the existing literature by exploring sleep deprivation, thermal comfort, stress, hunger, fear, anger, and sadness as moderators of three distinct elements that define individual risk preference under Cumulative Prospect Theory. Methodology: This study is designed to compare the risk preferences of participants experiencing an elevated affective or visceral state to those in a neutral state, using nonparametric elicitation methods across three domains. Two experiments will be conducted simultaneously using different methodologies. The first will sample visceral states and risk preferences at random times over a two-week period by prompting participants to complete an online survey remotely. In each round of questions, participants will be asked to self-assess their current state using Visual Analogue Scales before answering a series of lottery-style elicitation questions. The second experiment will be conducted in a laboratory setting using psychological primes to induce a desired state. In this experiment, emotional states will be recorded using emotion analytics and used as a basis for comparison between the two methods.
Significance: The expected results include a series of measurable and systematic effects on the subjective interpretations of gamble attributes and evidence supporting the proposition that a portion of the variability in human choice preferences unaccounted for by cognitive limitations can be explained by interacting visceral states. Significant results will promote awareness about the subconscious effect that emotions and other drive states have on the way people process and interpret information, and can guide more effective decision making by informing decision-makers of the sources and consequences of irrational behaviour.
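The Cumulative Prospect Theory machinery referenced above can be sketched with the standard Tversky-Kahneman functional forms; the parameter values below are the commonly cited 1992 estimates, not values from this study:

```python
def weight(p, gamma=0.61):
    """Inverse-S probability weighting function: overweights
    small probabilities, underweights moderate-to-large ones."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

def value(x, alpha=0.88, lam=2.25):
    """CPT value function: concave for gains, convex and
    steeper (loss aversion factor lam) for losses."""
    return x**alpha if x >= 0 else -lam * (-x)**alpha

def cpt_binary_gamble(x, p):
    """CPT valuation of the simplest gamble: win x with
    probability p, nothing otherwise (gains domain only)."""
    return weight(p) * value(x)

v = cpt_binary_gamble(100, 0.5)   # below the expected value of 50
```

Visceral-state moderation, as proposed in the study, would amount to letting parameters like `alpha`, `gamma`, or `lam` shift with the measured affective state.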

Keywords: decision making, emotions, prospect theory, visceral factors

Procedia PDF Downloads 138
32876 Virtual Metrology for Copper Clad Laminate Manufacturing

Authors: Misuk Kim, Seokho Kang, Jehyuk Lee, Hyunchang Cho, Sungzoon Cho

Abstract:

In semiconductor manufacturing, virtual metrology (VM) refers to methods for predicting properties of a wafer based on machine parameters and sensor data from the production equipment, without performing the (costly) physical measurement of the wafer properties (Wikipedia). Additional benefits include avoidance of human bias and identification of important factors affecting the quality of the process, which allow improving process quality in the future. It is, however, rare to find VM applied to other areas of manufacturing. In this work, we propose to apply VM to copper clad laminate (CCL) manufacturing. CCL is a core element of a printed circuit board (PCB), which is used in smartphones, tablets, digital cameras, and laptop computers. The manufacturing of CCL consists of three processes: treating, lay-up, and pressing. Treating, the most important of the three, puts resin on glass cloth and heats it in a drying oven, producing prepreg for the lay-up process. In this process, three important quality factors are inspected: treated weight (T/W), minimum viscosity (M/V), and gel time (G/T). They are inspected manually, incurring heavy cost in terms of time and money, which makes the process a good candidate for VM application. We developed prediction models of the three quality factors T/W, M/V, and G/T, respectively, from process variables, raw material variables, and environment variables. The actual process data were obtained from a CCL manufacturer. A variety of variable selection methods and learning algorithms were employed to find the best prediction model. We obtained prediction models of M/V and G/T with high enough accuracy. They also provided us with information on "important" predictor variables, some of which the process engineers had already been aware of and the rest of which they had not. They were quite excited to find the new insights that the models revealed and set out to do further analysis to derive process control implications. T/W turned out not to be predictable with reasonable accuracy from the given factors. This very fact indicates that the factors currently monitored may not affect T/W; thus an effort has to be made to find other factors which are not currently monitored in order to understand the process better and improve its quality. In conclusion, the VM application to CCL's treating process was quite successful. The newly built quality prediction models allowed one to reduce the cost associated with actual metrology as well as reveal insights on the factors affecting the important quality factors and on the level of our less than perfect understanding of the treating process.
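A minimal sketch of the virtual-metrology idea, fitting a linear model on synthetic stand-ins for the treating-process variables; variable roles and coefficients are illustrative, not the manufacturer's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for process variables (e.g. oven
# temperature, line speed, resin viscosity); the third column
# is deliberately irrelevant to the quality factor, mimicking
# an "unimportant" predictor. The target y plays the role of a
# quality factor such as gel time (G/T).
n = 200
X = rng.normal(size=(n, 3))
y = 1.5 * X[:, 0] - 0.8 * X[:, 1] + 0.1 * rng.normal(size=n)

# Fit a linear virtual-metrology model by least squares.
A = np.column_stack([X, np.ones(n)])          # add intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

pred = A @ coef
r2 = 1 - np.sum((y - pred)**2) / np.sum((y - np.mean(y))**2)
```

A near-zero fitted coefficient on an input is the linear-model analogue of the paper's finding on T/W: if no monitored variable carries signal, the model cannot predict the quality factor, pointing to unmonitored factors.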

Keywords: copper clad laminate, predictive modeling, quality control, virtual metrology

Procedia PDF Downloads 342
32875 Sexual Health And Male Fertility: Improving Sperm Health With Focus On Technology

Authors: Diana Peninger

Abstract:

Over 10% of couples in the U.S. have infertility problems, with roughly 40% traceable to the male partner. Yet, little attention has been given to improving men's contribution to the conception process. One solution that is showing promise in increasing conception rates for IVF and other assisted reproductive technology treatments is a first-of-its-kind semen collection device engineered to mitigate the sperm damage caused by traditional collection methods. Patients are able to collect semen at home and deliver it to clinics within 48 hours for use in fertility analysis and treatment, with less stress and improved specimen viability. This abstract shares these findings along with expert insight and tips to help attendees understand the key role sperm collection plays in addressing and treating reproductive issues while helping to improve patient outcomes and success. Our research aimed to determine whether male reproductive outcomes can be increased by improving sperm specimen health, with a focus on technology. We utilized a redesigned semen collection cup (patented as the Device for Improved Semen Collection/DISC, U.S. Patent 6864046, known commercially as ProteX) that met a series of physiological parameters. Previous research demonstrated significant improvement in semen parameters (forward motility, progression, viability, and longevity) and overall sperm biochemistry when the DISC is used for collection. Animal studies have also shown dramatic increases in pregnancy rates. Our current study compares samples collected in the DISC, the next-generation DISC (DISCng), and a standard specimen cup (SSC): dry, with a measured 1 mL of media, and with media in excess (5 mL). Both human and animal testing are included. With sperm counts declining at alarming rates due to environmental, lifestyle, and other health factors, accurate evaluations of sperm health are critical to understanding reproductive health and the origins and treatments of infertility. An increase in sperm health, as measured by extensive semen parameter analysis, was demonstrated, with semen parameters remaining stable for 48 hours, expanding the processing window from 1 hour to 48 hours.

Keywords: reproductive, sperm, male, infertility

Procedia PDF Downloads 118
32874 Extraction of Natural Colorant from the Flowers of Flame of Forest Using Ultrasound

Authors: Sunny Arora, Meghal A. Desai

Abstract:

With the impetus towards green consumerism and the implementation of sustainable techniques, the consumption of natural products and the utilization of environment-friendly techniques have gained accelerated acceptance. Butein, a natural colorant, has many medicinal properties apart from its use in the dyeing industry. Extraction of butein from the flowers of the flame of the forest was carried out using an ultrasonication bath. Solid loading (2-6 g), extraction time (30-50 min), volume of solvent (30-50 mL), and type of solvent (methanol, ethanol, and water) were studied to maximize the yield of butein using the Taguchi method. The highest yield of butein, 4.67% (w/w), was obtained using 4 g of plant material, 40 min of extraction time, and 30 mL of methanol as the solvent. The present method provided a large reduction in extraction time compared to the conventional method of extraction. Hence, the outcome of the present investigation could be used to develop the method at a larger scale.
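The Taguchi main-effects analysis behind such an optimization can be sketched as follows; the L9 orthogonal array covers three factors at three levels (as in the study: solid loading, time, solvent volume), but the yield values below are illustrative, not the measured data:

```python
# L9 orthogonal array for three factors at three levels each,
# levels indexed 0..2 -- e.g. solid loading {2,4,6 g},
# time {30,40,50 min}, solvent volume {30,40,50 mL}.
L9 = [(0, 0, 0), (0, 1, 1), (0, 2, 2),
      (1, 0, 1), (1, 1, 2), (1, 2, 0),
      (2, 0, 2), (2, 1, 0), (2, 2, 1)]

def taguchi_best_levels(runs, responses, n_levels=3):
    """Main-effects analysis: for each factor, average the
    response over the runs at each level and pick the level
    with the highest mean (larger-the-better criterion)."""
    n_factors = len(runs[0])
    best = []
    for f in range(n_factors):
        means = [sum(r for run, r in zip(runs, responses) if run[f] == lv)
                 / sum(1 for run in runs if run[f] == lv)
                 for lv in range(n_levels)]
        best.append(max(range(n_levels), key=means.__getitem__))
    return tuple(best)

# Illustrative yields, constructed to peak at loading level 1,
# time level 2, volume level 0 (mirroring the 4 g / 40 min /
# 30 mL optimum reported, under a different level coding).
yields = [2.5, 1.0, 2.0, 3.0, 3.0, 5.5, 1.0, 2.5, 2.0]
best = taguchi_best_levels(L9, yields)
```

Because the array is orthogonal, each factor's levels appear equally often against every level of the other factors, so these per-level means isolate the main effect of each factor with only 9 of the 27 possible runs.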

Keywords: butein, flowers of Flame of the Forest, Taguchi method, ultrasonic bath

Procedia PDF Downloads 460
32873 Research on Morning Commuting Behavior under Autonomous Vehicle Environment Based on Activity Method

Authors: Qing Dai, Zhengkui Lin, Jiajia Zhang, Yi Qu

Abstract:

Based on the activity method, this paper focuses on morning commuting behavior when commuters travel in autonomous vehicles (AVs). Firstly, a net utility function of commuters is constructed from the activity utility of commuters at home, in the car and at the workplace, minus the disutility of travel time cost and of schedule delay cost. This net utility function is then applied to build an equilibrium model. Finally, under the assumption of constant marginal activity utility, the properties of the equilibrium are analyzed. The results show that, with autonomous driving, the starting and ending times of the morning peak and the numbers of commuters who arrive early and late at the workplace are the same as with manual driving. With autonomous driving, however, the departure rate of commuters arriving early at the workplace is higher than with manual driving, while the departure rate of those arriving late is just the opposite. In addition, compared with manual driving, the departure time for arriving at the workplace on time is earlier and the number of vehicles queuing at the bottleneck is larger with autonomous driving. Nevertheless, the net utility of commuters and the total net utility of the system with autonomous driving are greater than those with manual driving.
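
The net utility function described above can be sketched in the standard bottleneck-model form; the symbols below are illustrative assumptions, not the paper's own notation:

```latex
% Departure at time t, arrival at t + T(t); t^* is the official work start.
U(t) = \int_{0}^{t} u_h(s)\,\mathrm{d}s             % activity utility at home
     + \int_{t}^{t+T(t)} u_c(s)\,\mathrm{d}s        % activity utility in the car
     + \int_{t+T(t)}^{\bar{t}} u_w(s)\,\mathrm{d}s  % activity utility at work
     - \alpha\, T(t)                                % travel time cost
     - \beta \max\{0,\; t^{*} - t - T(t)\}          % schedule delay cost (early)
     - \gamma \max\{0,\; t + T(t) - t^{*}\}         % schedule delay cost (late)
```

Here α, β and γ are the unit costs of travel time, early arrival and late arrival (β < α < γ in the classical Vickrey setting), and user equilibrium requires that no commuter can increase U(t) by unilaterally changing the departure time t.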

Keywords: autonomous cars, bottleneck model, activity utility, user equilibrium

Procedia PDF Downloads 98
32872 Multi-Objective Optimization for the Green Vehicle Routing Problem: Approach to Case Study of the Newspaper Distribution Problem

Authors: Julio C. Ferreira, Maria T. A. Steiner

Abstract:

The aim of this work is to present a solution procedure, referred to here as Multi-objective Optimization for the Green Vehicle Routing Problem (MOOGVRP), and to apply it to a case study. The proposed methodology consists of three stages to resolve Scenario A. Stage 1 consists of the “treatment” of the data; Stage 2 applies mathematical models of the p-Median Capacitated Problem (with the objectives of minimizing distances and homogenizing demands between groups) and the Asymmetric Traveling Salesman Problem (with the objectives of minimizing distances and minimizing time). The weighted-sum method was used as the multi-objective procedure. In Stage 3, the results are analyzed, taking into consideration the environmental aspects of the case study, more specifically fuel consumption and air pollutant emissions. This methodology was applied to a (partial) database for newspaper distribution in the municipality of Curitiba, Paraná State, Brazil. The preliminary findings for Scenario A showed that it was possible to improve the distribution of the load and to reduce the mileage and greenhouse gas emissions by 17.32% and the journey time by 22.58% in comparison with the current scenario. Future work will use other multi-objective techniques and an expanded version of the database and will explore the triple bottom line of sustainability.
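
The weighted-sum procedure mentioned above can be illustrated with a minimal scalarization sketch; the normalized objective values and weights are invented for illustration, not taken from the case study:

```python
# Weighted-sum scalarization: combine normalized objectives f_i into a single
# score sum(w_i * f_i) with weights summing to one, then minimize the score.
def weighted_sum(objectives, weights):
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * f for w, f in zip(weights, objectives))

# Two hypothetical candidate routes, scored on (normalized distance, time).
route_a = (0.80, 0.40)   # longer distance, shorter time
route_b = (0.55, 0.70)   # shorter distance, longer time
w = (0.5, 0.5)           # equal importance to both objectives

best = min([route_a, route_b], key=lambda f: weighted_sum(f, w))
print(best)  # → (0.8, 0.4): route_a wins under equal weights
```

Sweeping the weight vector w over the simplex and re-solving traces out (supported) Pareto-optimal trade-offs between the objectives.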

Keywords: Asymmetric Traveling Salesman Problem, Green Vehicle Routing Problem, Multi-objective Optimization, p-Median Capacitated Problem

Procedia PDF Downloads 97
32871 Inversion of Electrical Resistivity Data: A Review

Authors: Shrey Sharma, Gunjan Kumar Verma

Abstract:

High-density electrical prospecting has been widely used in groundwater investigation, civil engineering and environmental surveys. For efficient inversion, the forward modeling routine, sensitivity calculation and inversion algorithm must all be efficient. This paper attempts to provide a brief summary of the past and ongoing developments of the method. It includes reviews of the procedures used for data acquisition, processing and inversion of electrical resistivity data, based on a compilation of the academic literature. In recent times there has been a significant evolution in field survey design and data inversion techniques for the resistivity method. In general, 2-D inversion of resistivity data is carried out using the linearized least-squares method with a local optimization technique. Multi-electrode and multi-channel systems have made it possible to conduct large 2-D, 3-D and even 4-D surveys efficiently and to resolve complex geological structures that were not possible with traditional 1-D surveys. 3-D surveys play an increasingly important role in very complex areas where 2-D models suffer from artifacts due to off-line structures. Continued developments in computation technology, as well as fast data inversion techniques and software, have made it possible to use optimization techniques to obtain model parameters to a higher accuracy. A brief discussion of the limitations of the electrical resistivity method is also presented.
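
The linearized least-squares update at the heart of such inversions can be sketched on a toy problem. The tiny dense system below is illustrative only: real resistivity codes work with large sparse Jacobians and smoothness regularization rather than identity damping, and iterate the update on a nonlinear forward model.

```python
# Damped (regularized) linearized least-squares, the core update behind 2-D
# resistivity inversion: m = (J^T J + lam*I)^(-1) J^T d.

def solve2(A, b):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - b[1] * A[0][1]) / det,
            (A[0][0] * b[1] - A[1][0] * b[0]) / det]

def damped_lsq(J, d, lam):
    # Normal equations: (J^T J + lam*I) m = J^T d.
    JtJ = [[sum(J[k][i] * J[k][j] for k in range(len(J)))
            + (lam if i == j else 0.0) for j in range(2)] for i in range(2)]
    Jtd = [sum(J[k][i] * d[k] for k in range(len(J))) for i in range(2)]
    return solve2(JtJ, Jtd)

# Three synthetic data points, two model parameters (hypothetical values).
J = [[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]]
d = [1.0, 3.0, 2.0]
m = damped_lsq(J, d, lam=0.0)   # undamped: exact least-squares fit
print(m)  # → [1.0, 2.0] for this consistent system
```

A positive damping factor lam trades data fit for stability, which is what keeps the inversion well-posed when the Jacobian is poorly conditioned.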

Keywords: inversion, limitations, optimization, resistivity

Procedia PDF Downloads 347
32870 A Method to Identify the Critical Delay Factors for Building Maintenance Projects of Institutional Buildings: Case Study of Eastern India

Authors: Shankha Pratim Bhattacharya

Abstract:

In general, building repair and renovation projects are minor in nature and require less attention, as the primary cost involvement is relatively small. Although building repair and maintenance projects look simple, they involve much complexity during execution. Much of the existing research indicates that uncertain situations are usually linked with maintenance projects; these may not be read properly at the planning stage and finally lead to time overrun. Building repair and maintenance become essential and periodic after commissioning of the building. In institutional buildings, the regular maintenance projects also include addition, alteration and modification activities. Increases in student admission, new departments and sections, new laboratories and workshops, and the upgrading of existing laboratories are very common in institutional buildings in developing nations like India. Such projects become very critical because they involve space problems, architectural design issues, structural modification, etc. One of the prime constraints in institutional building maintenance and modification projects is time: mostly, the work must be executed within a specific non-working period. The present research considered only institutional buildings in the eastern part of India to analyse repair and maintenance project delay. A general survey was conducted among technical institutes to find the causes and corresponding nature of construction delay factors. Five technical institutes are considered in the present study, with repair, renovation, modification and extension types of projects. Construction delay factors are categorically subdivided into four groups, namely material, manpower (workers), contract and site. The survey data were collected on the nature of delay responsible for a specific project and on the absolute amount of delay, through the proposed and actual durations of work.
In the first stage of the paper, a relative importance index (RII) is proposed for the delay factors. The occurrence of the delay factors is also judged by its frequency-severity nature. The delay factors are then rated and linked with the type of work. In the second stage, a regression analysis is executed to establish an empirical relationship between the actual duration of a project and the percentage of delay; it also indicates the impact of the factors responsible for delay. Ultimately, the present paper makes an effort to identify the critical delay factors for repair and renovation projects in Eastern Indian institutional buildings.
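
A minimal sketch of how a relative importance index is typically computed from survey ratings, RII = ΣW / (A × N), where W is each respondent's rating, A the highest possible rating and N the number of respondents; the factor names and ratings below are hypothetical, not the paper's survey data:

```python
# RII = sum of respondent ratings / (highest rating * number of respondents);
# values closer to 1 indicate a more important delay factor.
def rii(ratings, highest=5):
    return sum(ratings) / (highest * len(ratings))

# Hypothetical 5-point ratings from five respondents for three delay factors.
survey = {
    "material delivery delay": [5, 4, 4, 5, 3],
    "labour shortage":         [3, 3, 4, 2, 3],
    "site access restriction": [4, 5, 5, 4, 4],
}

ranked = sorted(survey, key=lambda f: rii(survey[f]), reverse=True)
for factor in ranked:
    print(f"{factor}: RII = {rii(survey[factor]):.2f}")
```

Ranking the factors by RII is what lets the study single out the critical delay factors before feeding the delay percentages into the regression stage.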

Keywords: delay factor, institutional building, maintenance, relative importance index, regression analysis, repair

Procedia PDF Downloads 239
32869 Intraventricular Hemorrhage Caused by Subarachnoid Hemorrhage; When Time Is Life

Authors: Devieta Romadhon Saendardy

Abstract:

Introduction: Aneurysmal subarachnoid hemorrhage (SAH) is associated with intraventricular hemorrhage (IVH) in many ways. In general, anterior communicating artery and posterior circulation aneurysms cause intraventricular hemorrhage. The development of intraventricular hemorrhage (IVH) in aneurysmal subarachnoid hemorrhage (aSAH) is linked with higher mortality and poor neurological recovery. Case: This case report presents a 51-year-old female patient who developed IVH following SAH. The patient's Glasgow Coma Scale score was 14, she had a severe headache, and there was a right-sided hemiparesis. A non-contrast head CT scan revealed a massive intraventricular hemorrhage. Within an hour, the patient's headache and paresis worsened. Discussion: Intraventricular hemorrhage is a serious complication of subarachnoid hemorrhage, necessitating prompt recognition and management. This case highlights the importance of time management, medical management and surgical intervention to optimize outcomes in patients with intraventricular hemorrhage caused by subarachnoid hemorrhage. Placement of a shunt system improves clinical outcome in intraventricular hemorrhage.

Keywords: intraventricular hemorrhage, subarachnoid hemorrhage, shunt, time

Procedia PDF Downloads 55
32868 Temporality, Place and Autobiography in J.M. Coetzee’s 'Summertime'

Authors: Barbara Janari

Abstract:

In this paper it is argued that the effect of the disjunctive temporality in Summertime (the third of J.M. Coetzee’s fictionalised memoirs) is twofold: firstly, it reflects the memoir’s ambivalent, contradictory representations of place in order to emphasize the fractured sense of self that growing up in South Africa during apartheid entailed for Coetzee. Secondly, it reconceives autobiographical discourse as one that foregrounds the inherent fictionality of all texts. The memoir’s narrative is filtered through intricate textual strategies that disrupt the chronological movement of the narrative, evoking the labyrinthine ways in which past and present intersect and interpenetrate each other. It is framed by entries from Coetzee’s Notebooks: it opens with entries that cover the years 1972–1975, and ends with a number of undated fragments from his Notebooks. Most of the entries include a short ‘memo’ at the end, added between 1999 and 2000. While the memos follow the Notebook entries in the text, they are separated by decades. Between the Notebook entries is a series of interviews conducted by Vincent, the text’s putative biographer, between 2007 and 2008, based on recollections from five people who had known Coetzee in the 1970s – a key period in John’s life, as it marks both his return to South Africa after a failed emigration attempt to America and the beginning of his writing career, with the publication of Dusklands in 1974. The relationship between the memoir’s various parts is a key feature of Coetzee’s representation of place in Summertime, which is constructed as a composite one in which the principle of reflexive referencing has to be adopted. In other words, readers have to suspend individual references temporarily until the relationships between the parts have been connected to each other. In order to apprehend meaning in the text, the disparate narrative elements have to first be tied together.
In this text, then, the experience of time as ordered and chronological is ruptured. Instead, the memoir’s themes and patterns become apparent most clearly through reflexive referencing, by which relationships between disparate sections of the text are linked. The image of the fictional John that emerges from the text is a composite of this John and the author, J.M. Coetzee, and is one which embodies Coetzee’s often fraught relationship with his home country, South Africa.

Keywords: autobiography, place, reflexive referencing, temporality

Procedia PDF Downloads 56
32867 Vocational and Technical Educators’ Acceptance and Use of Digital Learning Environments Beyond Working Hours: Implications for Work-Life Balance and the Role of Integration Preference

Authors: Jacinta Ifeoma Obidile

Abstract:

Teachers (vocational and technical educators included) use Information and Communications Technology (ICT) for tasks outside of their normal working hours. This expansion of work duties into non-work time challenges their work-life balance. However, results on how these relationships correlate have been inconsistent, which calls for further research to examine the moderating mechanisms of such relationships. The present study therefore ascertained how vocational and technical educators' technology acceptance relates to their work-related ICT use beyond working hours and to their work-life balance, as well as how their integration preference affects these relationships. The population of the study comprised 320 vocational and technical educators from the Southeast geopolitical zone of Nigeria. Data were collected from the respondents using a structured questionnaire, which was validated by three experts. The reliability of the instrument was established using 20 vocational and technical educators from the South who were not part of the population; an overall reliability coefficient of 0.81 was obtained using Cronbach's alpha method. The data collected were analyzed using structural equation modeling. Findings, among others, revealed that vocational and technical educators' work-life balance was mediated by increased digital learning environment use after work hours, although reduced by social influence.

Keywords: vocational and technical educators, digital learning environment, working hours, work-life balance, integration preference

Procedia PDF Downloads 37
32866 Assessment and Evaluation of Traffic Noise in Selected Government Healthcare Facilities at Birnin Kebbi, Kebbi State-Nigeria

Authors: Muhammad Naziru Yahaya, Buhari Samaila, Nasiru Abubakar

Abstract:

Noise pollution caused by vehicular movement in urban cities has reached alarming proportions due to the continuous increase in vehicles and industrialization. Traffic noise causes deafness, annoyance and other health challenges. The World Health Organization recommends daytime sound levels of 60 dB and night-time sound levels of 40 dB in hospitals, schools and other residential areas. Measurements of traffic noise were taken at six different locations in the selected healthcare facilities at Birnin Kebbi (Sir Yahaya Memorial Hospital and the Federal Medical Centre, Birnin Kebbi). The data were collected in the vicinity of the hospitals using the slow setting of the device, pointed at the noise sources. An integrated multifunctional sound level meter (GM1352 model, serial no. KK2821163) was used to measure the emitted noise and temperatures. The data were measured and recorded at three different periods of the day: 8 am–12 pm, 3 pm–6 pm and 6 pm–8:30 pm. The results show that fair traffic flow, producing average sound levels in the order of 38 dB–64 dB, was recorded at the GOPD, amenity and ante-natal wards of the Federal Medical Centre. Similarly, high traffic noise in the order of 52 dB–78 dB, an unsatisfactory level for human hearing, was observed at the GOPD, amenity and Fati-Lami wards of Sir Yahaya Memorial Hospital.
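
One practical detail behind "average sound level" figures like those above: decibel readings cannot be averaged arithmetically, because the scale is logarithmic. The equivalent continuous level Leq energy-averages the underlying sound pressures instead. A minimal sketch, with invented spot readings:

```python
import math

# Leq = 10 * log10( (1/n) * sum(10^(L_i/10)) ): convert each dB reading back
# to relative sound energy, average the energies, then convert back to dB.
def leq(levels_db):
    n = len(levels_db)
    return 10.0 * math.log10(sum(10 ** (L / 10.0) for L in levels_db) / n)

readings = [58.0, 64.0, 72.0, 61.0]     # hypothetical spot readings, dB
print(f"Leq = {leq(readings):.1f} dB")  # higher than the arithmetic mean
```

Because loud moments dominate the energy sum, the Leq of these readings (about 67 dB) sits well above their arithmetic mean (63.75 dB), which is why energy averaging matters when comparing sites against WHO limits.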

Keywords: amenities, healthcare, noise, hospital, traffic

Procedia PDF Downloads 91
32865 Synthesis of Filtering in Stochastic Systems on Continuous-Time Memory Observations in the Presence of Anomalous Noises

Authors: S. Rozhkova, O. Rozhkova, A. Harlova, V. Lasukov

Abstract:

We have carried out the optimal synthesis of a root-mean-square objective filter to estimate the state vector in the case where, within an observation channel with memory, anomalous noises with unknown mathematical expectation are added to the regular noises. The synthesis has been carried out for linear continuous-time stochastic systems.

Keywords: mathematical expectation, filtration, anomalous noise, memory

Procedia PDF Downloads 228
32864 Selection of Suitable Reference Genes for Assessing Endurance Related Traits in a Native Pony Breed of Zanskar at High Altitude

Authors: Prince Vivek, Vijay K. Bharti, Manishi Mukesh, Ankita Sharma, Om Prakash Chaurasia, Bhuvnesh Kumar

Abstract:

High endurance performance in equids requires adaptive changes involving physio-biochemical and molecular responses in an attempt to regain homeostasis. We hypothesized that identifying suitable reference genes might support the assessment of endurance-related traits in ponies at high altitude and help identify individuals with strong endurance traits. A total of 12 pony mares of the Zanskar breed were divided into three groups, group-A (without load), group-B (60 kg backpack load) and group-C (80 kg backpack load), and subjected to a load-carry protocol on a steep 4 km uphill climb over a gravel, uneven rocky track at an altitude of 3292 m to 3500 m (endpoint). Blood was collected before and immediately after the load carry into sodium heparin anticoagulant, and peripheral blood mononuclear cells were separated for total RNA isolation and subsequent cDNA synthesis. Real-time PCR reactions were carried out to evaluate the mRNA expression profiles of a panel of putative internal control genes (ICGs) belonging to different functional classes, namely glyceraldehyde 3-phosphate dehydrogenase (GAPDH), β₂ microglobulin (β₂M), β-actin (ACTB), ribosomal protein S18 (RS18), hypoxanthine-guanine phosphoribosyltransferase (HPRT), ubiquitin B (UBB), ribosomal protein L32 (RPL32), transferrin receptor protein (TFRC) and succinate dehydrogenase complex subunit A (SDHA), for normalizing the real-time quantitative polymerase chain reaction (qPCR) data of the native ponies. Three different algorithms, geNorm, NormFinder and BestKeeper, were used to evaluate the stability of the reference genes. The results showed that GAPDH was the most stable gene, and the best combination of two genes was TFRC and β₂M. In conclusion, the geometric mean of GAPDH, TFRC and β₂M might be used for accurate normalization of transcriptional data for assessing endurance-related traits in Zanskar ponies during load carrying.
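
The concluding normalization step can be sketched as follows, assuming the usual E^ΔCq conversion from qPCR Cq values to relative quantities; the Cq values and target-gene quantity below are invented placeholders:

```python
import math

# Reference-gene normalization: relative quantities of the stable reference
# genes (GAPDH, TFRC, B2M, as the abstract concludes) are combined into a
# normalization factor via their geometric mean, and the target gene's
# relative quantity is divided by that factor.

def relative_quantity(cq, cq_min, efficiency=2.0):
    """Convert a qPCR Cq value to a relative quantity, E^(Cq_min - Cq)."""
    return efficiency ** (cq_min - cq)

def geometric_mean(values):
    values = list(values)
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Hypothetical Cq values for one sample.
ref_cq = {"GAPDH": 18.0, "TFRC": 22.0, "B2M": 20.0}
rq = {g: relative_quantity(cq, min(ref_cq.values())) for g, cq in ref_cq.items()}
norm_factor = geometric_mean(rq.values())

target_rq = 0.50                      # hypothetical target-gene quantity
normalized = target_rq / norm_factor
print(round(norm_factor, 3), round(normalized, 3))  # → 0.25 2.0
```

The geometric mean is preferred over the arithmetic mean here because relative quantities span orders of magnitude, so a single highly expressed reference gene cannot dominate the normalization factor.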

Keywords: endurance exercise, ubiquitin B (UBB), β₂ microglobulin (β₂M), high altitude, Zanskar ponies, reference gene

Procedia PDF Downloads 119
32863 An Introduction to Critical Chain Project Management Methodology

Authors: Ranjini Ramanath, Nanjunda P. Swamy

Abstract:

Construction has existed in our lives since time immemorial. However, unlike any other industry, construction projects have their own unique challenges: project type, purpose and end use of the project, geographical conditions, logistic arrangements, a largely unorganized workforce, the requirement of diverse skill sets, and so on. These unique characteristics bring their own level of risk and uncertainty to a project, causing it to deviate from its planned objectives of time, cost, quality, etc. Over the years, there have been significant developments in the way construction projects are conceptualized, planned and managed. With the rapid increase in population and the rising rate of urbanization, there is a growing demand for infrastructure development, and projects must be delivered on time and efficiently. In an age where ‘Time is Money’, the implementation of new project management techniques is required to lead to successful projects. This paper proposes a different approach to project management which, if applied to construction projects, can help accomplish project objectives faster.

Keywords: critical chain project management methodology, critical chain, project management, construction management

Procedia PDF Downloads 407
32862 Artificial Intelligence in the Design of a Retaining Structure

Authors: Kelvin Lo

Abstract:

Nowadays, numerical modelling in geotechnical engineering is very common but sophisticated. Many advanced input settings and considerable computational effort are required to optimize a design and reduce construction cost. Optimizing a design usually requires huge numerical models; if the optimization is conducted manually, there is a potentially dangerous consequence from human error, and the time spent on input and on extracting data from the output is significant. This paper presents an automation process applied to the numerical modelling (Plaxis 2D) of a trench excavation supported by a secant-pile retaining structure for a top-down tunnel project. Python code is adopted to control the process, and numerical modelling is conducted automatically at every 20 m chainage along the 200 m tunnel, with the maximum retained height occurring at the middle chainage. The Python code changes the geological stratum and excavation depth under groundwater flow conditions in each 20 m section, and automatically conducts trial and error to determine the required pile length and the use of props needed to achieve the required factor of safety and target displacement. Once the bending moment in the pile exceeds its capacity, the pile size is increased. When the pile embedment reaches the default maximum length, the prop system is turned on. Results showed that the approach saves time, increases efficiency, lowers design costs and replaces manual labor, minimizing error.
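
The trial-and-error loop described above can be sketched schematically. The Plaxis 2D scripting API is deliberately not reproduced here; `run_model` is a hypothetical stand-in for rebuilding and solving the model for one chainage, and the target, step and limit values are invented:

```python
# Schematic automation loop: lengthen the pile until the factor of safety is
# met; once embedment hits the maximum, switch on the prop system instead.
# `run_model` is a placeholder for a real Plaxis analysis call.

TARGET_FOS = 1.4
MAX_PILE_LENGTH = 30.0          # hypothetical maximum embedment, m

def run_model(pile_length, props_on):
    """Placeholder analysis: FoS grows with pile length and with props."""
    return 0.04 * pile_length + (0.35 if props_on else 0.0)

def design_section(start_length=10.0, step=2.0):
    length, props_on = start_length, False
    while True:
        fos = run_model(length, props_on)
        if fos >= TARGET_FOS:
            return length, props_on, fos
        if length + step > MAX_PILE_LENGTH:
            if props_on:
                raise RuntimeError("no feasible design in this sketch")
            props_on = True          # embedment maxed out: switch on props
        else:
            length += step

print(design_section())
```

In a real workflow the same loop would also check the target displacement and the pile bending moment against capacity, increasing the section size where needed, exactly as the abstract describes.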

Keywords: automation, numerical modelling, Python, retaining structures

Procedia PDF Downloads 39
32861 A Proposal of Ontology about Brazilian Government Transparency Portal

Authors: Estela Mayra de Moura Vianna, Thiago José Tavares Ávila, Bruno Morais Silva, Diego Henrique Bezerra, Paulo Henrique Gomes Silva, Alan Pedro da Silva

Abstract:

The Brazilian Federal Constitution defines access to information as a crucial right of the citizen, and the Law on Access to Public Information regulates this right. Accordingly, the Fiscal Responsibility Act of 2000, amended in 2009 by the “Law of Transparency”, began demanding a wider disclosure of public accounts to society, including via electronic media for public access. Thus, public entities began to create “Transparency Portals”, which aim to gather a diversity of data and information. However, this information is, in general, still published in formats that do not make the data easy for citizens to understand and that could be made more readily available, especially for audit purposes. In this context, an ontology of the Brazilian Transparency Portal can play a key role in making these data better available. This study aims to identify, and implement as an ontology, the data model of the Transparency Portal ecosystem, with emphasis on activities that use these data for applications such as audits, press activities, social control of government, and others.

Keywords: audit, government transparency, ontology, public sector

Procedia PDF Downloads 486