Search results for: interrupted time series designs
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20081


19631 Fashioning the Self: Femininity and Creativity in Vanity Case Design (1900-1970)

Authors: Sandy Ng

Abstract:

This study explores how women asserted their identities through vanity case designs that increased their visibility and affirmed their sense of self. It traces a history of vanity case design that intersected with women as agents of social change, represented in images expressing the modern lifestyle created in the twentieth century. The discussion emphasizes how consumption transformed women’s appearances and mentality, and whether vanity cases mediated the ways they perceived and presented themselves. Consumption in modern material culture imparted a sense of respectability to the modern woman, while she introduced a modern lifestyle and good taste through the creativity and specific materials of the designs she carried, which redefined her social status. Visual evidence, such as advertisements and photographs, as well as designs including vanity cases and jewelry worn by women, is examined through Pierre Bourdieu’s concept of cultural capital, enhancing our understanding of women’s cultural and social roles in the modern era. Creativity articulated in fashion accessories designed specifically for women was instrumental in asserting their identities, and ornamentation on these designs served vital cultural and social functions in the modern epoch.

Keywords: creativity, design, femininity, modern women, vanity case

Procedia PDF Downloads 61
19630 On Stochastic Models for Fine-Scale Rainfall Based on Doubly Stochastic Poisson Processes

Authors: Nadarajah I. Ramesh

Abstract:

Much of the research on stochastic point process models for rainfall has focused on Poisson cluster models constructed from either the Neyman-Scott or Bartlett-Lewis processes. The doubly stochastic Poisson process provides a rich class of point process models, especially for fine-scale rainfall modelling. This paper provides an account of recent developments on this topic and presents results for some of the fine-scale rainfall models constructed from this class of stochastic point processes. In the literature on stochastic models for rainfall, greater emphasis has been placed on modelling rainfall data recorded at hourly or daily aggregation levels. Stochastic models for sub-hourly rainfall are equally important, as there is a need to reproduce rainfall time series at fine temporal resolutions in some hydrological applications. For example, the study of climate change impacts on hydrology and water management initiatives requires the availability of data at fine temporal resolutions. One approach to generating such rainfall data relies on combining an hourly stochastic rainfall simulator with a disaggregator making use of downscaling techniques. Recent work on this topic took a different approach by developing specialist stochastic point process models for fine-scale rainfall, aimed at generating synthetic precipitation time series directly from the proposed stochastic model. One strand of this approach focused on developing a class of doubly stochastic Poisson process (DSPP) models for fine-scale rainfall to analyse data collected in the form of rainfall bucket-tip time series. In this context, the arrival pattern of rain gauge bucket-tip times N(t) is viewed as a DSPP whose rate of occurrence varies according to an unobserved finite-state irreducible Markov process X(t).
Since the likelihood function of this process can be obtained, by conditioning on the underlying Markov process X(t), the models were fitted with maximum likelihood methods. The proposed models were applied directly to the raw data collected by tipping-bucket rain gauges, thus avoiding the need to convert tip-times to rainfall depths prior to fitting the models. One advantage of this approach was that the use of maximum likelihood methods enables a more straightforward estimation of parameter uncertainty and comparison of sub-models of interest. Another strand of this approach employed the DSPP model for the arrivals of rain cells and attached a pulse or a cluster of pulses to each rain cell. Different mechanisms for the pattern of the pulse process were used to construct variants of this model. We present the results of these models when they were fitted to hourly and sub-hourly rainfall data. The results of our analysis suggest that the proposed class of stochastic models is capable of reproducing the fine-scale structure of the rainfall process, and hence provides a useful tool in hydrological modelling.
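
The latent-Markov structure described above can be sketched in a few lines: a two-state continuous-time Markov chain X(t) switches the Poisson intensity of the tip process N(t). This is a minimal illustrative simulator using the standard exponential-race construction; the states, rates and parameter values are generic MMPP machinery, not the paper's fitted model:

```python
import random

def simulate_mmpp(rates, q, t_end, seed=0):
    """Simulate a two-state Markov-modulated (doubly stochastic) Poisson process.

    rates: Poisson event intensity in each hidden state
    q:     rate of leaving each state of the hidden Markov chain X(t)
    Returns the list of event (bucket-tip) times in [0, t_end]."""
    rng = random.Random(seed)
    t, state, events = 0.0, 0, []
    while t < t_end:
        # Race two exponential clocks: next state switch vs next event
        t_switch = rng.expovariate(q[state])
        t_event = rng.expovariate(rates[state]) if rates[state] > 0 else float("inf")
        if t_event < t_switch:
            t += t_event
            if t < t_end:
                events.append(t)
        else:
            t += t_switch
            state = 1 - state          # flip the hidden state
    return events

# State 0: dry spell (rare spurious tips), state 1: rain cell (frequent tips)
tips = simulate_mmpp(rates=(0.1, 20.0), q=(0.5, 2.0), t_end=100.0)
print(len(tips), "bucket tips simulated")
```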

Keywords: fine-scale rainfall, maximum likelihood, point process, stochastic model

Procedia PDF Downloads 256
19629 Method for Tuning Level Control Loops Based on Internal Model Control and Closed Loop Step Test Data

Authors: Arnaud Nougues

Abstract:

This paper describes a two-stage methodology derived from internal model control (IMC) for tuning a proportional-integral-derivative (PID) controller for levels or other integrating processes in an industrial environment. The focus is on ease of use and speed of implementation, which are critical for an industrial application. Tuning can be done with minimum effort and without the need for time-consuming open-loop step tests on the plant. The first stage of the method applies to levels only: the vessel residence time is calculated from equipment dimensions and used to derive a set of preliminary proportional-integral (PI) settings with IMC. The second stage, re-tuning in closed loop, applies to levels as well as other integrating processes: a tuning correction mechanism has been developed based on a series of closed-loop simulations with model errors. The tuning correction uses a simple closed-loop step test and a generic correlation between observed overshoot and integral-time correction. A spin-off of the method is that an estimate of the vessel residence time (for levels) or the open-loop process gain (for other integrating processes) is obtained from the closed-loop data.
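
The first stage can be sketched as follows, assuming a vertical cylindrical vessel and a generic IMC/SIMC-style rule for integrating processes (Kc = 1/(k'·λ), Ti = 4λ, with k' the integrating-process gain and λ the closed-loop time constant). The paper's actual correlations, including the closed-loop overshoot correction of the second stage, are not reproduced here:

```python
import math

def level_pi_tuning(diameter_m, span_m, max_flow_m3s, lam_s):
    """Preliminary PI settings for a level loop from equipment dimensions.

    The vessel residence time over the measured level span gives the
    integrating-process gain k' = 1/tau_res; an IMC/SIMC-style rule then
    gives Kc and Ti.  Illustrative only."""
    area = math.pi * diameter_m ** 2 / 4.0       # vessel cross-section, m^2
    tau_res = area * span_m / max_flow_m3s       # residence time over span, s
    kp = 1.0 / tau_res                           # integrating-process gain, 1/s
    kc = 1.0 / (kp * lam_s)                      # proportional gain
    ti = 4.0 * lam_s                             # integral time, s
    return kc, ti

# Hypothetical 2 m diameter drum, 1.5 m level span, 0.05 m^3/s max outflow
kc, ti = level_pi_tuning(diameter_m=2.0, span_m=1.5, max_flow_m3s=0.05, lam_s=60.0)
print(f"Kc = {kc:.2f}, Ti = {ti:.0f} s")
```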

Keywords: closed-loop model identification, IMC-PID tuning method, integrating process control, on-line PID tuning adaptation

Procedia PDF Downloads 197
19628 Using Variation Theory in a Design-Based Approach to Improve Learning Outcomes of Teachers' Use of Video and Live Experiments in Swedish Upper Secondary School

Authors: Andreas Johansson

Abstract:

Conceptual understanding needs to be grounded in observation of physical phenomena, experiences or metaphors. Observation of physical phenomena using demonstration experiments has a long tradition within physics education, and students need to develop mental models to relate the observations to concepts from scientific theories. This study investigates how live and video experiments involving an acoustic trap to visualize particle-field interaction, field properties and particle properties can help develop students' mental models, and how the two formats can be used differently to realize their potential as teaching tools. Initially, they were treated as analogs and the lesson designs were kept identical. With a design-based approach, the experiment and video designs, as well as best practices for each teaching tool, were then developed in iterations. Variation theory was used as a theoretical framework to analyze the planned and realized patterns of variation and invariance, in order to explain learning outcomes as measured by a pre-/post-test consisting of conceptual multiple-choice questions inspired by the Force Concept Inventory and the Force and Motion Conceptual Evaluation. Interviews with students and teachers informed the design of the experiments and videos in each iteration. The lesson designs and the live and video experiments have been developed to help teachers improve student learning and make school physics more interesting, by involving experimental setups that are usually out of reach and by bridging the gap between what happens in classrooms and in science research. Since students’ conceptual knowledge also raises their interest in physics, the aim is to increase their chances of pursuing careers within science, technology, engineering or mathematics.

Keywords: acoustic trap, design-based research, experiments, variation theory

Procedia PDF Downloads 64
19627 Assessment of Climate Change Impacts on the Hydrology of Upper Guder Catchment, Upper Blue Nile

Authors: Fikru Fentaw Abera

Abstract:

Climate change alters regional hydrologic conditions and results in a variety of impacts on water resource systems. Such hydrologic changes will affect almost every aspect of human well-being. The goal of this paper is to assess the impact of climate change on the hydrology of the Upper Guder catchment, located in the northwest of Ethiopia. GCM-derived scenarios (HadCM3 A2a and B2a SRES emission scenarios) were used for the climate projection. The statistical downscaling model (SDSM) was used to generate possible future local meteorological variables in the study area. The downscaled data were then used as input to the Soil and Water Assessment Tool (SWAT) model to simulate the corresponding future streamflow regime in the Upper Guder catchment of the Abay River Basin. A semi-distributed hydrological model, SWAT, was developed, and Generalized Likelihood Uncertainty Estimation (GLUE) was utilized for uncertainty analysis; GLUE is linked with SWAT in the Calibration and Uncertainty Program known as SWAT-CUP. Three benchmark periods were simulated for this study: the 2020s, 2050s and 2080s. The time series generated by the HadCM3 GCM and SDSM indicate a significant increasing trend in maximum and minimum temperature and a slight increasing trend in precipitation, for both the A2a and B2a emission scenarios, at both the Gedo and Tikur Inch stations, for all three benchmark periods. The hydrologic impact analysis, with the downscaled temperature and precipitation time series as input to the hydrological model SWAT, was carried out for both the A2a and B2a emission scenarios. The model output shows that annual flow volume may increase by up to 35% under both emission scenarios in all three benchmark periods, and all seasons show an increase in flow volume for both scenarios and all time horizons. Potential evapotranspiration in the catchment will also increase, on average by 3-15% for the 2020s and by 7-25% for the 2050s and 2080s, under both the A2a and B2a emission scenarios.
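
The way scenario data perturb a baseline series before driving a hydrological model can be illustrated with a simple delta-change sketch. Note that this is only a stand-in for the workflow: SDSM itself is a regression-based downscaling tool, not a delta-change method, and the deltas below are invented:

```python
def delta_change(baseline, temp_delta_c=0.0, precip_factor=1.0):
    """Perturb baseline daily (temperature, precipitation) series with an
    additive temperature delta and a multiplicative precipitation factor.
    Purely illustrative; the actual study used SDSM-downscaled series."""
    temps, precs = baseline
    return ([t + temp_delta_c for t in temps],
            [p * precip_factor for p in precs])

# Invented 3-day baseline at a hypothetical station (deg C, mm/day)
base = ([18.0, 20.0, 22.0], [0.0, 5.0, 12.0])
ft, fp = delta_change(base, temp_delta_c=2.1, precip_factor=1.05)
print("future temps:", ft)
print("future precip:", fp)
```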

Keywords: climate change, Guder sub-basin, GCM, SDSM, SWAT, SWAT-CUP, GLUE

Procedia PDF Downloads 337
19626 Landing Performance Improvement Using Genetic Algorithm for Electric Vertical Take-Off and Landing Aircraft

Authors: Willian C. De Brito, Hernan D. C. Munoz, Erlan V. C. Carvalho, Helder L. C. De Oliveira

Abstract:

In order to improve commute times for short trips and relieve traffic in large cities, a new transport category has been the subject of research and new designs worldwide. The air-taxi market promises to change the way people live and commute, using vehicles able to take off and land vertically and to provide passenger transport equivalent to a car, with mobility within and between large cities. Today’s civil air transport remains costly and accounts for 2% of man-made CO₂ emissions. Taking advantage of this scenario, many companies have developed their own vertical take-off and landing (VTOL) designs, seeking to meet comfort, safety, low-cost and flight-time requirements in a sustainable way. Thus, green power supplies, especially batteries, and fully electric power plants are the most common choice for these emerging aircraft. However, it is still a challenge to find a feasible way to handle the use of batteries rather than conventional petroleum-based fuels: batteries are heavy and their energy density is still below that of gasoline, diesel or kerosene. Therefore, despite all the clear advantages, all-electric aircraft (AEA) still have low flight autonomy and high operational cost, since the batteries must be recharged or replaced. In this sense, this paper addresses a way to optimize the energy consumption in a typical mission of an air-taxi aircraft. The approach and landing procedure was chosen as the subject of a genetic-algorithm optimization, and the final program can be adapted for take-off and flight-level changes as well. Data from a real tilt-rotor aircraft with a fully electric power plant were used to fit the derived dynamic equations of motion. Although a tilt-rotor design is used as a proof of concept, the optimization can be adapted to other design concepts, even those with independent motors for the hover and cruise flight phases. For a given trajectory, the best set of control variables is calculated to provide the time-history response for the aircraft’s attitude, rotor RPM and thrust direction (or vertical and horizontal thrust, for independent-motor designs) that, if followed, results in the minimum electric power consumption along that landing path. Safety, comfort and design constraints are imposed to make the solution representative, and results are highly dependent on these constraints. For the tested cases, performance improvement ranged from 5 to 10% when changing initial airspeed, altitude, flight-path angle and attitude.
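
A toy version of the optimization step: a genetic algorithm searches a descent-rate profile minimizing a surrogate energy cost with comfort (rate-change) and safety (touchdown-speed) penalties. All numbers and the cost model are invented for illustration; the paper's dynamic equations of motion and constraint set are not reproduced:

```python
import random

def landing_energy(profile, dt=1.0):
    """Toy surrogate for electric energy used on approach: hover power,
    reduced somewhat during descent, plus penalties for aggressive
    descent-rate changes (comfort) and a hard touchdown (safety).
    `profile` is a list of descent rates in m/s.  Illustrative only."""
    hover_kw = 50.0
    energy = sum(hover_kw * (1.0 - 0.3 * min(v, 3.0) / 3.0) * dt for v in profile)
    jerk = sum(abs(a - b) for a, b in zip(profile, profile[1:]))
    touchdown_penalty = 100.0 if profile[-1] > 0.5 else 0.0
    return energy + 5.0 * jerk + touchdown_penalty

def genetic_minimize(fitness, n_genes=20, pop=40, gens=60, seed=1):
    """Elitist GA with one-point crossover and clipped Gaussian mutation."""
    rng = random.Random(seed)
    population = [[rng.uniform(0.0, 3.0) for _ in range(n_genes)]
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness)
        parents = population[: pop // 2]          # keep the best half
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_genes)
            child = a[:cut] + b[cut:]             # one-point crossover
            if rng.random() < 0.3:                # mutate one gene
                i = rng.randrange(n_genes)
                child[i] = min(3.0, max(0.0, child[i] + rng.gauss(0.0, 0.3)))
            children.append(child)
        population = parents + children
    return min(population, key=fitness)

best = genetic_minimize(landing_energy)
print("touchdown descent rate:", round(best[-1], 2), "m/s")
```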

Keywords: air taxi travel, all electric aircraft, batteries, energy consumption, genetic algorithm, landing performance, optimization, performance improvement, tilt rotor, VTOL design

Procedia PDF Downloads 98
19625 Memory, Self, and Time: A Bachelardian Perspective

Authors: Michael Granado

Abstract:

The French philosopher Gaston Bachelard’s philosophy of time is articulated in his two works on the subject, The Intuition of the Instant (1932) and The Dialectic of Duration (1936). Both works present a systematic methodology predicated on the assumption that our understanding of time has radically changed as a result of Einstein and consequently needs to be reimagined. Bachelard makes a major distinction in his discussion of time: 1. time as it is (physical time); 2. time as we experience it (phenomenological time). This paper will focus on the second, phenomenological time, and explore the connections between Bachelard’s work and contemporary psychology. Several aspects of Bachelard’s philosophy of time nicely complement our current understanding of memory and self, and clarify how the self relates to experienced time. Two points in particular stand out: the first is the relative nature of subjective time, and the second is the implications of subjective time for the formation of the narrative self. Bachelard introduces two philosophical concepts to explain these points: rhythmanalysis and reverie. By exploring these concepts, it will become apparent that there is an undeniable link between memory, self, and time. Through the use of the narrative self, the individual connects and links memories and time together to form a sense of personal identity.

Keywords: Gaston Bachelard, memory, self, time

Procedia PDF Downloads 142
19624 Arc Interruption Design for DC High Current/Low SC Fuses via Simulation

Authors: Ali Kadivar, Kaveh Niayesh

Abstract:

This report summarizes a simulation-based approach to estimating the current-interruption behavior of a fuse element used in a DC network protecting battery banks under different stresses. Because of the internal resistance of the batteries, the short-circuit current is very close to the nominal current, which makes the fuse design tricky. The base configuration considered in this report consists of five fuse units in parallel. The simulations are performed using a multi-physics software package, COMSOL® 5.6, and the necessary material parameters were calculated using two other software packages. The first phase of the simulation starts with the heating of the fuse elements resulting from the current flow through the fusing element. In this phase, heat transfer between the metallic strip and the adjacent materials melts and evaporates the filler and housing before the aluminium strip itself evaporates; the current flow in the evaporated strip is then cut off, or an arc is eventually initiated. The initiated arc expands until the entire metallic strip is ablated, and a long arc of around 20 mm is created within the first 3 milliseconds after arc initiation (v_elongation = 6.6 m/s). The final stage of the simulation concerns the arc and its interaction with the external circuitry. Because of the strong ablation of the filler material and the venting of the arc caused by melting and evaporation of the filler and housing before arc initiation, the arc is assumed to burn in almost pure ablated material. To model this arc precisely, one further step was necessary: deriving the transport coefficients of the plasma in ablated urethane. The results indicate that current interruption in this case will not be achieved within the first tens of milliseconds; in a further study considering two series elements, the arc was interrupted within a few milliseconds. A very important aspect in this context is the potential impact of the many broken strips parallel to the one in which the arc occurs. The generated arcing voltage is also applied to the other broken strips connected in parallel with the arcing path. As the gaps in the other strips are very small, the voltage of a few hundred volts generated during current interruption may eventually lead to the breakdown of another gap. Since two arcs in parallel are not stable, one of the arcs will extinguish, and the total current will again be carried by a single arc. This process may repeat several times if the generated voltage is very large, and the ultimate result is that current interruption may be delayed.
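
Why a short-circuit current close to the nominal current makes the design tricky can be seen from a lumped adiabatic estimate of the pre-arcing (melting) time, ∫i² dt ≈ K·A². The Meyer-type constant used below for aluminium is an assumed round value, and real notched strips with heat loss to the filler deviate substantially, so this is a back-of-the-envelope sketch only:

```python
def prearc_time(area_mm2, current_a, meyer_k=3.0e4):
    """Adiabatic estimate of the pre-arcing time of a fuse strip.

    Uses the Meyer relation: integral of i^2 dt = K * A^2, so for a
    constant current t = K * A^2 / I^2.  meyer_k ~ 3.0e4 A^2*s/mm^4 is an
    assumed aluminium value; real values depend on alloy and geometry."""
    return meyer_k * area_mm2 ** 2 / current_a ** 2

# Five parallel strips of 0.5 mm^2 each; compare a short-circuit current
# barely above a hypothetical 500 A nominal with much harder faults:
strip_area = 0.5
for i_total in (600.0, 1500.0, 5000.0):
    i_strip = i_total / 5.0                      # current per parallel strip
    print(f"I = {i_total:6.0f} A -> t_melt ~ {prearc_time(strip_area, i_strip):.3f} s")
```

The near-nominal fault melts the strip orders of magnitude more slowly, which is exactly the regime the paper's battery-bank fuses must handle.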

Keywords: DC network, high current / low SC fuses, FEM simulation, parallel fuses

Procedia PDF Downloads 47
19623 Sea Surface Temperature and Climatic Variables as Drivers of North Pacific Albacore Tuna Thunnus alalunga Time Series

Authors: Ashneel Ajay Singh, Naoki Suzuki, Kazumi Sakuramoto, Swastika Roshni, Paras Nath, Alok Kalla

Abstract:

Albacore tuna (Thunnus alalunga) is one of the commercially important species of tuna in the North Pacific region. Despite the long history of albacore fisheries in the Pacific, its ecological characteristics are not sufficiently understood. The effects of changing climate on numerous commercially and ecologically important fish species, including albacore tuna, have been documented over the past decades. The objective of this study was to explore and elucidate the relationship of environmental variables with the stock parameters of albacore tuna. The relationship of North Pacific albacore tuna recruitment (R), spawning stock biomass (SSB) and recruits per spawning biomass (RPS) from 1970 to 2012 with the environmental factors of sea surface temperature (SST), the Pacific decadal oscillation (PDO), the El Niño southern oscillation (ENSO) and the Pacific warm pool index (PWI) was examined. SST and PDO were used as independent variables, together with SSB, to construct stock-reproduction models for R and RPS, as they showed the most significant relationships with the dependent variables; ENSO and PWI were excluded because of collinearity with SST and PDO. Model selection was based on R² values, the Akaike Information Criterion (AIC) and significant parameter estimates at p < 0.05. Models with single independent variables of SST, PDO, ENSO and PWI were also constructed to illuminate their individual effects on albacore R and RPS. The results show that SST and PDO yield the most significant models for reproducing the North Pacific albacore tuna R and RPS time series, and that SST has the highest impact on albacore R and RPS when comparing models with single environmental variables. It is important for fishery managers and decision makers to incorporate these findings into their albacore tuna management plans for the North Pacific oceanic region.
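
The model-selection step can be sketched with single-covariate regressions compared by AIC. The series below are synthetic stand-ins for the 1970-2012 data (the real R, SSB and environmental values are not reproduced), with recruitment deliberately driven by the SST surrogate so that AIC picks the right model:

```python
import math
import random

def aic_linear(x, y):
    """Fit y = a + b*x by ordinary least squares and return the AIC,
    n*ln(RSS/n) + 2k, with k = 2 estimated parameters."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    rss = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    return n * math.log(rss / n) + 2 * 2

# Synthetic 43-year series: recruitment responds to SST, not to PDO
rng = random.Random(42)
sst = [17.0 + rng.gauss(0.0, 0.5) for _ in range(43)]
pdo = [rng.gauss(0.0, 1.0) for _ in range(43)]
recruits = [200.0 - 30.0 * (t - 17.0) + rng.gauss(0.0, 5.0) for t in sst]

candidates = {"R ~ SST": aic_linear(sst, recruits),
              "R ~ PDO": aic_linear(pdo, recruits)}
best = min(candidates, key=candidates.get)      # lowest AIC wins
print(best, "selected by AIC")
```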

Keywords: Albacore tuna, El Niño southern oscillation, Pacific decadal oscillation, sea surface temperature

Procedia PDF Downloads 210
19622 Poultry in Motion: Text Mining Social Media Data for Avian Influenza Surveillance in the UK

Authors: Samuel Munaf, Kevin Swingler, Franz Brülisauer, Anthony O’Hare, George Gunn, Aaron Reeves

Abstract:

Background: Avian influenza, more commonly known as bird flu, is a viral zoonotic respiratory disease found in various species of birds, including poultry, pet and migratory birds. Researchers have argued that the accessibility of health information online, in addition to the low-cost data collection methods the internet provides, has revolutionized the ways in which epidemiological and disease-surveillance data are utilized. This paper examines the feasibility of using internet data sources, such as Twitter and livestock forums, for the early detection of avian flu outbreaks, through the use of text mining algorithms and social network analysis. Methods: Social media mining was conducted on Twitter for the period 01/01/2021 to 31/12/2021 via the Twitter API in Python. The results were filtered first by hashtags (#avianflu, #birdflu) and word occurrences (avian flu, bird flu, H5N1), and then refined further by location to include only results from within the UK. The text was analysed in a time-series manner to determine keyword frequencies, and topic modeling was used to uncover insights in the text prior to a confirmed outbreak. Further analysis examined clinical signs (e.g., swollen head, blue comb, dullness) within the time series prior to the avian flu outbreak confirmed by the Animal and Plant Health Agency (APHA). Results: Increased Google search volumes and avian flu-related tweets correlated in time with the confirmed cases. Topic modeling uncovered clusters of word occurrences relating to livestock biosecurity, disposal of dead birds, and prevention measures. Conclusions: Text mining social media data can be useful for analysing discussed topics for epidemiological surveillance purposes, especially given the lack of applied research in the veterinary domain. However, the small sample size of tweets in certain weekly periods, together with a great amount of textual noise in the data, makes it difficult to obtain statistically robust results.
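
The keyword-frequency step can be sketched as follows. The matching rule (a lowercase substring test) and the sample tweets are invented stand-ins for the Twitter-API filtering described above:

```python
from collections import Counter
from datetime import date

KEYWORDS = ("avian flu", "bird flu", "h5n1", "#avianflu", "#birdflu")

def weekly_keyword_counts(tweets):
    """Bucket keyword-matching tweets into (ISO year, ISO week) bins.

    `tweets` is an iterable of (date, text) pairs; matching is a plain
    lowercase substring test, a stand-in for the real hashtag/keyword
    filters applied through the Twitter API."""
    counts = Counter()
    for day, text in tweets:
        low = text.lower()
        if any(k in low for k in KEYWORDS):
            counts[day.isocalendar()[:2]] += 1
    return counts

# Invented sample around a hypothetical November 2021 outbreak
sample = [
    (date(2021, 11, 1), "Swollen head and dullness in the flock #birdflu"),
    (date(2021, 11, 2), "Nice weather today"),
    (date(2021, 11, 9), "H5N1 confirmed nearby, disposing of dead birds"),
    (date(2021, 11, 10), "Bird flu housing order announced"),
]
counts = weekly_keyword_counts(sample)
print(counts)
```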

Keywords: veterinary epidemiology, disease surveillance, infodemiology, infoveillance, avian influenza, social media

Procedia PDF Downloads 86
19621 Modelling and Simulation of Photovoltaic Cell

Authors: Fouad Berrabeh, Sabir Messalti

Abstract:

The performance of photovoltaic systems depends strongly on conditions such as solar irradiation and temperature. It is therefore important to provide detailed studies for different cases in order to supply power continuously, which requires the photovoltaic system to be properly sized. This paper presents the modelling and simulation of a photovoltaic cell using the single-diode model. I-V and P-V characteristics are presented and verified under different conditions (irradiance effect, temperature effect, series-resistance effect).
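
A minimal single-diode sketch (shunt branch neglected for brevity): the implicit equation I = Iph − I0·(exp((V + I·Rs)/(n·Ns·Vt)) − 1) is solved by fixed-point iteration and swept over voltage to trace an I-V curve. The parameter values are illustrative, roughly module-scale with 72 cells in series, not the fitted BP SX 150 constants:

```python
import math

def pv_current(v, i_ph=4.75, i_0=1e-7, n=1.3, r_s=0.01, t_c=25.0, n_cells=72):
    """Terminal current of a single-diode PV module model at voltage v.

    Solves I = Iph - I0*(exp((V + I*Rs)/(n*Ns*Vt)) - 1) by simple
    fixed-point (Picard) iteration.  Parameters are illustrative."""
    vt = 1.380649e-23 * (t_c + 273.15) / 1.602176634e-19   # thermal voltage, V
    i = i_ph                                               # start from Isc guess
    for _ in range(200):
        i = i_ph - i_0 * (math.exp((v + i * r_s) / (n * n_cells * vt)) - 1.0)
    return i

# Sweep: current stays near Isc at low voltage and collapses near Voc
for v in (0.0, 20.0, 35.0, 42.0):
    print(f"V = {v:5.1f} V -> I = {pv_current(v):6.3f} A")
```

Sweeping i_ph proportionally with irradiance and recomputing Vt with temperature reproduces the irradiance and temperature effects discussed in the abstract.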

Keywords: photovoltaic cell, BP SX 150 BP solar photovoltaic module, irradiance effect, temperature effect, series resistance effect, I–V characteristics, P–V characteristics

Procedia PDF Downloads 459
19620 A Series Solution of Fuzzy Integro-Differential Equation

Authors: Maryam Mosleh, Mahmood Otadi

Abstract:

Hybrid differential equations have a wide range of applications in science and engineering. In this paper, the homotopy analysis method (HAM) is applied to obtain a series solution of such equations. Using the homotopy analysis method, it is possible to find the exact solution or an approximate solution of the problem. Comparisons are made between an improved predictor-corrector method, the homotopy analysis method and the exact solution. Finally, we illustrate the approach with some numerical examples.
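
The HAM recursion can be illustrated on the scalar test problem y' = y, y(0) = 1, with auxiliary linear operator L = d/dt and auxiliary parameter h; with h = −1 the partial sums collapse to the Taylor series of exp(t). This is a generic HAM sketch — the fuzzy (α-cut) bookkeeping that the paper adds on top is omitted:

```python
from math import exp

def ham_series_exp(h=-1.0, order=8):
    """Homotopy-analysis series for y' = y, y(0) = 1.

    Deformation recursion: y_m = chi_m * y_{m-1} + h * L^{-1}(y'_{m-1} - y_{m-1}),
    with chi_1 = 0, chi_m = 1 for m >= 2.  Polynomials in t are stored as
    coefficient lists [c0, c1, ...].  Returns the partial-sum coefficients."""
    def deriv(p):
        return [i * c for i, c in enumerate(p)][1:] or [0.0]
    def integ(p):                                  # integral from 0 to t
        return [0.0] + [c / (i + 1) for i, c in enumerate(p)]
    def add(p, q):
        n = max(len(p), len(q))
        p = p + [0.0] * (n - len(p))
        q = q + [0.0] * (n - len(q))
        return [a + b for a, b in zip(p, q)]
    terms = [[1.0]]                                # y_0 from the initial condition
    for m in range(1, order + 1):
        prev = terms[-1]
        residual = add(deriv(prev), [-c for c in prev])   # y'_{m-1} - y_{m-1}
        correction = [h * c for c in integ(residual)]     # h * L^{-1}(residual)
        base = prev if m >= 2 else [0.0]                  # chi_m * y_{m-1}
        terms.append(add(base, correction))
    series = [0.0]
    for p in terms:
        series = add(series, p)
    return series

coeffs = ham_series_exp()
approx = sum(c * 0.5 ** i for i, c in enumerate(coeffs))
print(f"HAM partial sum at t=0.5: {approx:.7f} vs exp(0.5) = {exp(0.5):.7f}")
```

Varying h away from −1 changes the convergence region, which is the usual HAM lever for problems where the plain series diverges.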

Keywords: fuzzy number, parametric form of a fuzzy number, fuzzy integro-differential equation, homotopy analysis method

Procedia PDF Downloads 532
19619 Mirrors and Lenses: Multiple Views on Recognition in Holocaust Literature

Authors: Kirsten A. Bartels

Abstract:

There are a number of similarities between survivor literature and Holocaust fiction for children and young adults. This paper explores three facets of the parallels of recognition found between Livia Bitton-Jackson’s memoir of her experience during the Holocaust as an inmate in Auschwitz, I Have Lived a Thousand Years (1999), and Morris Gleitzman’s series of Holocaust fiction. While Bitton-Jackson reflects on her past and Gleitzman designs a fictive character, both are judicious with what they are willing to impart, providing information about their appearance or themselves only when it impacts others or serves a necessary purpose in the story. Another similarity lies in a critical aspect of many works of Holocaust literature: the idea of being ‘representatively Jewish’. The authors come to this idea from different angles, perhaps best explained as the difference between showing and telling, for Bitton-Jackson provides personal details, while Gleitzman arguably constructed Felix with this idea in mind. Interwoven through their journeys is a shift in perspectives on being recognized, from wanting to be seen as individuals to being seen as Jews. With this, being Jewish takes on a different meaning: both youths struggle with being labeled as something they do not truly understand, and may not have truly identified with, as the label becomes a death warrant. With survivor literature viewed as the most credible and worthwhile type of Holocaust literature, and Holocaust fiction often seen as the least (with children’s and young-adult fiction as the lowest form), the similarities in approaches to telling these stories may be overlooked or undervalued. This paper serves as an exploration of some of the parallel messages shared between the two.

Keywords: Holocaust fiction, Holocaust literature, representatively Jewish, survivor literature

Procedia PDF Downloads 136
19618 Evaluation of Automated Analyzers of Polycyclic Aromatic Hydrocarbons and Black Carbon in a Coke Oven Plant by Comparison with Analytical Methods

Authors: L. Angiuli, L. Trizio, R. Giua, A. Digilio, M. Tutino, P. Dambruoso, F. Mazzone, C. M. Placentino

Abstract:

In the winter of 2014, a series of measurements was performed to evaluate the behavior of real-time PAH and black carbon analyzers in a coke oven plant located in Taranto, a city in Southern Italy. Data were collected both inside and outside the plant, at air quality monitoring sites, with simultaneous measurements of PM2.5 and PM1. Particle-bound PAHs were measured by two methods: (1) aerosol photoionization using an EcoChem PAS 2000 analyzer; (2) collection on PM2.5 and PM1 quartz filters and analysis by gas chromatography/mass spectrometry (GC/MS). Black carbon was determined both in real time by a Magee Aethalometer AE22 analyzer and by a semi-continuous Sunset Lab EC/OC instrument. Detected PM2.5 and PM1 levels were higher inside than outside the plant, while real-time PAH values were higher outside than inside. As regards PAHs, inside the plant the EcoChem PAS 2000 gave concentrations not significantly different from those determined on the filters during low-pollution days, but at increasing concentrations the automated instrument underestimated PAH levels. At the external site, EcoChem PAS 2000 real-time concentrations were consistently higher than those on the filters. Similarly, real-time black carbon values were consistently lower than the EC concentrations obtained by the Sunset EC/OC instrument at the inner site, while outside the plant real-time values were comparable to the Sunset EC values. The results show that, in a coke plant, real-time PAH and black carbon analyzers in the factory configuration provide only qualitative information, without accuracy, leading to underestimation of the concentrations. A site-specific calibration is needed for these instruments before their installation at highly polluted sites.
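
The recommended site-specific calibration amounts to a linear regression of reference (filter/GC-MS) values on real-time analyzer readings. A sketch with invented numbers that mimic the underestimation observed at high concentrations:

```python
def calibrate(raw, reference):
    """Least-squares linear calibration: reference ~ a + b * raw.

    `raw` are real-time analyzer readings, `reference` the co-located
    filter-based values.  Returns (intercept, slope)."""
    n = len(raw)
    mx, my = sum(raw) / n, sum(reference) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(raw, reference))
         / sum((x - mx) ** 2 for x in raw))
    a = my - b * mx
    return a, b

# Invented co-located pairs (ng/m3): the analyzer reads roughly half of
# the filter value on high-pollution days, as seen inside the plant
pas = [10.0, 20.0, 40.0, 80.0]
gcms = [12.0, 25.0, 78.0, 161.0]
a, b = calibrate(pas, gcms)
corrected = [a + b * x for x in pas]
print(f"reference ~ {a:.1f} + {b:.2f} x raw")
```

A slope well above 1 quantifies the underestimation; applying the fitted line to subsequent real-time readings yields corrected concentrations for that site only.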

Keywords: black carbon, coke oven plant, PAH, PAS, aethalometer

Procedia PDF Downloads 321
19617 Good Governance Complementary to Corruption Abatement: A Cross-Country Analysis

Authors: Kamal Ray, Tapati Bhattacharya

Abstract:

Private use of public office for private gain could serve as a tentative definition of corruption, and the most distasteful aspect of corruption is not that it exists, nor that it is pervasive, but that it is socially acknowledged in the global economy, especially in developing nations. We assess the interrelationship between the Corruption Perception Index (CPI) and the principal World Bank governance indicators: control of corruption (CC), rule of law (RL), regulatory quality (RQ) and government effectiveness (GE). Our empirical investigation concentrates on the degree to which the governance indicators are reflected in the CPI, in order to single out the most powerful corruption-generating indicator in the selected countries. We collected time series data on the governance indicators CC, RL, RQ and GE for eleven countries, from 1996 to 2012, from the World Bank data set. The countries are the USA, UK, France, Germany, Greece, China, India, Japan, Thailand, Brazil and South Africa. The Corruption Perception Index (CPI) of these countries for the period 1996 to 2012 was also collected. Simple line diagrams of the CPI time series give a quick view of the relative positions of the different nations' trend lines. The correlation coefficient suffices for a primary assessment of the degree and direction of association between the variables, given the numerical data on the governance indicators of the selected countries. The Granger causality test (1969) is used to investigate causal relationships between the variables (cause and effect, so to speak). We do not need a stationarity test, as the time series are short. Linear regression is used to quantify the change in the explained variable due to a change in the explanatory variable, with respect to governance vis-à-vis corruption. A bilateral positive causal link between CPI and CC is noticed in the UK: the CC index value increases by 1.59 units as the CPI increases by one unit, and the CPI rises by 0.39 units as CC rises by one unit; hence there is a multiplier effect so far as the reduction of corruption in the UK is concerned. GE contributes strongly to the reduction of corruption in the UK. In France, RQ is observed to be the most powerful indicator in reducing corruption, whereas it is the second most powerful indicator, after GE, in Japan; GE plays an important role in pushing down corruption in Japan. In China and India, GE is a proactive and influential indicator for curbing corruption. The inverse relationship between RL and CPI in Thailand indicates that the ongoing machinery related to RL is not complementary to the reduction of corruption. The state machinery of CC in South Africa is highly relevant to reducing the volume of corruption. In Greece, variations of the CPI positively influence variations of CC, and the indicator GE is effective in controlling corruption as reflected by the CPI. All the governance indicators selected here have failed to arrest state-level corruption in the USA, Germany and Brazil.
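
A lag-1 version of the Granger test used above can be sketched in pure Python: compare the restricted autoregression of y on its own lag against the unrestricted regression that adds the lag of x, via an F statistic. One lag suits the short 1996-2012 series, and no stationarity pre-test is run, as in the paper. The series below are synthetic; the actual CPI and governance data are not reproduced:

```python
import random

def _ols_rss(rows, y):
    """Residual sum of squares of OLS with intercept (tiny models only).

    rows = list of predictor tuples; solved via normal equations with
    partially pivoted Gaussian elimination."""
    X = [(1.0,) + tuple(r) for r in rows]
    k = len(X[0])
    A = [[sum(x[i] * x[j] for x in X) for j in range(k)] for i in range(k)]
    bvec = [sum(x[i] * yi for x, yi in zip(X, y)) for i in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        bvec[col], bvec[piv] = bvec[piv], bvec[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            bvec[r] -= f * bvec[col]
    beta = [0.0] * k
    for i in reversed(range(k)):
        beta[i] = (bvec[i] - sum(A[i][j] * beta[j] for j in range(i + 1, k))) / A[i][i]
    return sum((yi - sum(b * xi for b, xi in zip(beta, x))) ** 2
               for x, yi in zip(X, y))

def granger_f(x, y):
    """F statistic for 'x Granger-causes y' with a single lag:
    restricted   y_t ~ y_{t-1}
    unrestricted y_t ~ y_{t-1} + x_{t-1}"""
    yt, ylag, xlag = y[1:], y[:-1], x[:-1]
    rss_r = _ols_rss([(a,) for a in ylag], yt)
    rss_u = _ols_rss(list(zip(ylag, xlag)), yt)
    n, q, k_u = len(yt), 1, 3
    return (rss_r - rss_u) / q / (rss_u / (n - k_u))

# Synthetic 17-year series in which CC genuinely leads CPI
rng = random.Random(7)
cc = [rng.gauss(0.0, 1.0) for _ in range(17)]
cpi = [0.0] + [0.9 * cc[t - 1] + rng.gauss(0.0, 0.2) for t in range(1, 17)]
f_fwd = granger_f(cc, cpi)   # CC -> CPI: should be large
f_rev = granger_f(cpi, cc)   # CPI -> CC: should be small
print(f"F(CC -> CPI) = {f_fwd:.1f},  F(CPI -> CC) = {f_rev:.1f}")
```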

Keywords: corruption perception index, governance indicators, granger causality test, regression

Procedia PDF Downloads 289
19616 Stroke Rehabilitation via Electroencephalogram Sensors and an Articulated Robot

Authors: Winncy Du, Jeremy Nguyen, Harpinder Dhillon, Reinardus Justin Halim, Clayton Haske, Trent Hughes, Marissa Ortiz, Rozy Saini

Abstract:

Stroke often causes death or cerebrovascular (CV) brain damage. Most patients with CV brain damage lose motor control of their limbs. This paper focuses on developing a reliable, safe, and non-invasive EEG-based robot-assisted stroke rehabilitation system to help stroke survivors rapidly restore the motor control functions of their limbs. An electroencephalogram (EEG) recording device (EPOC Headset) was used to detect a patient’s brain activity. The EEG signals were then processed, classified, and interpreted as motion intentions, and then converted to a series of robot motion commands. A six-axis articulated robot (AdeptSix 300) was employed to provide the intended motions based on these commands. To ensure that the EEG device, the computer, and the robot can communicate with each other, an Arduino microcontroller is used to physically translate the program’s commands into the status (HIGH or LOW) of a series of output pins. These “hardware” commands were then sent to a 24 V relay to trigger the robot’s motion. A lookup table of the various motion intentions and the associated EEG signal patterns was created (through training) and installed in the microcontroller. Thus, the motion intention can be directly determined by comparing the EEG patterns obtained from the patient with the EEG patterns in the lookup table, and the corresponding motion commands are sent to the robot to provide the intended motion without going through feature extraction and interpretation each time (a time-consuming process). For safety’s sake, an extender was designed and attached to the robot’s end effector to ensure the patient remains beyond the robot’s workspace. The gripper is also designed to hold the patient’s limb. The test results of this rehabilitation system show that it can accurately interpret the patient’s motion intention and move the patient’s arm to the intended position.
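The lookup-table matching step described above amounts to nearest-pattern classification. A minimal sketch, assuming a hypothetical table of trained EEG feature vectors (the command names and feature values here are illustrative, not the authors' actual data):

```python
import math

# Hypothetical lookup table: trained EEG feature patterns -> robot motion commands
LOOKUP = {
    "raise_arm": [0.9, 0.1, 0.4, 0.2],
    "lower_arm": [0.1, 0.8, 0.3, 0.6],
    "rotate_in": [0.5, 0.5, 0.9, 0.1],
}

def classify(eeg_features):
    """Return the command whose stored pattern is nearest (Euclidean distance)
    to the measured EEG feature vector - no per-trial feature interpretation."""
    return min(LOOKUP, key=lambda cmd: math.dist(eeg_features, LOOKUP[cmd]))
```

On a microcontroller the same comparison would be done in fixed-point C, with the chosen command mapped to a pattern of HIGH/LOW output pins driving the relay.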

Keywords: brain waves, EEG sensor, motion control, robot-assistant stroke rehabilitation

Procedia PDF Downloads 363
19615 Effect of Different Carbon Fabric Orientations on the Fracture Properties of Carbon Fabric Reinforced Polymer Composites

Authors: S. F. Halim, H. F. Naguib, S. N. Lawandy, R. S. Hegazy, M. N. Baheg

Abstract:

The main drawbacks of traditional carbon fabric reinforced epoxy resin (CFRP) are low failure strain, delamination between composite layers, and low impact resistance due to the brittleness of the epoxy resin. The aim of this study is to enhance the fracture properties of CFRP composite laminates by varying the composite design. A series of composites was fabricated in which bidirectional (0°/90°) carbon fabric (CF) layers were laid inside the resin matrix with orientation codes F1 [(0°, 90°)/(0°, 90°)], F2 [(90°, 0°)/(0°, 90°)], and F3 [(0°, 90°)/(90°, 0°)]. The mechanical and dynamic properties of the composites were estimated. In addition, the morphology of the sample surfaces was examined by scanning electron microscopy (SEM) after impact fracture. The results revealed that the CFRP properties can be tailored to specific applications by controlling the fabric orientation inside the CFRP composite design. The F2 orientation [(90°, 0°)/(0°, 90°)] showed the highest tensile and flexural strength values. On the other hand, the impact strength values of the composites were in the order F1 > F2 > F3. The storage modulus, loss modulus, and glass transition temperature Tg values obtained from dynamic mechanical analysis (DMA) were in the order F1 > F2 > F3. The variation in the properties of the composites was clearly explained by the SEM micrographs, as the poor properties of the F3 orientation were attributed to the complete breakage of the CF layers upon fracture.

Keywords: carbon fiber, CFRP, composites, epoxy resins, flexural strength

Procedia PDF Downloads 109
19614 Economic Analysis of Rainwater Harvesting Systems for Dairy Cattle

Authors: Sandra Cecilia Muhirirwe, Bart Van Der Bruggen, Violet Kisakye

Abstract:

Economic analysis of rainwater harvesting (RWH) systems is vital in the search for a cost-effective solution to water unreliability, especially in low-income countries. There is little literature focusing on the financial aspects of RWH for dairy farmers. The main purpose was to assess the economic viability of rainwater harvesting for dairy farmers in the Rwenzori region. The study focused on rooftop rainwater harvesting systems with collection in above-surface tanks. Daily rainfall time series spanning 12 years were obtained across nine gauging stations. The daily water balance equation was used for optimal sizing of the tank. Economic analysis of the investment was carried out based on the life cycle costs and the accruing benefits over a period of 15 years. Roof areas were varied from 75 m² (the minimum required area) to 500 m² while keeping the number of cattle, and hence the daily water demand, constant. The results show that the required rainwater tank sizes are very large and may be impractical to install due to the strongly varying terrain and the initial cost of investment. In all districts, there is a significant reduction in the required tank volume with increasing collection area, although beyond a certain point further increases in collection area have only a minor effect on the required tank size. Generally, for all rainfall areas, the reliability increases with roof area. The results indicate that 100% reliability can only be realized with very large collection areas that are impractical to install. The estimated benefits outweigh the cost of investment. The Net Present Value shows that the investment is economically viable, with a payback period of at most 3 years for all the time series in the study area.
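The daily water balance used for tank sizing can be written as a simple simulation: each day the tank gains roof runoff (capped at tank capacity) and loses the herd's demand, and reliability is the fraction of days the demand is fully met. A minimal sketch (the runoff coefficient and units are illustrative assumptions, not the paper's calibrated values):

```python
def simulate_tank(rainfall_mm, roof_area_m2, demand_m3, tank_m3, runoff_coeff=0.8):
    """Daily water balance: V_t = min(capacity, V_{t-1} + inflow) - demand.
    Returns reliability = fraction of days on which demand is fully met."""
    volume, days_met = 0.0, 0
    for rain in rainfall_mm:
        inflow = rain / 1000.0 * roof_area_m2 * runoff_coeff  # mm * m2 -> m3
        volume = min(tank_m3, volume + inflow)                # spill excess
        if volume >= demand_m3:
            volume -= demand_m3
            days_met += 1
        else:
            volume = 0.0                                      # partial day: tank emptied
    return days_met / len(rainfall_mm)
```

The optimal tank size for a target reliability is then found by sweeping `tank_m3` over the rainfall record, which is how increasing roof area trades off against tank volume.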

Keywords: dairy cattle, optimisation, rainwater harvesting, economic analysis

Procedia PDF Downloads 174
19613 Fault Tolerant and Testable Designs of Reversible Sequential Building Blocks

Authors: Vishal Pareek, Shubham Gupta, Sushil Chandra Jain

Abstract:

With the increasing demand for high-speed computation, power consumption, heat dissipation, and chip size are posing challenges for logic design with conventional technologies. Recovery from bit loss and bit errors is another issue, requiring reversibility and fault tolerance in computation. Reversible computing is emerging as an alternative to conventional technologies to overcome the above problems and is helpful in diverse areas such as low-power design, nanotechnology, and quantum computing. The bit loss issue can be solved through a unique input-output mapping, which requires reversibility, while the bit error issue requires fault tolerance in the design. In order to incorporate reversibility, a number of combinational reversible logic circuits have been developed. However, very few sequential reversible circuits have been reported in the literature. To make circuits fault tolerant, a number of fault models and test approaches have been proposed for reversible logic. In this paper, we have attempted to incorporate fault tolerance into sequential reversible building blocks such as the D flip-flop, T flip-flop, JK flip-flop, R-S flip-flop, master-slave D flip-flop, and double edge-triggered D flip-flop by making them parity preserving. The importance of this work lies in the fact that it provides designs of reversible sequential circuits that are completely testable for any stuck-at fault and single-bit fault. In our opinion, our designs of reversible building blocks are superior to existing designs in terms of quantum cost, hardware complexity, constant inputs, garbage outputs, and number of gates; in addition, a design of an online testable D flip-flop is proposed for the first time. We hope our work can be extended to building complex reversible sequential circuits.
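The two properties the abstract relies on, reversibility (a bijective input-output mapping) and parity preservation (input parity equals output parity, which makes any single-bit fault detectable), can both be checked exhaustively on a gate's truth table. A minimal sketch using the classic parity-preserving Fredkin gate as an example (the paper's own gates are not reproduced here):

```python
from itertools import product

def fredkin(a, b, c):
    """Fredkin (controlled-swap) gate: swaps b and c when control a is 1."""
    return (a, c, b) if a else (a, b, c)

def is_reversible(gate, width=3):
    """Bijective mapping: all 2^width outputs are distinct."""
    outputs = {gate(*bits) for bits in product((0, 1), repeat=width)}
    return len(outputs) == 2 ** width

def is_parity_preserving(gate, width=3):
    """Input parity equals output parity for every input vector."""
    return all(sum(bits) % 2 == sum(gate(*bits)) % 2
               for bits in product((0, 1), repeat=width))
```

Because a single stuck-at or single-bit fault flips the output parity, any circuit built only from parity-preserving reversible gates is online testable by a parity check, which is the design principle the flip-flop constructions exploit.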

Keywords: parity preserving gate, quantum computing, fault tolerance, flip-flop, sequential reversible logic

Procedia PDF Downloads 526
19612 Environmental Impact Assessment in Mining Regions with Remote Sensing

Authors: Carla Palencia-Aguilar

Abstract:

Estimates of the net carbon balance can be obtained by means of Net Biome Productivity (NBP), Net Ecosystem Productivity (NEP), and Net Primary Production (NPP). The latter is an important component of the biosphere carbon cycle and is easily obtained from the MODIS MOD17A3HGF product; however, the results are only available yearly. To overcome this limited availability, bands 33 to 36 from MODIS MYD021KM (obtained on a daily basis) were analyzed and compared with NPP data from the years 2000 to 2021 at 7 sites where surface mining takes place in the Colombian territory. Coal, gold, iron, and limestone were the minerals of interest. Scales and units, as well as thermal anomalies, were considered for the net carbon balance per location. The NPP time series from the satellite images were filtered using two Matlab filters: first order and discrete transfer. After filtering the NPP time series, comparing the graphed results with the satellite image values, and running a linear regression, the results showed R² values from 0.72 to 0.85. To establish comparable units between NPP and bands 33 to 36, the Greenhouse Gas Equivalencies Calculator by the EPA was used. The comparison was established in two ways: one by the sum of all the data per point per year, and the other by the average of 46 weeks and the percentage that this value represented with respect to NPP. The former underestimated the total CO2 emissions. The results also showed that coal and gold mining in the last 22 years had lower CO2 emissions than limestone, with yearly averages of 143 kton CO2 eq for gold, 152 kton CO2 eq for coal, and 287 kton CO2 eq for iron. Limestone emissions varied from 206 to 441 kton CO2 eq. The maximum emission values from unfiltered data correspond to 165 kton CO2 eq for gold, 188 kton CO2 eq for coal, and 310 kton CO2 eq for iron, with limestone varying from 231 to 490 kton CO2 eq.
If the most polluting limestone site improves its production technology, limestone emissions could be held to a maximum of 318 kton CO2 eq per year, a value very similar to that of iron. The importance of gathering such data is to establish benchmarks in order to attain the 2050 zero-emissions goal.
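The two processing steps named above, smoothing a noisy time series with a first-order filter and then checking agreement with a linear regression R², can be sketched in a few lines of plain Python (the smoothing constant here is an illustrative assumption; the paper used Matlab's first-order and discrete-transfer filters):

```python
def first_order_filter(x, alpha=0.3):
    """Discrete first-order low-pass filter: y[t] = alpha*x[t] + (1-alpha)*y[t-1]."""
    y = [x[0]]
    for v in x[1:]:
        y.append(alpha * v + (1 - alpha) * y[-1])
    return y

def r_squared(x, y):
    """Coefficient of determination of the simple linear regression y ~ x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)
```

An R² of 0.72 to 0.85 between the filtered band-33-to-36 series and yearly NPP, as reported, would indicate that the daily bands carry most of the interannual NPP signal.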

Keywords: carbon dioxide, NPP, MODIS, mining

Procedia PDF Downloads 77
19611 Evaluating the Effectiveness of Plantar Sensory Insoles and Remote Patient Monitoring for Early Intervention in Diabetic Foot Ulcer Prevention in Patients with Peripheral Neuropathy

Authors: Brock Liden, Eric Janowitz

Abstract:

Introduction: Diabetic peripheral neuropathy (DPN) affects 70% of individuals with diabetes [1]. DPN causes a loss of protective sensation, which can lead to tissue damage and diabetic foot ulcer (DFU) formation [2]. These ulcers can result in infections and lower-extremity amputations of toes, the entire foot, or the lower leg. Even after a DFU has healed, recurrence is common, with 49% of DFU patients developing another ulcer within a year and 68% within 5 years [3]. This case series examines the use of sensory insoles, newly available plantar data (pressure, temperature, step count, adherence), and remote patient monitoring in patients at risk of DFU. Methods: Participants were provided with custom-made sensory insoles to monitor plantar pressure, temperature, step count, and daily use, and received real-time cues for pressure offloading as they went about their daily activities. The sensory insoles were used to track subject compliance, ulceration, and response to feedback from real-time alerts. Patients were remotely monitored by a qualified healthcare professional, who contacted them when areas of concern were seen, coached them on reducing risk factors, and provided overall support to improve foot health. Results: Of the 40 participants provided with the sensory insole system, 4 presented with a DFU. Based on flags generated from the available plantar data, patients were contacted by the remote monitor to address potential concerns. A standard clinical escalation protocol detailed when and how concerns should be escalated to the provider by the remote monitor. Upon escalation to the provider, patients were brought into the clinic as needed, allowing issues to be addressed before more serious complications could arise. Conclusion: This case series explores the use of innovative sensory technology to collect plantar data (pressure, temperature, step count, and adherence) for DFU detection and early intervention.
The results from this case series suggest the importance of sensory technology and remote patient monitoring in providing proactive, preventative care for patients at risk of DFU. This rich plantar data, combined with remote patient monitoring, allows patients to be seen in the clinic when concerns arise, giving providers the opportunity to intervene early and prevent more serious complications, such as wounds, from occurring.
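The abstract does not specify how flags were generated; one widely cited criterion for DFU risk is a contralateral skin-temperature asymmetry above about 2.2 °C (4 °F). A hypothetical flagging sketch along those lines (the threshold and sensor layout are assumptions, not the device's documented algorithm):

```python
def flag_hotspots(left_temps, right_temps, threshold_c=2.2):
    """Return indices of insole sensor locations where the left-right
    temperature asymmetry exceeds the threshold (2.2 C is a commonly
    cited criterion for pre-ulcerative inflammation)."""
    return [i for i, (l, r) in enumerate(zip(left_temps, right_temps))
            if abs(l - r) > threshold_c]
```

In a remote-monitoring workflow, a non-empty result for several consecutive days would trigger the escalation protocol described in the Methods.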

Keywords: diabetic foot ulcer, DFU prevention, digital therapeutics, remote patient monitoring

Procedia PDF Downloads 57
19610 Clinicopathological Characteristics in Male Breast Cancer: A Case Series and Literature Review

Authors: Mohamed Shafi Mahboob Ali

Abstract:

Male breast cancer (MBC) is a rare entity, with reported cases accounting for less than 1% of all breast cancers. However, the incidence of MBC is rising steadily every year. Due to the lack of data on MBC, diagnosis and treatment are tailored to those of female breast cancer. MBC risk increases with age, and the disease is usually diagnosed ten years later, as its progression is slow compared to female breast cancer (FBC). The most common feature of MBC is the intraductal variant, and often, upon diagnosis, the disease is already at an advanced stage. The prognosis of MBC is often poor, but new treatment modalities are emerging with current knowledge and advancements. We present a series of male breast cancer cases from our center, highlighting the clinicopathological and radiological features and the treatment options.

Keywords: male, breast, cancer, clinicopathology, ultrasound, CT scan

Procedia PDF Downloads 75
19609 Comparative Studies of Modified Clay/Polyaniline Nanocomposites

Authors: Fatima Zohra Zeggai, Benjamin Carbonnier, Aïcha Hachemaoui, Ahmed Yahiaoui, Samia Mahouche-Chergui, Zakaria Salmi

Abstract:

A series of polyaniline (PANI)/modified montmorillonite (MMT) clay nanocomposite materials has been successfully prepared by in-situ polymerization in the presence of modified MMT clay or diazonium-MMT clay. The obtained nanocomposites were characterized and compared by various physicochemical techniques. The presence of physicochemical interactions, probably hydrogen bonding, between the clay and polyaniline was confirmed by FTIR and UV-Vis spectroscopy. The electrical conductivity of neat PANI and of the series of obtained nanocomposites was also studied by cyclic voltammetry.

Keywords: polyaniline, clay, nanocomposites, in-situ polymerization, conducting polymers, diazonium salt

Procedia PDF Downloads 447
19608 Springback Prediction for Sheet Metal Cold Stamping Using Convolutional Neural Networks

Authors: Lei Zhu, Nan Li

Abstract:

Cold stamping has been widely applied in the automotive industry for the mass production of a great range of automotive panels. Predicting the springback to ensure the dimensional accuracy of the cold-stamped components is a critical step. The main approaches to the prediction and compensation of springback in cold stamping include running finite element (FE) simulations and conducting experiments, which require forming-process expertise and can be time-consuming and expensive during the design of cold stamping tools. Machine learning technologies have been proven and successfully applied to learning complex system behaviours from representative samples, and they exhibit promising potential as supporting design tools for metal forming technologies. This study, for the first time, presents a novel application of a convolutional neural network (CNN) based surrogate model to predict the springback fields for variable U-shape cold bending geometries. A dataset is created from the U-shape cold bending geometries and the corresponding FE simulation results, and is then used to train the CNN surrogate model. The results show that the surrogate model can achieve near-indistinguishable full-field predictions in real time when compared with the FE simulation results. The application of CNNs for efficient springback prediction can be adopted in industrial settings to aid both conceptual and final component design, even for designers without manufacturing knowledge.
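The core operation that lets a CNN map a geometry field (e.g., a sampled U-shape profile) to a springback field is the discrete 2D convolution. A minimal pure-Python forward pass of that single building block (the full surrogate would stack many such layers with trained kernels; this is an illustration, not the paper's architecture):

```python
def conv2d(image, kernel):
    """Valid-mode 2D convolution (strictly, cross-correlation, as implemented
    in most deep learning frameworks): slide the kernel over the input and
    take elementwise products summed at each position."""
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = len(image) - kh + 1, len(image[0]) - kw + 1
    return [[sum(image[i + a][j + b] * kernel[a][b]
                 for a in range(kh) for b in range(kw))
             for j in range(ow)] for i in range(oh)]
```

With a trained kernel this operation extracts local geometric features (edges, curvature changes) that the downstream layers combine into a full-field springback prediction.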

Keywords: springback, cold stamping, convolutional neural networks, machine learning

Procedia PDF Downloads 125
19607 Thermal Analysis of Friction Stir Welded Dissimilar Materials with Different Preheating Conditions

Authors: Prashant S. Humnabad

Abstract:

The objective of this work is to carry out a thermal heat transfer analysis to obtain the time-dependent temperature field in the friction stir welding of dissimilar materials under different preheating temperatures. A series of joints was made on four mm thick aluminum and steel plates. The preheating temperatures used were 100ºC, 150ºC, and 200ºC. The welding operation was performed with different rotational speeds and traverse speeds (1000, 1400, and 2000 rpm; 16, 20, and 25 mm/min). In the numerical model, the welded plate was modeled with the weld line as the line of symmetry. The workpiece has dimensions of 100 x 100 x 4 mm. The obtained results were compared with experimental results, showing good agreement within the acceptable limit. The peak temperature at the weld zone increases significantly with process time.

Keywords: FEA, thermal analysis, preheating, friction stir welding

Procedia PDF Downloads 171
19606 Effect of Diazepam on Internal Organs of Chrysomya megacephala Using Micro-Computed Tomograph

Authors: Sangkhao M., Butcher B. A.

Abstract:

Diazepam (known as Valium) is a medication with a calming effect. Many reports on suicide cases show that diazepam is frequently used for this purpose. This research aims to study the effect of diazepam on the development of the forensically important blowfly Chrysomya megacephala (Diptera: Calliphoridae) using micro-computed tomography (micro-CT). In this study, four rabbits were treated with three different lethal doses of diazepam plus one control (LD₀, LD₅₀, LD₁₀₀, and LC). The rabbits’ livers were removed for rearing the blowflies. Pupae were sampled at two stages of development (ages S1: 24 h and S2: 120 h). After preparing the specimens, all samples underwent micro-CT using a Skyscan 1172. The results show the effect of diazepam on internal organs and tissues such as the brain, body cavity, gas bubble, meconium, and especially the fat body. In the control group in series 1 (LCS1), the fat body was equally dispersed in the head, thorax, and abdomen; the development of the internal organs was not complete; however, the brain, thoracic muscles, wings, legs, and rectum could be observed at 24 h after entry into the pupal stage. The development of each organ in the control group in series 2 was complete. In the treatment groups LD₀, LD₅₀, and LD₁₀₀ (series 1 and series 2), the tissues differ: for example, a gas bubble was observed in LD₀S1 due to the rapid morphological changes during the metamorphosis of the blowfly pupa in this treatment. Meconium was observed in the LD₅₀S2 group because the excretion of metabolic waste was not complete. All of the samples in the treatment groups showed differentiation of the fat bodies because metabolic activities were not complete, and these changes affected the functions of every internal system.
The discovery of differentiated fat bodies is an important result because the insect fat body functions like the liver in humans; it therefore indicates that toxin elimination from the blowfly’s body and the homeostatic maintenance of hemolymph proteins, lipids, and carbohydrates are abnormal in each treatment group.

Keywords: forensic toxicology, forensic entomology, diptera, diazepam

Procedia PDF Downloads 114
19605 Infrastructure Change Monitoring Using Multitemporal Multispectral Satellite Images

Authors: U. Datta

Abstract:

The main objective of this study is to find a suitable approach to monitoring land infrastructure growth over a period of time using multispectral satellite images. Bi-temporal change detection methods are unable to indicate continuous change occurring over a long period of time. To achieve the objective, the approach used here estimates a statistical model from a series of multispectral image data acquired over a long period of time, assuming there is no considerable change during that period, and then compares it with the multispectral image data obtained at a later time. The change is estimated pixel-wise. A statistical composite hypothesis technique is used for pixel-based change detection in a defined region. The generalized likelihood ratio test (GLRT) is used to detect a changed pixel from the probabilistically estimated model of the corresponding pixel. A changed pixel is detected under the assumption that the images have been co-registered prior to estimation. To minimize errors due to co-registration, the 8-neighborhood pixels around the pixel under test are also considered. Multispectral images from Sentinel-2 and Landsat-8 from 2015 to 2018 are used for this purpose. There are several challenges in this method. The first and foremost is obtaining a sufficiently large number of datasets for multivariate distribution modelling, as a large number of images are always discarded due to cloud coverage. Due to imperfect modelling, there will be a high probability of false alarms. The overall conclusion that can be drawn from this work is that the probabilistic method described in this paper has given some promising results, which need to be pursued further.
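For a single band and a Gaussian pixel model, the GLRT for a mean change reduces to a normalized squared deviation of the new observation from the historical model; under the no-change hypothesis this statistic is approximately chi-square distributed, so a threshold fixes the false-alarm rate. A minimal single-band sketch (the study's model is multivariate over several bands and includes the 8-neighborhood, which this illustration omits):

```python
def glrt_change(history, new_value):
    """GLRT statistic for a mean change in a Gaussian pixel model.
    'history' is the change-free time series of one pixel; large values
    of the statistic (compared with a chi-square(1) quantile) flag change."""
    n = len(history)
    mu = sum(history) / n                                  # estimated mean
    var = sum((v - mu) ** 2 for v in history) / (n - 1)    # estimated variance
    return (new_value - mu) ** 2 / var
```

A multiband version would replace the scalar variance with the estimated covariance matrix and the statistic with the corresponding Mahalanobis distance.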

Keywords: co-registration, GLRT, infrastructure growth, multispectral, multitemporal, pixel-based change detection

Procedia PDF Downloads 114
19604 A Data-Mining Model for Protection of FACTS-Based Transmission Line

Authors: Ashok Kalagura

Abstract:

This paper presents a data-mining model for fault-zone identification in flexible AC transmission system (FACTS)-based transmission lines, including a thyristor-controlled series compensator (TCSC) and a unified power-flow controller (UPFC), using ensemble decision trees. Given the randomness in the ensemble of decision trees stacked inside the random forests model, it provides an effective decision for fault-zone identification. Half-cycle post-fault current and voltage samples from the fault inception are used as an input vector, against a target output of ‘1’ for a fault after the TCSC/UPFC and ‘-1’ for a fault before the TCSC/UPFC, for fault-zone identification. The algorithm is tested on simulated fault data with wide variations in the operating parameters of the power system network, including a noisy environment, providing a reliability measure of 99% with a fast response time (3/4 of a cycle from fault inception). The results of the presented approach using the RF model indicate reliable identification of the fault zone in FACTS-based transmission lines.
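A random forest classifies by majority vote over many randomized trees. As a toy illustration of that voting principle only (real random forests, e.g. scikit-learn's `RandomForestClassifier`, use full decision trees on bootstrapped samples; the feature values and labels below are made up, not fault data), an ensemble of one-feature decision stumps can be sketched as:

```python
import random
from statistics import median

def fit_stump(X, y, feat):
    """One-feature threshold rule: predict +1 if value >= threshold.
    Polarity is chosen to maximize training accuracy; labels are +1/-1."""
    thr = median(row[feat] for row in X)
    acc = sum((row[feat] >= thr) == (lbl == 1) for row, lbl in zip(X, y)) / len(y)
    sign = 1 if acc >= 0.5 else -1
    return feat, thr, sign

def fit_forest(X, y, n_stumps=25, seed=0):
    """Ensemble of stumps, each trained on a randomly chosen feature."""
    rng = random.Random(seed)
    n_features = len(X[0])
    return [fit_stump(X, y, rng.randrange(n_features)) for _ in range(n_stumps)]

def predict(stumps, row):
    """Majority vote over the ensemble, as in a random forest."""
    votes = sum(sign if row[feat] >= thr else -sign for feat, thr, sign in stumps)
    return 1 if votes > 0 else -1
```

In the paper's setting the input row would be the half-cycle window of post-fault current and voltage samples, and the +1/-1 output would encode fault-after versus fault-before the TCSC/UPFC.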

Keywords: distance relaying, fault-zone identification, random forests, RFs, support vector machine, SVM, thyristor-controlled series compensator, TCSC, unified power-flow controller, UPFC

Procedia PDF Downloads 410
19603 A Hybrid Algorithm Based on Greedy Randomized Adaptive Search Procedure and Chemical Reaction Optimization for the Vehicle Routing Problem with Hard Time Windows

Authors: Imen Boudali, Marwa Ragmoun

Abstract:

The Vehicle Routing Problem with Hard Time Windows (VRPHTW) is a basic distribution management problem that models many real-world problems. The objective is to serve a set of customers with known demands on minimum-cost vehicle routes while satisfying vehicle capacity and hard time windows for the customers. In this paper, we propose to deal with this optimization problem using a new hybrid stochastic algorithm based on two metaheuristics: Chemical Reaction Optimization (CRO) and the Greedy Randomized Adaptive Search Procedure (GRASP). The first method is inspired by the natural process of chemical reactions, which transforms unstable substances with excessive energy into stable ones: during this process, the molecules interact with each other through a series of elementary reactions to reach the minimum energy for their existence. This property is embedded in CRO to solve the VRPHTW. In order to enhance population diversity throughout the search process, we integrated GRASP into our method. Simulation results on Solomon’s benchmark instances show the very satisfactory performance of the proposed approach.
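The GRASP contribution to such a hybrid is its construction phase: repeatedly extend a route by choosing at random from a restricted candidate list (RCL) of the best feasible next customers. A minimal sketch under simplifying assumptions (depot at the origin, travel time equal to Euclidean distance, no service times; this is an illustration of the construction idea, not the authors' algorithm):

```python
import math
import random

def grasp_routes(customers, capacity, rcl_size=3, seed=1):
    """Greedy randomized construction for the VRPHTW.
    customers: {id: (x, y, demand, ready, due)}; depot at (0, 0).
    Hard windows: arrival must not exceed 'due'; arriving early means waiting."""
    rng = random.Random(seed)
    unserved = set(customers)
    routes = []
    while unserved:
        route, load, time, pos = [], 0.0, 0.0, (0.0, 0.0)
        while True:
            feasible = []
            for c in unserved:
                x, y, d, ready, due = customers[c]
                arrival = time + math.dist(pos, (x, y))
                if load + d <= capacity and arrival <= due:
                    feasible.append((arrival, c))
            if not feasible:
                break
            feasible.sort()
            arrival, c = rng.choice(feasible[:rcl_size])  # restricted candidate list
            x, y, d, ready, due = customers[c]
            time = max(arrival, ready)                    # wait if early
            load += d
            pos = (x, y)
            route.append(c)
            unserved.discard(c)
        if not route:
            raise ValueError("some customer is infeasible even from the depot")
        routes.append(route)
    return routes
```

In the hybrid, each randomized construction seeds a CRO "molecule", and the elementary reactions then perturb and recombine these routes toward lower total cost.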

Keywords: benchmark problems, combinatorial optimization, vehicle routing problem with hard time windows, metaheuristics, hybridization, GRASP, CRO

Procedia PDF Downloads 385
19602 Serial Position Curves under Compressively Expanding and Contracting Schedules of Presentation

Authors: Priya Varma, Denis John McKeown

Abstract:

Psychological time, unlike physical time, is believed to be ‘compressive’ in the sense that the mental representations of a series of events may be internally arranged with ever-decreasing inter-event spacing (looking back from the most recently encoded event). If this is true, the record within immediate memory of recent events is severely temporally distorted. Although this notion of temporal distortion of the memory record is captured within some theoretical accounts of human forgetting, notably temporal distinctiveness accounts, the way in which the fundamental nature of the distortion underpins memory and forgetting broadly is barely recognised, or at least rarely directly investigated. Our intention here was to manipulate the spacing of items for recall in order to ‘reverse’ this supposed natural compression within the encoding of the items. In Experiment 1, three schedules of presentation (expanding, contracting, and fixed irregular temporal spacing) were created using logarithmic spacing of the words, for both free and serial recall conditions. The results of recall of lists of 7 words showed statistically significant benefits of temporal isolation, and, more excitingly, the contracting word series (which we may think of as reversing the natural compression within the mental representation of the word list) showed the best performance. Experiment 2 tested for effects of active verbal rehearsal in the recall task; this reduced but did not remove the benefits of our temporal scheduling manipulation. Finally, a third experiment used the same design but with Chinese characters as memoranda, in a further attempt to subvert possible verbal maintenance of items. One change to the design here was to introduce a probe item following the sequence of items and to record response times to this probe. Together, the outcomes of the experiments broadly support the notion of temporal compression within immediate memory.
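The logarithmically spaced expanding and contracting schedules described above can be generated from one set of log-spaced time points: the gaps of a log curve shrink, giving the contracting schedule, and reversing those gaps gives the expanding one. A minimal sketch (the exact base and list duration used in the experiments are not specified in the abstract, so these are illustrative):

```python
import math

def schedules(n_items, total):
    """Presentation times over [0, total] with logarithmic spacing.
    Returns (expanding, contracting): the contracting schedule's inter-item
    gaps shrink toward the end of the list; the expanding schedule uses the
    same gaps in reverse order, so they grow."""
    contracting = [total * math.log1p(i) / math.log1p(n_items - 1)
                   for i in range(n_items)]
    gaps = [b - a for a, b in zip(contracting, contracting[1:])]  # shrinking
    expanding = [0.0]
    for g in reversed(gaps):
        expanding.append(expanding[-1] + g)
    return expanding, contracting
```

Because both schedules use the same multiset of gaps, total list duration is matched, and only the direction of the spacing gradient differs between conditions.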

Keywords: memory, serial position curves, temporal isolation, temporal schedules

Procedia PDF Downloads 196