Search results for: time history response analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 44081

37691 Optimisation of B2C Supply Chain Resource Allocation

Authors: Firdaous Zair, Zoubir Elfelsoufi, Mohammed Fourka

Abstract:

Resource allocation is an issue that arises at the strategic, tactical, and operational levels. This work considers resource allocation for pure players, manufacturers, and click-and-mortar retailers that have launched online sales. The aim is to improve customer satisfaction, maintain the benefits of the e-retailer and its cooperators, and reduce costs and risks. Our contribution is a decision support system and tool for improving resource allocation in B2C e-commerce logistics chains. We first modeled the B2C chain with all the operations it integrates and the possible scenarios, since online retailers offer a wide selection of personalized services. The personalized services that online shopping companies offer to clients cover many aspects, such as customized payment, distribution methods, and after-sales service choices, and every aspect of customized service has several modes. We then analyzed the optimization problems of supply chain resource allocation in the customized online shopping service mode, which differ from supply chain resource allocation under traditional manufacturing or service circumstances. Finally, we developed an optimization model and algorithm based on this analysis of B2C supply chain resource allocation. It is a multi-objective optimization that considers the collaboration of resources in operations, time, and costs, but also risks, quality of service, and the dynamic and uncertain character of demand.
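
One minimal way to sketch this kind of multi-objective allocation is a weighted-sum scalarization over enumerated service modes; the three service aspects, their (cost, time, risk) values, and the weights below are illustrative assumptions, not data or the model from the study.

```python
from itertools import product

# Hypothetical candidate modes for three customised service aspects
# (payment, distribution, after-sales); each mode is (cost, time, risk).
modes = {
    "payment":      [(1.0, 0.2, 0.05), (0.6, 0.5, 0.10)],
    "distribution": [(4.0, 1.0, 0.10), (2.5, 2.0, 0.20), (1.5, 3.5, 0.30)],
    "after_sales":  [(0.8, 0.5, 0.05), (0.4, 1.0, 0.15)],
}

def weighted_score(plan, w_cost=0.5, w_time=0.3, w_risk=0.2):
    """Scalarize the three objectives with fixed (assumed) weights."""
    cost = sum(m[0] for m in plan)
    time = sum(m[1] for m in plan)
    risk = sum(m[2] for m in plan)
    return w_cost * cost + w_time * time + w_risk * risk

# Enumerate every combination of modes and keep the best weighted score.
best_plan = min(product(*modes.values()), key=weighted_score)
print(best_plan)
```

A real B2C instance would add capacity and collaboration constraints and handle demand uncertainty, which the weighted-sum enumeration above deliberately omits.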

Keywords: e-commerce, supply chain, B2C, optimisation, resource allocation

Procedia PDF Downloads 278
37690 A Model for Analysing Argumentative Structures and Online Deliberation in User-Generated Comments to the Website of a South African Newspaper

Authors: Marthinus Conradie

Abstract:

The conversational dynamics of democratically orientated deliberation continue to stimulate critical scholarship, given deliberation's potential to bolster robust engagement between different sections of pluralist societies. Several axes of deliberation that have attracted academic attention include face-to-face vs. online interaction, and citizen-to-citizen communication vs. engagement between citizens and political elites. In all these areas, numerous researchers have explored deliberative procedures aimed at achieving instrumental goals such as securing consensus on policy issues, against procedures that prioritise expressive outcomes such as broadening the range of argumentative repertoires that discursively construct and mediate specific political issues. The study that informs this paper works in the latter stream. Drawing its data from the reader-comments section of a South African broadsheet newspaper, the study investigates online, citizen-to-citizen deliberation by analysing the discursive practices through which competing understandings of social problems are articulated and contested. To advance this agenda, the paper deals specifically with user-generated comments posted in response to news stories on questions of race and racism in South Africa. The analysis works to discern and interpret the various sets of discourse practices that shape how citizens deliberate contentious political issues, especially racism. Since the website in question is designed to encourage the critical comparison of divergent interpretations of news events, without feeding directly into national policymaking, the study adopts an analytic framework that traces how citizens articulate arguments, rather than the instrumental effects that citizen deliberations might exert on policy.
The paper starts from the argument that such expressive interactions are particularly crucial to current trends in South African politics, given that the precise nature of race and racism remains contested and uncertain. Centred on a sample of 2358 conversational moves in 814 posts to 18 news stories emanating from issues of race and racism, the analysis proceeds in a two-step fashion. The first stage conducts a qualitative content analysis that offers insights into the levels of reciprocity among commenters (do readers engage with each other or simply post isolated opinions?), as well as the structures of argumentation (do readers support opinions by citing evidence?). The second stage involves a more fine-grained discourse analysis, based on a theorisation of argumentation that delineates it into three components: opinions/conclusions, evidence/data to support opinions/conclusions, and warrants that explicate precisely how evidence/data buttress opinions/conclusions. By tracing the manifestation and frequency of specific argumentative practices, this study contributes to the archive of research currently aggregating around the practices that characterise South Africans' engagement with provocative political questions, especially racism and racial inequity. Additionally, the study contributes to recent scholarship on the affordances of Web 2.0 software by eschewing a simplistic bifurcation between cyber-optimism vs. cyber-pessimism, in favour of a more nuanced and context-specific analysis of the patterns that structure online deliberation.

Keywords: online deliberation, discourse analysis, qualitative content analysis, racism

Procedia PDF Downloads 183
37689 The Development of Congeneric Elicited Writing Tasks to Capture Language Decline in Alzheimer Patients

Authors: Lise Paesen, Marielle Leijten

Abstract:

People diagnosed with probable Alzheimer's disease suffer from an impairment of their language capacities; a gradual impairment which affects both their spoken and written communication. Our study aims at characterising the language decline in DAT patients with the use of congeneric elicited writing tasks. Within these tasks, a descriptive text has to be written based upon images with which the participants are confronted. A randomised set of images allows us to present the participants with a different task on every encounter, avoiding a recognition effect in this iterative study. This method is a revision of previous studies, in which participants were presented with a larger picture depicting an entire scene. In order to create the randomised set of images, existing pictures were adapted following strict criteria (e.g. frequency, AoA, colour, ...). The resulting data set contained 50 images belonging to several categories (vehicles, animals, humans, and objects). A pre-test was constructed to validate the created picture set; since most images had been used before in spoken picture naming tasks, the same reaction times ought to be triggered in the typed picture naming task. Once the set was validated, the effectiveness of the descriptive tasks was assessed. First, the participants (n=60 students, n=40 healthy elderly) performed a typing task, which provided information about the typing speed of each individual. Secondly, two descriptive writing tasks were carried out, one simple and one complex. The simple task contains 4 images (1 animal, 2 objects, 1 vehicle), all with high frequency, a young AoA (<6 years), and fast reaction times. Slow reaction times, a later AoA (≥ 6 years), and low frequency were the criteria for the complex task, which uses 6 images (2 animals, 1 human, 2 objects and 1 vehicle). The data were collected with the keystroke logging programme Inputlog.
Keystroke logging tools log and time-stamp keystroke activity to reconstruct and describe text production processes. The data were analysed using a selection of writing process and product variables, such as general writing process measures, detailed pause analysis, linguistic analysis, and text length. As a covariate, the intrapersonal interkey transition times from the typing task were taken into account. The pre-test indicated that the new images led to similar or even faster reaction times compared to the original images. All the images were therefore used in the main study. The produced texts of the description tasks were significantly longer compared to previous studies, providing sufficient text and process data for analyses. Preliminary analysis shows that the number of words produced differed significantly between the healthy elderly and the students, as did the mean length of production bursts, even though both groups needed the same time to produce their texts. However, the elderly took significantly more time to produce the complex task than the simple task. Nevertheless, the number of words per minute remained comparable between the simple and complex tasks. The pauses within and before words varied, even when personal typing abilities (obtained from the typing task) were taken into account.
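
The interkey-transition and pause measures described here can be sketched directly from timestamped keystrokes; the toy log below and the 2000 ms pause threshold are hypothetical (Inputlog exports far richer data, and the study's actual threshold may differ).

```python
# Hypothetical keystroke log: (timestamp_ms, key) pairs, Inputlog-style.
log = [(0, "T"), (180, "h"), (350, "e"), (2400, " "),
       (2600, "c"), (2810, "a"), (3000, "t")]

PAUSE_THRESHOLD_MS = 2000  # assumed convention, not the study's setting

# Interkey transition times: gaps between consecutive keystrokes.
transitions = [t2 - t1 for (t1, _), (t2, _) in zip(log, log[1:])]
# Pauses: transitions at or above the threshold.
pauses = [dt for dt in transitions if dt >= PAUSE_THRESHOLD_MS]

mean_ikt = sum(transitions) / len(transitions)
print(len(pauses), mean_ikt)  # one long pause before the word "cat"
```

Production bursts would then be the stretches of text between such pauses, and the per-participant mean transition time is what the study uses as a covariate.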

Keywords: Alzheimer's disease, experimental design, language decline, writing process

Procedia PDF Downloads 280
37688 Evaluation of Reliability Flood Control System Based on Uncertainty of Flood Discharge, Case Study Wulan River, Central Java, Indonesia

Authors: Anik Sarminingsih, Krishna V. Pradana

Abstract:

The failure of a flood control system can be caused by various factors, for example when the uncertainty of the design flood is not considered and the capacity of the system is exceeded. The presence of uncertainty is recognized as a serious issue in hydrological studies. Uncertainty in hydrological analysis is influenced by many factors, from the reading of water elevation and rainfall data to the selection of the method of analysis. In hydrological modeling, the selection of models and parameters corresponding to the watershed conditions should be evaluated with a hydraulic model of the river as a drainage channel. River cross-section capacity is the first line of defense in assessing the reliability of the flood control system, and the reliability of river capacity describes the potential magnitude of flood risk. The case study in this research is the Wulan River in Central Java. This river floods almost every year despite flood control efforts such as levees, a floodway, and a diversion. The flood-affected areas include several sub-districts, mainly in Kabupaten Kudus and Kabupaten Demak. The first step is a frequency analysis of discharge observations from the Klambu weir, which has time series data from 1951 to 2013. The frequency analysis is performed using several distribution models, such as the Gumbel, Normal, Log-Normal, Pearson Type III, and Log-Pearson distributions. The results of these models overlap within one standard deviation, so the maximum flood discharge for lower return periods may exceed the average discharge for larger return periods. The next step is a hydraulic analysis to evaluate the reliability of the river capacity based on the flood discharges obtained from the several methods. The design flood discharge of the flood control system is selected as the result of the method closest to the bankfull capacity of the river.
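
The Gumbel branch of such a frequency analysis can be sketched as follows; the annual-maximum series is synthetic (the real 1951-2013 Klambu weir record is not reproduced here), and SciPy's `gumbel_r` stands in for whichever fitting procedure the study used.

```python
import numpy as np
from scipy import stats

# Synthetic annual-maximum discharge series (m^3/s) standing in for the
# 63-year Klambu weir record; parameters chosen only for illustration.
rng = np.random.default_rng(0)
ams = stats.gumbel_r.rvs(loc=800, scale=150, size=63, random_state=rng)

# Maximum-likelihood fit of the Gumbel (EV1) location and scale.
loc, scale = stats.gumbel_r.fit(ams)

# Design flood Q_T = quantile at non-exceedance probability 1 - 1/T.
design_flood = {T: stats.gumbel_r.ppf(1 - 1 / T, loc=loc, scale=scale)
                for T in (2, 10, 25, 50, 100)}
for T, q in design_flood.items():
    print(f"{T:>3}-year design flood: {q:.0f} m^3/s")
```

Repeating the fit with Log-Normal, Pearson III, and Log-Pearson distributions and comparing the resulting quantiles is what produces the overlapping estimates the abstract describes.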

Keywords: design flood, hydrological model, reliability, uncertainty, Wulan river

Procedia PDF Downloads 297
37687 An Integrated Approach for Optimizing Drillable Parameters to Increase Drilling Performance: A Real Field Case Study

Authors: Hamidoddin Yousife

Abstract:

Drilling optimization requires a prediction of the drilling rate of penetration (ROP), since it provides a significant reduction in drilling costs. Several factors, both controllable and uncontrollable, can have an impact on the ROP. Numerous drilling penetration rate models based on drilling parameters have been considered. This paper considers the effect of proper drilling parameter selection, such as bit, mud type, applied weight on bit (WOB), revolutions per minute (RPM), and flow rate, on drilling optimization and drilling cost reduction. A predictive analysis is used on real-time drilling performance to determine the optimal drilling operation. As a result of these modeling studies, the real data collected from three directional wells at the Azadegan oil fields, Iran, were verified and adjusted to determine the drillability of a specific formation. Simulation results and actual drilling results show significant improvements in accuracy. Once the simulations had been validated, optimum drilling parameters and equipment specifications were determined by varying the weight on bit (WOB), rotary speed (RPM), hydraulics (hydraulic pressure), and bit specification for each well until the highest drilling rate was achieved. To evaluate the potential operational and economic benefits of the optimized results, a qualitative and quantitative analysis of the data was performed.
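
As a hedged illustration of relating ROP to the controllable parameters named above, an ordinary least-squares fit can be sketched; the well records below are invented for the example and are not the Azadegan field data, and published ROP models (e.g. Bourgoyne-Young) are considerably richer than this linear form.

```python
import numpy as np

# Hypothetical drilling records: WOB (klbf), RPM, flow rate (gpm) -> ROP (ft/hr).
X = np.array([[15, 120, 600], [20, 140, 650], [25, 150, 700],
              [18, 130, 620], [22, 145, 680], [28, 160, 720]], dtype=float)
rop = np.array([30.0, 38.0, 46.0, 34.0, 41.0, 50.0])

# Least-squares fit of ROP = b0 + b1*WOB + b2*RPM + b3*flow.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, rop, rcond=None)

def predict(wob, rpm, flow):
    """Predicted ROP for a candidate parameter setting."""
    return float(coef @ [1.0, wob, rpm, flow])

print(predict(24, 150, 700))
```

Sweeping `predict` over feasible WOB/RPM/flow combinations and keeping the maximizer mirrors, in miniature, the parameter-variation step the abstract describes.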

Keywords: drilling, cost, optimization, parameters

Procedia PDF Downloads 173
37686 Progression of Myopia in School Going Children During COVID Era

Authors: Sony Singh M. Optom, Vivekananda U. Warkad, Debasmita Majhi

Abstract:

Purpose: To observe the progression of myopia in school-aged children during the COVID-19 era, when home confinement led to high screen-time exposure and fewer outdoor activities. Method: A retrospective analysis was done for all mild, moderate, and high myopic school-going children who presented to the L V Prasad Eye Institute (MTC campus) from December 2019 to March 2021 with a minimum of 2 follow-ups (6-month and 1-year follow-up). The mean age was 11.47 ± 2.73 years; the refractive error at presentation was 2.31 ± 1.66 D in OD and 2.375 ± 1.83 D in OS, and the mean BCVA was 0.32 ± 0.06 (OD) and 0.31 ± 0.06 (OS). The refractive error at the last follow-up was 3.23 ± 1.71 D in OD and 3.30 ± 1.90 D in OS, and the mean BCVA was 0.013 ± 0.039 in OD and 0.015 ± 0.043 in OS. Altogether, data from 131 patients who met our inclusion and exclusion criteria were analyzed, and a questionnaire regarding average screen-time exposure was administered to all parents either face-to-face or over the phone. Mean spherical values and annual myopia progression based on gender, age, severity of myopia, and interview data were analyzed by the Kruskal-Wallis and Mann-Whitney tests. Conclusion: When compared based on the severity of myopia, myopia progression was greater in emmetropes than in mild, moderate, and high myopes, and was statistically significant with a p-value of <0.001. 69% of subjects who used mobile phones for more than 4 hours per day had myopia progression of 0.75 D, which was statistically significant (p-value <0.001) compared to those who did not attend online classes (myopia progression of -0.25 D).
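
The Kruskal-Wallis and Mann-Whitney comparisons used here can be sketched with SciPy; the progression values below are simulated for three assumed screen-time groups and are not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical annual myopia progression (dioptres) by daily screen-time group.
low  = rng.normal(-0.25, 0.2, 40)   # assumed: < 2 h/day
mid  = rng.normal(-0.50, 0.2, 45)   # assumed: 2-4 h/day
high = rng.normal(-0.75, 0.2, 46)   # assumed: > 4 h/day

# Kruskal-Wallis: do the three groups differ overall?
h, p_kw = stats.kruskal(low, mid, high)
# Mann-Whitney U: pairwise comparison of two groups, two-sided.
u, p_mw = stats.mannwhitneyu(low, high, alternative="two-sided")
print(p_kw, p_mw)
```

Both are rank-based tests, which suits progression data that need not be normally distributed.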

Keywords: myopia, school going children, annual progression, COVID era

Procedia PDF Downloads 12
37685 Preservation Model to Process 'La Bomba Del Chota' as a Living Cultural Heritage

Authors: Lucia Carrion Gordon, Maria Gabriela Lopez Yanez

Abstract:

This project focuses on heritage concepts and their importance in an ever-evolving and changing digital era, where system solutions have to be sustainable, efficient, and suited to basic needs. The prototype has to cover the principal requirements of the case studies. How to preserve the sociological ideas of dances in Ecuador, such as 'La Bomba', is the best example of and challenge for preserving intangible data. The same idea is applicable to books and music. History, and how to keep it, is the principal mission of heritage preservation. The dance of La Bomba is rooted in a specific movement system whose main part is the sideward hip movement. La Bomba's movement system is the surface manifestation of a whole system of knowledge whose principal characteristics are the historical relation of Choteños with their land and their families.

Keywords: digital preservation, heritage, IT management, data, metadata, ontology, serendipity

Procedia PDF Downloads 392
37684 An Algorithm for Determining the Arrival Behavior of a Secondary User to a Base Station in Cognitive Radio Networks

Authors: Danilo López, Edwin Rivas, Leyla López

Abstract:

This paper presents the development of an algorithm that predicts the arrival of a secondary user (SU) at a base station (BS) in an infrastructure-based cognitive network, requesting a Best Effort (BE) or Real Time (RT) type of service with a determined bandwidth (BW), implementing neural networks. The algorithm dynamically uses a neural network construction technique based on the geometric pyramid topology and trains a Multilayer Perceptron Neural Network (MLPNN) on the historical arrivals of an SU to estimate future requests. This allows the information in the BS to be managed efficiently, since it predicts the arrival of the SUs ahead of the stage of selecting the best channel in the CRN. As a result, the software application determines the probability of arrival at a future point in time and calculates performance metrics to measure the effectiveness of the predictions made.
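
The geometric pyramid rule mentioned above sizes hidden layers as a geometric progression between the input and output widths; a minimal sketch follows, where the 8-input/1-output example (recent arrival history in, arrival probability out) is an assumption for illustration, not the paper's exact topology.

```python
def pyramid_hidden_layers(n_in, n_out, n_hidden_layers=2):
    """Geometric pyramid rule: hidden-layer sizes form a geometric
    progression between the input size and the output size."""
    r = (n_in / n_out) ** (1.0 / (n_hidden_layers + 1))
    return [round(n_out * r ** (n_hidden_layers - i))
            for i in range(n_hidden_layers)]

# e.g. 8 inputs -> hidden layers of 4 and 2 units -> 1 output
print(pyramid_hidden_layers(8, 1))
```

The resulting sizes would then be handed to whatever MLP trainer is in use; the rule itself only fixes the topology, not the weights.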

Keywords: cognitive radio, base station, best effort, MLPNN, prediction, real time

Procedia PDF Downloads 334
37683 COVID-19 Pandemic Influence on Toddlers and Preschoolers’ Screen Time

Authors: Juliana da Silva Cardoso, Cláudia Correia, Rita Gomes, Carolina Fraga, Inês Cascais, Sara Monteiro, Beatriz Teixeira, Sandra Ribeiro, Carolina Andrade, Cláudia Oliveira, Diana Gonzaga, Catarina Prior, Inês Vaz Matos

Abstract:

The average daily screen time (ST) has been increasing in children, even at young ages. This seems to be associated with a higher incidence of neurodevelopmental disorders, and the greater the exposure time, the greater the functional impact. This study aims to compare the daily ST of toddlers and preschoolers before and during the COVID-19 pandemic. A questionnaire was applied by telephone to parents/caregivers of children between 1 and 5 years old, followed up at 4 primary care units belonging to the Group of Primary Health Care Centers of Western Porto, Portugal. 520 children were included: 52.9% male, mean age 39.4 ± 13.9 months. The mean age of first exposure to screens was 13.9 ± 8.0 months, and most of the children were exposed to more than one screen daily. Considering the WHO recommendations, before the COVID-19 pandemic, 385 (74.0%) and 408 (78.5%) children had excessive ST during the week and the weekend, respectively; during the lockdown, these values increased to 495 (95.2%) and 482 (92.7%). Maternal education and both the child's median age and the median age of first exposure to screens had a statistically significant association with excessive ST, with OR 0.2 (p = 0.03, 95% CI 0.07-0.86), OR 1.1 (p = 0.01, 95% CI 1.05-1.14) and OR 0.9 (p = 0.05, 95% CI 0.87-0.98), respectively. Most children in this sample had a higher than recommended ST, which increased with the onset of the COVID-19 pandemic. These results are worrisome and point to the need for urgent intervention.
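
Odds ratios with 95% confidence intervals of the kind reported here can be computed from a 2x2 table via the log-OR standard error; the counts below are hypothetical and do not come from this study.

```python
import math

# Hypothetical 2x2 table: excessive screen time vs. maternal education level.
#                     excessive ST   not excessive
a, b = 30, 20       # higher maternal education
c, d = 80, 10       # lower maternal education

odds_ratio = (a * d) / (b * c)

# Wald 95% CI on the log scale: log(OR) +/- 1.96 * SE(log OR).
se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se)
hi = math.exp(math.log(odds_ratio) + 1.96 * se)
print(odds_ratio, lo, hi)
```

An OR below 1 with a CI excluding 1, as in this toy table, would indicate a protective association, matching the direction of the maternal-education result reported above.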

Keywords: COVID-19 pandemic, preschoolers, screen time, toddlers

Procedia PDF Downloads 223
37682 Real-Time Recognition of Dynamic Hand Postures on a Neuromorphic System

Authors: Qian Liu, Steve Furber

Abstract:

To explore how the brain may recognize objects in its general, accurate, and energy-efficient manner, this paper proposes the use of a neuromorphic hardware system formed from a Dynamic Vision Sensor (DVS) silicon retina in concert with the SpiNNaker real-time Spiking Neural Network (SNN) simulator. As a first step in the exploration of this platform, a recognition system for dynamic hand postures is developed, enabling the study of the methods used in the visual pathways of the brain. Inspired by the behaviours of the primary visual cortex, Convolutional Neural Networks (CNNs) are modeled using both linear perceptrons and spiking Leaky Integrate-and-Fire (LIF) neurons. In this study's largest configuration using these approaches, a network of 74,210 neurons and 15,216,512 synapses is created and operated in real time using 290 SpiNNaker processor cores in parallel, with 93.0% accuracy. A smaller network using only 1/10th of the resources is also created, again operating in real time, and it is able to recognize the postures with an accuracy of around 86.4%, only 6.6% lower than that of the much larger system. The recognition rate of the smaller network developed on this neuromorphic system is sufficient for a successful hand posture recognition system and demonstrates a much-improved cost-to-performance trade-off in its approach.
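
A single leaky integrate-and-fire neuron of the kind used in the spiking configuration can be sketched in a few lines of Euler integration; the membrane constants and input current below are illustrative assumptions, not SpiNNaker's defaults.

```python
# Minimal leaky integrate-and-fire neuron, Euler-integrated.
dt = 1.0        # time step (ms)
tau_m = 20.0    # membrane time constant (ms), assumed
v_rest = 0.0    # resting potential (normalized units)
v_thresh = 1.0  # spike threshold
v_reset = 0.0   # post-spike reset
r_m = 1.0       # membrane resistance (normalized)
i_in = 1.2      # constant input current, assumed for illustration

v, spikes = v_rest, []
for step in range(100):
    # Leak toward rest plus driven charge-up: dv = dt/tau * (-(v - v_rest) + R*I)
    v += dt / tau_m * (-(v - v_rest) + r_m * i_in)
    if v >= v_thresh:
        spikes.append(step)
        v = v_reset
print(spikes)
```

Because the steady-state voltage (R·I = 1.2) exceeds the threshold, the neuron fires periodically; with subthreshold input it would only decay toward rest, which is the nonlinearity that lets LIF layers stand in for perceptron units.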

Keywords: spiking neural network (SNN), convolutional neural network (CNN), posture recognition, neuromorphic system

Procedia PDF Downloads 479
37681 Cost-Effectiveness of a Certified Service or Hearing Dog Compared to a Regular Companion Dog

Authors: Lundqvist M., Alwin J., Levin L-A.

Abstract:

Background: Assistance dogs are dogs trained to assist persons with functional impairments or chronic diseases. The assistance dog concept includes different types: guide dogs, hearing dogs, and service dogs. Service dogs can further be divided into the subgroups of physical service dogs, diabetes alert dogs, and seizure alert dogs. To examine the long-term effects of health care interventions, both in terms of resource use and health outcomes, cost-effectiveness analyses can be conducted. Such analyses can provide important input to decision-makers when setting priorities. Little is known about the cost-effectiveness of assistance dogs. The study aimed to assess the cost-effectiveness of certified service or hearing dogs in comparison to regular companion dogs. Methods: The main data source for the analysis was the "service and hearing dog project", a longitudinal interventional study with a pre-post design that incorporated fifty-five owners and their dogs. Data on all relevant costs affected by the use of a service dog, such as municipal services, health care costs, costs of sick leave, and costs of informal care, were collected. Health-related quality of life was measured with the standardized instrument EQ-5D-3L. A decision-analytic Markov model was constructed to conduct the cost-effectiveness analysis. Outcomes were estimated over a 10-year time horizon. The incremental cost-effectiveness ratio, expressed as cost per quality-adjusted life year gained, was the primary outcome. The analysis employed a societal perspective. Results: The cost-effectiveness analysis showed that, compared to a regular companion dog, a certified dog is cost-effective, with both lower total costs [-32,000 USD] and more quality-adjusted life years [0.17]. We also present subgroup results analyzing the cost-effectiveness of physical service dogs and diabetes alert dogs.
Conclusions: The study shows that a certified dog is cost-effective in comparison with a regular companion dog for individuals with functional impairments or chronic diseases. Analyses of uncertainty imply that further studies are needed.
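
The Markov-model accounting behind an incremental comparison like this can be sketched as a cohort simulation; all transition probabilities, utilities, and annual costs below are invented placeholders (the study's -32,000 USD and 0.17 QALY figures come from a far richer model), and discounting is omitted for brevity.

```python
# Two-state (alive/dead) Markov cohort sketch over a 10-year horizon, annual cycles.
p_die = 0.03   # assumed annual mortality, identical in both arms
years = 10

def cohort(utility, annual_cost):
    """Accumulate expected QALYs and costs for a cohort of size 1."""
    alive, qalys, costs = 1.0, 0.0, 0.0
    for _ in range(years):
        qalys += alive * utility
        costs += alive * annual_cost
        alive *= 1 - p_die  # fraction surviving into the next cycle
    return qalys, costs

q_cert, c_cert = cohort(utility=0.80, annual_cost=9000)    # certified dog arm (assumed)
q_comp, c_comp = cohort(utility=0.78, annual_cost=12000)   # companion dog arm (assumed)

delta_q, delta_c = q_cert - q_comp, c_cert - c_comp
print(delta_q, delta_c)
```

When the incremental cost is negative and the incremental QALYs positive, as in this toy parameterization, the intervention "dominates" and no ICER threshold comparison is needed, which is the pattern the abstract reports.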

Keywords: service dogs, hearing dogs, health economics, Markov model, quality-adjusted life years

Procedia PDF Downloads 158
37680 Regional Analysis of Freight Movement by Vehicle Classification

Authors: Katerina Koliou, Scott Parr, Evangelos Kaisar

Abstract:

The surface transportation of freight is particularly vulnerable to storm and hurricane disasters, while at the same time it is the primary transportation mode for delivering medical supplies, fuel, water, and other essential goods. To better plan for commercial vehicles during an evacuation, it is necessary to understand how these vehicles travel during an evacuation and determine whether this travel differs from that of the general public. While the literature on auto-based evacuations is extensive, the consideration of freight travel is lacking. The goal of this research was to investigate the movement of vehicles by classification, with an emphasis on freight, during two major evacuation events: hurricanes Irma (2017) and Michael (2018). The research used Florida's statewide continuous-count station traffic volumes, which were compared between years to identify locations where traffic was moving differently during the evacuation and days on which traffic was significantly different between years. The methodology of the research was divided into three phases: data collection and management, spatial analysis, and temporal comparisons. Data collection and management obtained continuous-count station data from the state of Florida for both 2017 and 2018 by vehicle classification; the data were then processed into a manageable format. The second phase used geographic information systems (GIS) to display where and when traffic varied across the state. The third and final phase was a quantitative investigation into which vehicle classifications were statistically different, and on which dates, statewide. This phase used a two-sample, two-tailed t-test to compare sensor volumes by classification on similar days between years.
Overall, increases in freight movement between years prevented a more precise paired analysis. This research sought to identify where and when different classes of vehicles were traveling leading up to hurricane landfall and during post-storm reentry. Among the more significant findings, the results showed that commercial-use vehicles may have underutilized rest areas during the evacuation, or perhaps these rest areas were closed. This may suggest that truckers were driving longer distances and possibly longer hours before the hurricanes. Another significant finding was that changes in traffic patterns for commercial-use vehicles occurred earlier and lasted longer than changes for personal-use vehicles, suggesting that commercial vehicles evacuate in a fashion different from personal-use vehicles. This paper may serve as a foundation for future research into commercial travel during evacuations and for exploring additional factors that may influence freight movements during evacuations.
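
The phase-three comparison can be sketched with an independent two-sample t-test; the daily truck volumes below are simulated, not Florida count-station data, and the year labels are assumptions for the example.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Hypothetical daily truck volumes at one count station on matching
# calendar days in an evacuation year vs. a baseline year.
vol_evac     = rng.normal(1200, 100, 30)
vol_baseline = rng.normal(1000, 100, 30)

# Two-sample t-test; scipy's ttest_ind is two-tailed by default.
t, p = stats.ttest_ind(vol_evac, vol_baseline)
print(t, p)
```

Running this per station, per vehicle class, and per day, then flagging the (date, class) cells with p below a chosen threshold, reproduces the structure of the statewide comparison described above.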

Keywords: evacuation, freight, travel time

Procedia PDF Downloads 72
37679 Formulation and Evaluation of Mouth Dissolving Tablet of Ketorolac Tromethamine by Using Natural Superdisintegrants

Authors: J. P. Lavande, A. V. Chandewar

Abstract:

The mouth dissolving tablet is a rapidly growing and highly accepted drug delivery system. This study aimed at the development of Ketorolac tromethamine mouth dissolving tablets (MDTs), which can disintegrate or dissolve rapidly once placed in the mouth. Conventional Ketorolac tromethamine tablets require water to swallow and have limitations such as a low disintegration rate and low solubility. The Ketorolac tromethamine mouth dissolving tablet formulation consists of superdisintegrants, namely heat-modified karaya gum and co-treated heat-modified agar, and the filler microcrystalline cellulose (MCC). The tablets were evaluated for weight variation, friability, hardness, in vitro disintegration time, wetting time, in vitro drug release profile, and content uniformity. The results showed low weight variation, good hardness, acceptable friability, and fast wetting time. Tablets in all batches disintegrated within 15-50 sec. The formulations containing the superdisintegrants heat-modified karaya gum and heat-modified agar showed better performance in disintegration and drug release profile.

Keywords: mouth dissolving tablet, Ketorolac tromethamine, disintegration time, heat modified karaya gum, co-treated heat modified agar

Procedia PDF Downloads 287
37678 Enhancing Patch Time Series Transformer with Wavelet Transform for Improved Stock Prediction

Authors: Cheng-yu Hsieh, Bo Zhang, Ahmed Hambaba

Abstract:

Stock market prediction has long been an area of interest for both expert analysts and investors, driven by its complexity and the noisy, volatile conditions it operates under. This research examines the efficacy of combining the Patch Time Series Transformer (PatchTST) with wavelet transforms, specifically focusing on Haar and Daubechies wavelets, in forecasting the adjusted closing price of the S&P 500 index for the following day. By comparing the performance of the augmented PatchTST models with traditional predictive models such as Recurrent Neural Networks (RNNs), Convolutional Neural Networks (CNNs), Long Short-Term Memory (LSTM) networks, and Transformers, this study highlights significant enhancements in prediction accuracy. The integration of the Daubechies wavelet with PatchTST notably excels, surpassing other configurations and conventional models in terms of Mean Absolute Error (MAE) and Mean Squared Error (MSE). The success of the PatchTST model paired with Daubechies wavelet is attributed to its superior capability in extracting detailed signal information and eliminating irrelevant noise, thus proving to be an effective approach for financial time series forecasting.
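
The wavelet preprocessing step can be illustrated with a one-level Haar decomposition and soft thresholding of the detail coefficients (the paper also uses Daubechies wavelets, typically available via PyWavelets; the toy price series and threshold here are assumptions, done in plain NumPy to keep the example self-contained).

```python
import numpy as np

# Toy price series; in the study this would be S&P 500 adjusted closes.
prices = np.array([100., 102., 101., 105., 107., 106., 110., 108.])

# One-level Haar DWT: pairwise low-pass (trend) and high-pass (detail) parts.
pairs = prices.reshape(-1, 2)
approx = pairs.sum(axis=1) / np.sqrt(2)            # smoothed trend
detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2)  # candidate noise

# Soft-threshold small detail coefficients, then invert the transform.
thr = 1.0  # assumed threshold
detail_d = np.sign(detail) * np.maximum(np.abs(detail) - thr, 0.0)
recon = np.empty_like(prices)
recon[0::2] = (approx + detail_d) / np.sqrt(2)
recon[1::2] = (approx - detail_d) / np.sqrt(2)
print(recon)
```

The denoised series `recon` (or the coefficients themselves) would then be patched and fed to the PatchTST forecaster; deeper decompositions simply repeat the split on the approximation coefficients.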

Keywords: deep learning, financial forecasting, stock market prediction, patch time series transformer, wavelet transform

Procedia PDF Downloads 58
37677 Assessment of the Efficacy of Routine Medical Tests in Screening Medical Radiation Staff in Shiraz University of Medical Sciences Educational Centers

Authors: Z. Razi, S. M. J. Mortazavi, N. Shokrpour, Z. Shayan, F. Amiri

Abstract:

Long-term exposure to low doses of ionizing radiation occurs in radiation health care workplaces. Although doses in the health professions are generally very low, there are still matters of concern. The radiation safety program promotes occupational radiation safety through accurate and reliable monitoring of radiation workers in order to effectively manage radiation protection. To achieve this goal, it has become mandatory to implement periodic health examinations, so that working populations with a common occupational radiation history are screened based on hematological alterations. This paper calls into question the effectiveness of blood component analysis as a screening program, which is mandatory for medical radiation workers in some countries. This study details the distribution and trends of changes in blood components, including white blood cells (WBCs), red blood cells (RBCs), and platelets, as well as the cumulative doses received from occupational radiation exposure. The study was conducted among 199 participants and 100 control subjects at the medical imaging departments of the central hospital of Shiraz University of Medical Sciences during the years 2006–2010. Descriptive and analytical statistics were used for data analysis, considering P < 0.05 as statistically significant. The results of this study show that there is no significant difference between the radiation workers and controls regarding WBC and platelet counts over the 4 years. We also found no statistically significant difference between the two groups with respect to RBCs, nor with respect to RBCs by gender, which was analyzed separately because of the lower reference range for normal RBC levels in women compared to men.
Moreover, in a separate evaluation of WBC count against the personnel's working experience and their annual exposure dose, the findings showed no linear correlation among the three variables. Since the hematological findings were within the range of control levels, it can be concluded that the radiation dosage (which was not more than 7.58 mSv in this study) had been too small to stimulate any quantifiable change in medical radiation workers' blood counts. Thus, the use of a more accurate screening method based on the working profiles of the radiation workers and their accumulated doses is suggested. In addition, the complexity of radiation-induced effects and the influence of various factors on blood count alteration should be taken into account.

Keywords: blood cell count, mandatory testing, occupational exposure, radiation

Procedia PDF Downloads 464
37676 Identification of Crimean-Congo Hemorrhagic Fever Virus in Patients Referred to Ahvaz and Gilan Hospitals in Iran by real-time PCR Technique

Authors: Najmeh Jafari, Sona Rostampour Yasouri

Abstract:

Crimean-Congo hemorrhagic fever (CCHF) is an acute hemorrhagic disease. It is a zoonotic disease, common to humans and animals, transmitted through tick bites or through contact with the blood, secretions or carcasses of infected animals and humans. CCHF is more common among people who work with livestock, such as ranchers, butchers, farmers, slaughterhouse workers and healthcare workers, and its prevalence in hospital settings is also very high. Considering that CCHF can also be transmitted through the consumption of foods such as beef and sheep meat, this study aims to rapidly identify and diagnose the Crimean-Congo fever virus in suspected patients using the real-time PCR technique. In the summer of 1402 (Solar Hijri), 20 blood samples were collected separately from Ahvaz and Gilan hospitals. An extraction kit was used to extract the viral RNA. Primers and probes were designed based on the S genomic segment, the conserved region in CCHFV. A real-time PCR assay was then performed with these specific primers and probes, and the assay was repeated several times. Four of the examined samples were determined positive by real-time PCR. The technique offers high sensitivity and specificity and the possibility of rapid detection of CCHFV, making it a good candidate for quick disease diagnosis. With faster diagnosis, treatment can begin sooner, and the best prevention methods can be applied to control the disease and prevent patient deaths.

Keywords: ahvaz, crimean-congo hemorrhagic fever, gilan, real time PCR

Procedia PDF Downloads 78
37675 Genetic Algorithm for In-Theatre Military Logistics Search-and-Delivery Path Planning

Authors: Jean Berger, Mohamed Barkaoui

Abstract:

Discrete search path planning in a time-constrained, uncertain environment relying upon imperfect sensors is known to be hard, and the problem-solving techniques proposed so far to compute near real-time efficient path plans are mainly limited to solutions of only a few moves. A new information-theoretic, open-loop decision model that explicitly incorporates false-alarm sensor readings is presented to solve a single-agent military logistics search-and-delivery path planning problem with anticipated feedback. The decision model consists of minimizing expected entropy over a given time horizon, considering anticipated possible observation outcomes. The model captures the uncertainty associated with observation events for all possible scenarios; entropy represents a measure of uncertainty about the searched target's location. Feedback information resulting from possible sensor observation outcomes along the projected path plan is exploited to update anticipated unit target occupancy beliefs. For the first time, a compact belief update formulation is generalized to explicitly include false-positive observation events that may occur during plan execution. A novel genetic algorithm is then proposed to efficiently solve search path planning, providing near-optimal solutions for practical, realistic problem instances. Given the run-time performance of the algorithm, a natural extension to a closed-loop environment that progressively integrates real visit outcomes on a rolling time horizon can easily be envisioned. Computational results show the value of the approach in comparison to alternate heuristics.
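A minimal sketch of a genetic algorithm for discrete search path planning. The grid size, horizon, prior beliefs, and the coverage-mass fitness (a simple proxy standing in for the paper's expected-entropy objective) are all hypothetical:

```python
import random

random.seed(1)

N_CELLS = 20          # discrete search cells (hypothetical grid)
HORIZON = 6           # planning horizon: moves per candidate plan
POP, GENS = 40, 60    # population size and generations

# Prior belief that the target occupies each cell (hypothetical values).
belief = [random.random() for _ in range(N_CELLS)]
total = sum(belief)
belief = [b / total for b in belief]

def fitness(path):
    # Proxy objective: probability mass covered by the visited cells.
    # (The paper minimizes expected entropy; this simply stands in.)
    return sum(belief[c] for c in set(path))

def crossover(a, b):
    cut = random.randrange(1, HORIZON)   # one-point crossover
    return a[:cut] + b[cut:]

def mutate(path, rate=0.1):
    return [random.randrange(N_CELLS) if random.random() < rate else c
            for c in path]

# Evolve a population of candidate paths.
pop = [[random.randrange(N_CELLS) for _ in range(HORIZON)]
       for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:POP // 2]               # keep the best half
    children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                for _ in range(POP - len(elite))]
    pop = elite + children

best = max(pop, key=fitness)
print(best, round(fitness(best), 3))
```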

Keywords: search path planning, false alarm, search-and-delivery, entropy, genetic algorithm

Procedia PDF Downloads 364
37674 Analysis of Expression Data Using Unsupervised Techniques

Authors: M. A. I Perera, C. R. Wijesinghe, A. R. Weerasinghe

Abstract:

This study was conducted to review and identify the unsupervised techniques that can be employed to analyze gene expression data in order to identify better subtypes of tumors. Identifying cancer subtypes helps improve the efficacy and reduce the toxicity of treatments by providing clues for finding targeted therapeutics. The process of gene expression data analysis is described in three steps: preprocessing, clustering, and cluster validation. Feature selection is important since genomic data are high-dimensional, with a large number of features relative to the number of samples. Hierarchical clustering and K-means are often used in the analysis of gene expression data, and several cluster validation techniques are used to validate the resulting clusters. Heatmaps are an effective external validation method that allows comparing the identified classes with clinical variables and supports visual analysis of the classes.
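The three-step pipeline (preprocessing, clustering, cluster validation) might be sketched as follows on synthetic data; the variance-based gene filter and the silhouette index are illustrative choices, not prescribed by the study:

```python
import numpy as np
from sklearn.cluster import KMeans, AgglomerativeClustering
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(42)

# Synthetic expression matrix: 60 samples x 200 genes, two planted subtypes.
subtype_a = rng.normal(0.0, 1.0, size=(30, 200))
subtype_b = rng.normal(2.0, 1.0, size=(30, 200))
X = np.vstack([subtype_a, subtype_b])

# Step 1 (preprocessing / feature selection): keep the most variable genes,
# since expression data has far more features than samples.
top = np.argsort(X.var(axis=0))[-50:]
X_sel = X[:, top]

# Step 2 (clustering): both methods named in the abstract.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X_sel)
hc = AgglomerativeClustering(n_clusters=2).fit(X_sel)

# Step 3 (cluster validation): an internal index as one validation option.
print("silhouette (k-means):", silhouette_score(X_sel, km.labels_).round(3))
```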

Keywords: cancer subtypes, gene expression data analysis, clustering, cluster validation

Procedia PDF Downloads 152
37673 Governance Commitment and Time Differences in Aspects of Sustainability Reporting in Nigerian Banks

Authors: Nwobu Obiamaka, Owolabi Akintola

Abstract:

This study examined the extent of statistically significant differences among the economic, environmental, governance and social aspects of sustainability reporting as a result of having a board committee on sustainability and of the time (year) of reporting for business organizations in the Nigerian banking sector. The years of reporting under consideration were 2010, 2011, 2012 and 2013. A content analysis methodology was employed, using a reporting index to score the number of economic, environmental, governance and social indicators of sustainability reporting. The results indicated that business organizations with a board committee on sustainability reported more indicators of sustainability than those without one. Also, sustainability reporting in 2013 was higher than in the prior years (2012, 2011 and 2010) for the economic, environmental and social indicators, while the governance indicators were highest in 2012 compared with the other years (2013, 2011 and 2010) under consideration. The implication of this finding is that business organizations with board committees on sustainability are monitored by those committees to report more to their stakeholders. At the same time, business organizations are appreciating the need to engage in sustainability reporting more with each passing year. This could be due to the Central Bank of Nigeria (CBN) sustainability reporting framework that business organizations in the banking sector have to adhere to. When sustainability issues are monitored by the board of directors, business organizations are likely to increase and improve their sustainability reporting.

Keywords: governance, organizations, reporting, sustainability

Procedia PDF Downloads 324
37672 Estimating Lost Digital Video Frames Using Unidirectional and Bidirectional Estimation Based on Autoregressive Time Model

Authors: Navid Daryasafar, Nima Farshidfar

Abstract:

In this article, we attempt to conceal errors in video, with an emphasis on the temporal use of autoregressive (AR) models. We assume that all information in one or more video frames is lost; the lost frames are then estimated using the temporal information of corresponding pixels in neighboring frames. Accordingly, after presenting autoregressive models and how they are applied to estimate lost frames, two general methods of using these models are presented. The first, the standard autoregressive approach, estimates the lost frame unidirectionally; in this case, information from previous frames is used to estimate the lost frame. In the second method, information from both the previous and the next frames is used, so this method is known as bidirectional estimation. A series of tests is then carried out to assess the performance of each method in different modes, and the results are compared.
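A toy sketch of the two estimation modes, assuming a first-order AR model fitted per pixel on synthetic frames (the article's model order and combination rule may well differ):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic video: 10 frames of 4x4 pixels with drifting brightness + noise.
T, H, W = 10, 4, 4
base = rng.uniform(50, 200, size=(H, W))
video = np.stack([base + 3.0 * t + rng.normal(0, 1, (H, W))
                  for t in range(T)])

lost = 5  # assume frame 5 is lost

def ar_fit_predict(series):
    # Fit a first-order AR model x_t = a*x_{t-1} + b by least squares
    # and predict one step ahead from the last observed value.
    x_prev, x_next = series[:-1], series[1:]
    a, b = np.polyfit(x_prev, x_next, 1)
    return a * series[-1] + b

uni = np.empty((H, W))
bi = np.empty((H, W))
for i in range(H):
    for j in range(W):
        past = video[:lost, i, j]              # frames before the loss
        future = video[lost + 1:, i, j][::-1]  # frames after, reversed
        fwd = ar_fit_predict(past)             # forward prediction
        bwd = ar_fit_predict(future)           # backward prediction
        uni[i, j] = fwd                        # unidirectional estimate
        bi[i, j] = 0.5 * (fwd + bwd)           # bidirectional estimate

true = video[lost]
print("MSE unidirectional:", float(np.mean((uni - true) ** 2)))
print("MSE bidirectional: ", float(np.mean((bi - true) ** 2)))
```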

Keywords: error steganography, unidirectional estimation, bidirectional estimation, AR linear estimation

Procedia PDF Downloads 544
37671 Effect of Enterprise Digital Transformation on Enterprise Growth: Theoretical Logic and Chinese Experience

Authors: Bin Li

Abstract:

In the era of the digital economy, digital transformation has gradually become a strategic choice for enterprise development, but systematic research from the perspective of enterprise growth is relatively lacking. Based on a sample of Chinese A-share listed companies from 2011 to 2021, this paper constructs a digital transformation index system and an enterprise growth composite index to empirically test the impact of enterprise digital transformation on enterprise growth and its mechanism. The results show that digital transformation can significantly promote corporate growth. The mechanism analysis finds that reducing operating costs, optimizing the human capital structure, promoting R&D output and improving digital innovation capability play important intermediary roles in the process by which digital transformation promotes corporate growth. At the same time, the level of external digital infrastructure and the strength of organizational resilience play positive moderating roles in this process. In addition, beyond analyzing the heterogeneity of enterprises, this paper deepens the analysis of the driving factors of digital transformation and its supporting digital technologies, as well as the three dimensions of enterprise growth, thereby extending the depth of research on enterprise digital transformation.

Keywords: digital transformation, enterprise growth, digital technology, digital infrastructure, organization resilience, digital innovation

Procedia PDF Downloads 65
37670 Comparison of Two Transcranial Magnetic Stimulation Protocols on Spasticity in Multiple Sclerosis - Pilot Study of a Randomized and Blind Cross-over Clinical Trial

Authors: Amanda Cristina da Silva Reis, Bruno Paulino Venâncio, Cristina Theada Ferreira, Andrea Fialho do Prado, Lucimara Guedes dos Santos, Aline de Souza Gravatá, Larissa Lima Gonçalves, Isabella Aparecida Ferreira Moretto, João Carlos Ferrari Corrêa, Fernanda Ishida Corrêa

Abstract:

Objective: To compare two protocols of Transcranial Magnetic Stimulation (TMS) on quadriceps muscle spasticity in individuals diagnosed with Multiple Sclerosis (MS). Method: In this clinical crossover study, six adult individuals diagnosed with MS and with spasticity in the lower limbs were randomized to receive one session each of high-frequency (≥5 Hz) and low-frequency (≤1 Hz) TMS over the motor cortex (M1) hotspot for the quadriceps muscle, with a one-week interval between sessions. Spasticity was assessed with the Ashworth scale, and the latency time (ms) of the motor evoked potential (MEP) and the central motor conduction time (CMCT) of the bilateral quadriceps muscle were analyzed. Assessments were performed before and after each intervention. The difference between groups was analyzed using the Friedman test, with a significance level of 0.05. Results: All statistical analyses were performed using SPSS Statistics version 26, with significance set at p<0.05 and normality checked with the Shapiro-Wilk test. Parametric data were represented as mean and standard deviation, non-parametric variables as median and interquartile range, and categorical variables as frequency and percentage. There was no clinical change in quadriceps spasticity assessed using the Ashworth scale for either the 1 Hz (p=0.813) or the 5 Hz (p=0.232) protocol in either limb. For MEP latency time, in the 5 Hz protocol there was no significant change on the side contralateral to the stimulus from pre- to post-treatment (p>0.05), while on the ipsilateral side latency decreased by 0.07 seconds (p<0.05); in the 1 Hz protocol, latency increased by 0.04 seconds (p<0.05) on the contralateral side and decreased by 0.04 seconds (p<0.05) on the ipsilateral side, with significant differences between the contralateral (p=0.007) and ipsilateral (p=0.014) groups.
For CMCT, in the 1 Hz protocol there was no change on either the contralateral or the ipsilateral side (p>0.05). In the 5 Hz protocol there was a small decrease in conduction time on the contralateral side (p<0.05) and a decrease of 0.6 seconds on the ipsilateral side (p<0.05), with a significant difference between groups (p=0.019). Conclusion: A single high- or low-frequency session does not change spasticity, but with the low-frequency protocol latency time increased on the stimulated side and decreased on the non-stimulated side, suggesting that inhibiting the motor cortex increases cortical excitability on the opposite side.
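The repeated-measures comparison used here (Friedman test, p<0.05) can be sketched with SciPy; the latency values below are synthetic stand-ins, not the trial's data:

```python
import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(7)

# Synthetic MEP latency times (ms) for 6 participants under three
# repeated conditions (baseline, post-1 Hz, post-5 Hz) -- illustrative only.
baseline = rng.normal(25.0, 1.0, size=6)
post_1hz = baseline + rng.normal(0.4, 0.2, size=6)   # slight increase
post_5hz = baseline - rng.normal(0.5, 0.2, size=6)   # slight decrease

# Friedman test for repeated measures, as named in the abstract.
stat, p = friedmanchisquare(baseline, post_1hz, post_5hz)
print(f"chi2 = {stat:.2f}, p = {p:.4f}, significant: {p < 0.05}")
```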

Keywords: multiple sclerosis, spasticity, motor evoked potential, transcranial magnetic stimulation

Procedia PDF Downloads 94
37669 Characterization of Urban Ozone Pollution in Summer and Analysis of Influencing Factors

Authors: Gao Fangting

Abstract:

Ozone acts as an atmospheric shield, protecting organisms from ultraviolet radiation by absorbing it. A large body of international environmental epidemiology has now confirmed that both short- and long-term exposure to ozone has significant effects on population health. Near-surface ozone, a secondary pollutant in the atmosphere, not only negatively affects the production and activities of living organisms but also damages ecosystems and affects climate change to some extent. In this paper, using hourly ozone observations from ground meteorological stations in four cities (Beijing, Kunming, Xining and Guangzhou) from 2015 to 2017, the number of exceedance days and the long-term variation characteristics of ozone are analyzed using time series analysis methods. On this basis, the effects of changes in meteorological conditions on ozone concentration are discussed in conjunction with meteorological data from the same period, the similarities and differences of near-surface ozone across the cities are comparatively analyzed, and a quantitative model of near-surface ozone is established. This study found that ozone concentrations were highest during the summer months, that ozone concentrations were strongly correlated with meteorological conditions, and that none of the four cities had ozone concentrations reaching the threshold for causing disease.
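A sketch of the exceedance-day and seasonality computation on a synthetic hourly series; the MDA8 (maximum daily 8-hour average) metric and the 160 µg/m³ threshold are common conventions assumed here, and all values are illustrative:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Synthetic hourly ozone series (ug/m3) for one year at one station:
# a summer-peaking seasonal cycle plus a diurnal cycle and noise.
idx = pd.date_range("2016-01-01", "2016-12-31 23:00", freq="h")
season = 60 + 40 * np.sin(2 * np.pi * (idx.dayofyear - 80) / 365)
diurnal = 20 * np.sin(2 * np.pi * (idx.hour - 6) / 24)
ozone = pd.Series(season + diurnal + rng.normal(0, 10, len(idx)), index=idx)

# Daily metric: maximum 8-hour rolling mean (MDA8);
# count an exceedance day when MDA8 > 160 ug/m3.
mda8 = ozone.rolling(8).mean().resample("D").max()
exceed_days = int((mda8 > 160).sum())

# Seasonal pattern: monthly mean of the daily maxima.
monthly = mda8.groupby(mda8.index.month).mean()
print("exceedance days:", exceed_days)
print("peak month:", int(monthly.idxmax()))
```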

Keywords: ozone, meteorological conditions, pollution, health

Procedia PDF Downloads 34
37668 In-situ Acoustic Emission Analysis of a Polymer Electrolyte Membrane Water Electrolyser

Authors: M. Maier, I. Dedigama, J. Majasan, Y. Wu, Q. Meyer, L. Castanheira, G. Hinds, P. R. Shearing, D. J. L. Brett

Abstract:

Increasing the efficiency of electrolyser technology is commonly seen as one of the main challenges on the way to the hydrogen economy. There is a significant lack of understanding of the different states of operation of polymer electrolyte membrane water electrolysers (PEMWE) and how these influence overall efficiency, in particular the two-phase flow through the membrane, gas diffusion layers (GDL) and flow channels. In order to increase the efficiency of PEMWE and facilitate their spread as a commercial hydrogen production technology, new analytical approaches have to be found. Acoustic emission (AE) offers the possibility of analysing the processes within a PEMWE in a non-destructive, fast and cheap in-situ way. This work describes the generation and analysis of AE data from a PEM water electrolyser for, to the best of our knowledge, the first time in the literature. Several experiments are carried out, each designed so that only specific physical processes occur and the AE related solely to one process can be measured; a range of experimental conditions is used to induce different flow regimes within the flow channels and GDL. The resulting AE data are first separated into events, defined as exceedances of the noise threshold: each acoustic event consists of a number of consecutive peaks and ends when the wave diminishes below the noise threshold. For each acoustic event the following key attributes are extracted: maximum peak amplitude, duration, number of peaks, number of peaks before the maximum, average peak intensity, and time until the maximum is reached. Each event is then expressed as a vector containing the normalized values of all criteria. Principal Component Analysis is performed on the resulting data, which ranks the criteria by the eigenvalues of their covariance matrix; this provides an easy way of determining which criteria convey the most information about the acoustic data.
The data are then ordered in the two- or three-dimensional space formed by the most relevant criteria axes. By finding regions of this space occupied only by acoustic events originating from one of the three experiments, it is possible to relate physical processes to certain acoustic patterns. Due to the complex nature of the AE data, modern machine learning techniques are needed to recognize these patterns in-situ. The AE data produced in this way can be used to train a self-learning algorithm and to develop an analytical tool for diagnosing different operational states in a PEMWE. Combining this technique with the measurement of polarization curves and electrochemical impedance spectroscopy allows for in-situ optimization and recognition of suboptimal states of operation.
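The event feature extraction and PCA ranking described above might look like the following sketch, with purely synthetic event signals standing in for the electrolyser recordings:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)

def event_features(signal, noise_threshold=0.1):
    # Extract the abstract's key attributes from one acoustic event:
    # max amplitude, duration, peak count, peaks before the maximum,
    # mean peak intensity, and time to the maximum.
    peaks = np.where(np.abs(signal) > noise_threshold)[0]
    t_max = int(np.argmax(np.abs(signal)))
    return [np.abs(signal).max(), len(signal), len(peaks),
            int((peaks < t_max).sum()), float(np.abs(signal[peaks]).mean()),
            t_max]

# Synthetic events of varying length, standing in for real AE recordings.
events = [rng.normal(0, 0.3, size=rng.integers(50, 200)) for _ in range(40)]
X = np.array([event_features(e) for e in events], dtype=float)

# Normalize each criterion, then rank criteria by explained variance.
X = (X - X.mean(axis=0)) / X.std(axis=0)
pca = PCA(n_components=3).fit(X)
print("explained variance ratio:", pca.explained_variance_ratio_.round(3))
```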

Keywords: acoustic emission, gas diffusion layers, in-situ diagnosis, PEM water electrolyser

Procedia PDF Downloads 160
37667 A Comparative Study of the Effects of Vibratory Stress Relief and Thermal Aging on the Residual Stress of Explosives Materials

Authors: Xuemei Yang, Xin Sun, Cheng Fu, Qiong Lan, Chao Han

Abstract:

Residual stresses, which can be produced during the manufacturing of plastic bonded explosives (PBX), play an important role in weapon system security and reliability, and they can and do change in service. This paper studies the influence of vibratory stress relief (VSR) and thermal aging on the residual stress of explosives. First, the residual stress relaxation of PBX under different VSR conditions, such as vibration time, amplitude and dynamic strain, was studied by the drill-hole technique. The results indicated that vibratory amplitude, time and dynamic strain have a significant influence on residual stress relief in PBX. The rate of residual stress relief first increases and then decreases with increasing dynamic strain, amplitude and time: at first the activation energy is too small to cause plastic deformation in the PBX, and once the dynamic strain, time and amplitude exceed a certain threshold, the residual stress decreases sharply; this sharp drop in the stress-relief rate may be caused by over-vibration. A comparison between VSR and thermal aging was also made. The reduction in residual stress after VSR with suitable vibratory parameters was equivalent to 73% of that achieved by 7 days of thermal aging. In addition, three months after the VSR process, the density attenuation rate, mechanical properties and dimensional stability were almost the same as after thermal aging. However, compared with traditional thermal aging, VSR takes only a very short time, which greatly improves the efficiency of aging treatment for explosive materials. Therefore, VSR could be a potential alternative technique in the industrial relaxation of residual stress in PBX explosives.

Keywords: explosives, residual stresses, thermal aging, vibratory stress relief, VSR

Procedia PDF Downloads 164
37666 Combined Use of Microbial Consortia for the Enhanced Degradation of Type-IIx Pyrethroids

Authors: Parminder Kaur, Chandrajit B. Majumder

Abstract:

The unrestrained use of pesticides to meet the burgeoning demand for enhanced crop productivity has led to serious contamination of both terrestrial and aquatic ecosystems. Remediating pesticide mixtures is challenging, since agricultural lands treated with various compounds inadvertently release such mixtures. Global concern about the excessive use of pesticides has driven the need to develop more effective and safer alternatives for their remediation. We focused our work on the microbial degradation of a mixture of three Type II pyrethroids, namely cypermethrin, cyhalothrin and deltamethrin, commonly applied for both agricultural and domestic purposes. The fungal strains (Fusarium strain 8-11P and Fusarium sp. zzz1124) had previously been isolated from agricultural soils, and their ability to biotransform this mixture was studied. In brief, the experiment was conducted in two growth systems (added-carbon and carbon-free) enriched with pyrethroid concentrations between 100 and 300 mgL⁻¹. Parameter optimization (pH, temperature, concentration and time) was done using a central composite design matrix of Response Surface Methodology (RSM). At concentrations below 200 mgL⁻¹, complete removal was observed; at 250 and 300 mgL⁻¹, degradation of 95.6%/97.4% and 92.27%/95.65% (carbon-free/added-carbon), respectively, was observed. The consortium was shown to degrade the pyrethroid mixture (300 mgL⁻¹) within 120 h. After a 5-day incubation, the residual pyrethroid concentrations in unsterilized soil were much lower than in sterilized soil, indicating that microbial degradation predominates in pyrethroid elimination, with a half-life (t₁/₂) of 1.6 d and R² ranging from 0.992 to 0.999. Overall, these results show that microbial consortia may be more efficient than single degrader strains.
The findings complement our current understanding of the bioremediation of Type II pyrethroid mixtures with microbial consortia and reinforce the case for considering bioremediation as an effective alternative for the remediation of such pollutants.
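The reported first-order kinetics (t₁/₂ = 1.6 d, high R²) can be reproduced with a log-linear least-squares fit; the concentration values below are synthetic, generated to be consistent with that half-life, not the study's measurements:

```python
import numpy as np

# First-order decay: C(t) = C0 * exp(-k t), with t_half = ln(2) / k.
# Synthetic residual-concentration data consistent with t_half ~ 1.6 d.
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])          # days
k_true = np.log(2) / 1.6
noise = 1 + np.random.default_rng(0).normal(0, 0.01, t.size)
C = 300.0 * np.exp(-k_true * t) * noise                # mg/L

# Linearize: ln C = ln C0 - k t, then fit by least squares.
slope, intercept = np.polyfit(t, np.log(C), 1)
k = -slope
t_half = np.log(2) / k
r2 = np.corrcoef(t, np.log(C))[0, 1] ** 2
print(f"k = {k:.3f} 1/d, t1/2 = {t_half:.2f} d, R^2 = {r2:.3f}")
```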

Keywords: bioremediation, fungi, pyrethroids, soil

Procedia PDF Downloads 154
37665 Using Mixed Methods in Studying Classroom Social Network Dynamics

Authors: Nashrawan Naser Taha, Andrew M. Cox

Abstract:

In a multi-cultural learning context, where ties are weak and dynamic, combining qualitative with quantitative research methods may be more effective. Such a combination may also allow us to answer different types of questions, for example about people's perceptions of the network. In this study, the use of observation, interviews and photographs was explored as a way of enhancing data from social network questionnaires. Integrating all of these methods was found to enhance the quality and accuracy of the data collected, as well as to provide a richer story of the network dynamics and the factors that shaped these changes over time.

Keywords: mixed methods, social network analysis, multi-cultural learning, social network dynamics

Procedia PDF Downloads 516
37664 Development of PPy-M Composites Materials for Sensor Application

Authors: Yatimah Alias, Tilagam Marimuthu, M. R. Mahmoudian, Sharifah Mohamad

Abstract:

The rapid growth of science and technology in the energy and environmental fields has highlighted the substantial importance of conducting polymer and metal composite materials engineered at the nanoscale. In this study, polypyrrole-cobalt composites (PPy-Co Cs) and polypyrrole-nickel oxide composites (PPy-NiO Cs) were prepared by a simple and facile chemical polymerization method from an aqueous solution of pyrrole monomer in the presence of a metal salt. These composites were then fabricated into non-enzymatic hydrogen peroxide (H2O2) and glucose sensors. The morphology and composition of the composites were characterized by Field Emission Scanning Electron Microscopy, Fourier Transform Infrared Spectroscopy and X-ray Powder Diffraction, and the results were compared with pure PPy and metal oxide particles. The structural and morphological properties of the synthesized composites differ from those of pure PPy and metal oxide particles, which is attributed to the strong interaction between the PPy and the metal particles. Moreover, a favorable micro-environment for the electrochemical oxidation of H2O2 and glucose was achieved on glassy carbon electrodes (GCE) coated with PPy-Co Cs and PPy-NiO Cs, respectively, resulting in an enhanced amperometric response. Both PPy-Co/GCE and PPy-NiO/GCE gave a high response toward the target analyte at the optimum condition of 500 μl pyrrole monomer content, and the presence of the pyrrole monomer greatly increased the sensitivity of the respective modified electrodes. The PPy-Co/GCE could detect H2O2 in a linear range of 20 μM to 80 mM with two linear segments (low and high H2O2 concentration), with detection limits of 2.05 μM and 19.64 μM, respectively. PPy-NiO/GCE exhibited good electrocatalytic behavior toward glucose oxidation in alkaline medium and could detect glucose in linear ranges of 0.01 mM to 0.50 mM and 1 mM to 20 mM, with detection limits of 0.33 and 5.77 μM, respectively. The ease of modification and the long-term stability of these sensors make them superior to enzymatic sensors, which must be kept in a carefully controlled environment.
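A sketch of how a sensitivity and detection limit for such a calibration might be computed; the slope, blank noise, and the 3.3σ/slope convention are assumptions for illustration, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic amperometric calibration: current vs. glucose concentration
# over one linear range (all values illustrative, not the paper's data).
conc = np.linspace(0.01, 0.50, 10)                           # mM
current = 4.0 * conc + 0.2 + rng.normal(0, 0.01, conc.size)  # uA

# Sensitivity = slope of the linear calibration fit.
slope, intercept = np.polyfit(conc, current, 1)

# A common detection-limit estimate: LOD = 3.3 * sigma_blank / slope,
# with sigma_blank taken from repeated blank measurements.
blank = rng.normal(0.2, 0.01, 20)
lod_mM = 3.3 * blank.std(ddof=1) / slope
print(f"sensitivity = {slope:.2f} uA/mM, LOD = {lod_mM * 1000:.1f} uM")
```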

Keywords: metal oxide, composite, non-enzymatic sensor, polypyrrole

Procedia PDF Downloads 268
37663 Assessment of DNA Degradation Using Comet Assay: A Versatile Technique for Forensic Application

Authors: Ritesh K. Shukla

Abstract:

Degradation of the macromolecules (DNA, RNA, and protein) in biological samples is a major challenge in forensic investigation, as it can mislead result interpretation. Currently, there are no precise methods available to circumvent this problem, so screening methods are urgently needed at the preliminary level. The Comet assay is one of the most versatile, rapid and sensitive molecular biology techniques for assessing DNA degradation. It can assess DNA degradation even in very small amounts of sample and, conveniently, requires no separate DNA extraction or isolation step. Samples are embedded directly on an agarose-precoated microscope slide, electrophoresis is performed on the same slide after the lysis step, and the slide is then stained with a DNA-binding dye and observed under a fluorescence microscope equipped with Komet software. With this technique, the extent of DNA degradation can be assessed, allowing samples to be screened before DNA fingerprinting to determine whether they are appropriate for DNA analysis. The technique could also help with many other challenges in forensic investigation, such as estimating the time since deposition of biological fluids, repairing genetic material from degraded biological samples, and early estimation of time since death. This study attempts to explore the application of this well-known molecular biology technique in the field of forensic science, opening avenues for forensic research and development.

Keywords: comet assay, DNA degradation, forensic, molecular biology

Procedia PDF Downloads 158
37662 Reed: An Approach Towards Quickly Bootstrapping Multilingual Acoustic Models

Authors: Bipasha Sen, Aditya Agarwal

Abstract:

A multilingual automatic speech recognition (ASR) system is a single entity capable of transcribing multiple languages that share a common phone space. The performance of such a system is highly dependent on the compatibility of the languages. State-of-the-art speech recognition systems are built using sequential architectures based on recurrent neural networks (RNNs), which limit computational parallelization in training. This poses a significant challenge in terms of the time taken to bootstrap and validate the compatibility of multiple languages for building a robust multilingual system. Complex architectural choices based on self-attention networks are made to improve parallelization, thereby reducing training time. In this work, we propose Reed, a simple system based on 1D convolutions that uses very short context to improve training time. To improve performance, we use raw time-domain speech signals directly as input, which enables the convolutional layers to learn feature representations rather than relying on handcrafted features such as MFCCs. We report improvements in training and inference times of at least 4x and 7.4x, respectively, with comparable WERs against standard RNN-based baseline systems on SpeechOcean's multilingual low-resource dataset.
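A minimal NumPy sketch of the core idea (a strided, short-context 1D convolution applied directly to the raw waveform); the filter count, kernel length, and stride here are illustrative choices, not Reed's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Raw time-domain input: 1 second of 16 kHz audio (synthetic here),
# fed directly to 1D convolutions instead of handcrafted MFCCs.
waveform = rng.normal(0, 1, 16000).astype(np.float32)

def conv1d(x, kernels, stride):
    # Valid-mode strided 1D convolution: each row of `kernels` is one
    # filter; short kernels keep the temporal context window small,
    # and every window can be processed in parallel (unlike an RNN).
    k = kernels.shape[1]
    starts = range(0, len(x) - k + 1, stride)
    windows = np.stack([x[s:s + k] for s in starts])      # (T', k)
    return np.maximum(windows @ kernels.T, 0.0)           # ReLU, (T', filters)

# A first layer with 40 filters, 25-sample kernels, stride 10.
kernels = rng.normal(0, 0.1, size=(40, 25)).astype(np.float32)
features = conv1d(waveform, kernels, stride=10)
print("feature map shape:", features.shape)
```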

Keywords: convolutional neural networks, language compatibility, low resource languages, multilingual automatic speech recognition

Procedia PDF Downloads 127