Search results for: long lead time
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24485

24095 Improving the Efficiency of Repacking Process with Lean Technique: The Study of Read With Me Group Company Limited

Authors: Jirayut Phetchuen, Jongkol Srithorn

Abstract:

The study examines the unloading and repacking process of Read With Me Group Company Limited. The research aims to improve the old work process and build a new, efficient process with the lean technique and new machines, achieving faster delivery without increasing the number of employees. Currently, two employees work on alternating five-day shifts. However, workplace injuries have delayed the delivery time, especially deliveries to neighboring countries. After the process improvement, the working space increased by 25%, the process lead time decreased by 40%, the work efficiency increased by 175.82%, and the work injury rate was reduced to zero.

Keywords: lean technique, plant layout design, U-shaped disassembly line, value stream mapping

Procedia PDF Downloads 79
24094 A Novel Rapid Well Control Technique Modelled in Computational Fluid Dynamics Software

Authors: Michael Williams

Abstract:

The ability to control a flowing well is of the utmost importance. During the kill phase, heavy-weight kill mud is circulated around the well. While this increases bottom-hole pressure, it also increases damage to the near-wellbore formation. The addition of high-density spherical objects has the potential to minimise this near-wellbore damage, increase bottom-hole pressure, and reduce the operational time needed to kill the well. The time saving comes from the rapid deployment of high-density spherical objects instead of building up a high-density drilling fluid. The research aims to model the well kill process using computational fluid dynamics (CFD) software. A model has been created as a proof of concept to analyse the flow of micron-sized spherical objects in the drilling fluid. Initial results show that this new methodology of spherical objects in drilling fluid agrees with the traditional streamlines seen in particle-free flow. Additional models demonstrate that areas of higher flow rate around the bit can increase the probability of washing out formations but do not affect the flow of the micron-sized spherical objects. Interestingly, areas with dimensional changes, such as tool joints and various BHA components, do not appear at this initial stage to experience increased velocity or to create areas of turbulent flow, which may contribute to improved borehole stability. In conclusion, the initial models of this novel well control methodology have not demonstrated any adverse flow patterns, suggesting that the method may be viable under field conditions.

Keywords: well control, fluid mechanics, safety, environment

Procedia PDF Downloads 157
24093 A Non-Linear Damage Model for the Annulus of the Intervertebral Disc under Cyclic Loading, Including Recovery

Authors: Shruti Motiwale, Xianlin Zhou, Reuben H. Kraft

Abstract:

Military and sports personnel are often required to wear heavy helmets for extended periods of time. This leads to excessive cyclic loads on the neck and an increased chance of injury. Computational models offer one approach to understanding and predicting the time progression of disc degeneration under severe cyclic loading. In this paper, we have applied an analytic non-linear damage evolution model to estimate damage evolution in an intervertebral disc due to cyclic loads over decade-long time periods. We have also proposed a novel strategy for the inclusion of recovery in the damage model. Our results show that damage grows by only 20% during the initial 75% of the life and then grows exponentially in the remaining 25%. The analysis also shows that it is crucial to include recovery in a damage model.
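The qualitative behavior described above, slow initial accumulation followed by exponential growth, moderated by recovery during rest, can be sketched with a toy cycle-by-cycle damage law. The growth and recovery parameters below are purely illustrative assumptions, not the authors' fitted model:

```python
import math

def simulate_damage(n_cycles, growth=6e-3, recovery=0.0, rest_every=500,
                    d_init=1e-3):
    """Toy cycle-by-cycle damage law: self-accelerating growth per load
    cycle plus optional partial recovery during scheduled rest periods.
    Parameter values are illustrative, not fitted to disc tissue."""
    d = d_init
    history = []
    for n in range(1, n_cycles + 1):
        d += growth * d                        # growth proportional to damage
        if recovery > 0.0 and n % rest_every == 0:
            d *= math.exp(-recovery)           # rest partially heals damage
        history.append(min(d, 1.0))
    return history

loaded = simulate_damage(1000)                 # continuous loading
rested = simulate_damage(1000, recovery=0.5)   # recovery every 500 cycles
# with purely exponential growth, only ~22% of the final damage has
# accumulated by 75% of life, echoing the slow-then-fast pattern reported
```

With these assumed parameters the recovery term visibly lowers the accumulated damage, which is the qualitative point the abstract makes about including recovery.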

Keywords: cervical spine, computational biomechanics, damage evolution, intervertebral disc, continuum damage mechanics

Procedia PDF Downloads 553
24092 The Effect of Macroeconomic Policies on Cambodia's Economy: ARDL and VECM Model

Authors: Siphat Lim

Abstract:

This study used the Autoregressive Distributed Lag (ARDL) approach to cointegration. In the long run, the general price level and the exchange rate have a significantly positive effect on domestic output. The estimates further revealed that fiscal stimulus helps stimulate domestic output in the long run but not in the short run, while monetary expansion helps stimulate output in both the short run and the long run. This result complies with the theory that macroeconomic policies, both fiscal and monetary, help stimulate domestic output in the long run. The estimates of the Vector Error Correction Model (VECM) indicated even more clearly that the consumer price index has a positive and highly statistically significant effect on output: an increase in the general price level would increase competitiveness among producers and, in turn, output. The exchange rate also has a positive and highly significant effect on gross domestic product; exchange rate depreciation might increase exports, since the purchasing power of foreigners increases. More importantly, fiscal stimulus would help stimulate domestic output in the long run, since the coefficient of government expenditure is positive. In addition, monetary expansion would also help stimulate output, and this result is highly significant. Thus, fiscal and monetary expansion would help stimulate domestic output in Cambodia in the long run.
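The error-correction logic behind such ARDL/VECM results can be illustrated with a two-step Engle-Granger-style sketch on synthetic data; the series, coefficients, and noise level are hypothetical, not the Cambodian data. A long-run regression yields the error-correction term, and a negative adjustment coefficient in the short-run equation signals reversion to the long-run equilibrium:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = np.cumsum(rng.normal(size=n))                   # I(1) regressor (e.g. money supply)
y = 2.0 + 0.5 * x + rng.normal(scale=0.3, size=n)   # cointegrated "output"

# Step 1: long-run (cointegrating) regression y_t = a + b*x_t
A = np.column_stack([np.ones(n), x])
a, b = np.linalg.lstsq(A, y, rcond=None)[0]
ect = y - (a + b * x)                               # error-correction term (residual)

# Step 2: short-run dynamics: dy_t = c + g*ect_{t-1} + d*dx_t + u_t
dy, dx = np.diff(y), np.diff(x)
Z = np.column_stack([np.ones(n - 1), ect[:-1], dx])
c, g, d = np.linalg.lstsq(Z, dy, rcond=None)[0]
# g < 0 indicates error correction: deviations from the long-run
# equilibrium are pulled back in subsequent periods
```

In this synthetic setup the long-run coefficient is recovered near its true value of 0.5 and the adjustment coefficient g comes out strongly negative, which is the property the VECM interpretation relies on.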

Keywords: fiscal policy, monetary policy, ARDL, VECM

Procedia PDF Downloads 404
24091 Design of Semi-Automatic Vent and Flash Remover

Authors: Inba Blesso P., Senthil Kumar P.

Abstract:

The main consideration in any tire manufacturing process is wear resistance. One of the factors that cause tire wear is improper removal of vents and flash from the tire surface. The contact point between the tire surface and a vent is highly prone to wear. When the vehicle runs at higher speed with a heavy load, the tire vents and flash wear first and take some of the surrounding tire surface material with them. Hence, provision must be made for efficient removal of vents and flash, thereby reducing tire wear. Manual trimming of tire vents is time-consuming and produces inaccurate output, which reduces the production rate and profit. Thus, the development of an automated system can help attain minimum time consumption and provide a possible way to achieve profitable production. A semi-automated system that employs pneumatic actuators and sequencing circuits is the focus of this study. By implementing it, accurate results can be achieved with reduced time and profitable output.

Keywords: tire manufacturing, pneumatic system, vent and flash removal, engineering and technology

Procedia PDF Downloads 357
24090 High Photosensitivity and Broad Spectral Response of Multi-Layered Germanium Sulfide Transistors

Authors: Rajesh Kumar Ulaganathan, Yi-Ying Lu, Chia-Jung Kuo, Srinivasa Reddy Tamalampudi, Raman Sankar, Fang Cheng Chou, Yit-Tsong Chen

Abstract:

In this paper, we report the optoelectronic properties of multi-layered GeS nanosheet (~28 nm thick)-based field-effect transistors (called GeS-FETs). The multi-layered GeS-FETs exhibit remarkably high photoresponsivity of Rλ ~ 206 AW⁻¹ under illumination of 1.5 µW/cm² at λ = 633 nm, Vg = 0 V, and Vds = 10 V. The obtained Rλ ~ 206 AW⁻¹ compares favorably with GeS nanoribbon-based photodetectors and with the other family members of group IV-VI-based photodetectors in the two-dimensional (2D) realm, such as GeSe and SnS₂. The gate-dependent photoresponsivity of the GeS-FETs was further measured, reaching Rλ ~ 655 AW⁻¹ when operated at Vg = -80 V. Moreover, the multi-layered GeS photodetector holds high external quantum efficiency (EQE ~ 4.0 × 10⁴ %) and specific detectivity (D* ~ 2.35 × 10¹³ Jones). The measured D* is comparable to those of the advanced commercial Si- and InGaAs-based photodiodes. The GeS photodetector also shows excellent long-term photoswitching stability, with a response time of ~7 ms over a long period of operation (>1 h). These extraordinary properties of high photocurrent generation, broad spectral range, fast response, and long-term stability make the GeS-FET photodetector a highly qualified candidate for future optoelectronic applications.
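The reported figures are internally consistent: external quantum efficiency follows from responsivity as EQE = Rλ·hc/(eλ). A quick check with physical constants reproduces the quoted ~4.0 × 10⁴ %:

```python
H_PLANCK = 6.62607015e-34   # Planck constant, J*s
C_LIGHT = 2.99792458e8      # speed of light, m/s
E_CHARGE = 1.602176634e-19  # elementary charge, C

def eqe_percent(responsivity, wavelength):
    """External quantum efficiency (%) from responsivity (A/W) at a
    given wavelength (m): EQE = R*h*c/(e*lambda) * 100."""
    return responsivity * H_PLANCK * C_LIGHT / (E_CHARGE * wavelength) * 100.0

eqe_zero_gate = eqe_percent(206.0, 633e-9)   # ~4.0e4 %, matching the abstract
eqe_gated = eqe_percent(655.0, 633e-9)       # at the Vg = -80 V responsivity
```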

Keywords: germanium sulfide, photodetector, photoresponsivity, external quantum efficiency, specific detectivity

Procedia PDF Downloads 516
24089 Foodborne Outbreak Calendar: Application of Time Series Analysis

Authors: Ryan B. Simpson, Margaret A. Waskow, Aishwarya Venkat, Elena N. Naumova

Abstract:

The Centers for Disease Control and Prevention (CDC) estimate that 31 known foodborne pathogens cause 9.4 million cases of these illnesses annually in the US. Over 90% of these illnesses are associated with exposure to Campylobacter, Cryptosporidium, Cyclospora, Listeria, Salmonella, Shigella, Shiga-Toxin Producing E.Coli (STEC), Vibrio, and Yersinia. Contaminated products carry pathogens that typically cause an intestinal illness manifested by diarrhea, stomach cramping, nausea, weight loss, and fatigue, and that may result in death in fragile populations. Since 1998, the National Outbreak Reporting System (NORS) has allowed for routine collection of suspected and laboratory-confirmed cases of food poisoning. While retrospective analyses have revealed common pathogen-specific seasonal patterns, little is known concerning the stability of those patterns over time and whether they can be used for preventative forecasting. The objective of this study is to construct a calendar of foodborne outbreaks of nine infections based on the peak timing of outbreak incidence in the US from 1996 to 2017. Reported cases were abstracted from FoodNet for Salmonella (135115), Campylobacter (121099), Shigella (48520), Cryptosporidium (21701), STEC (18022), Yersinia (3602), Vibrio (3000), Listeria (2543), and Cyclospora (758). Monthly counts were compiled for each agent, seasonal peak timing and peak intensity were estimated, and the stability of seasonal peaks and the synchronization of infections were examined. Negative Binomial harmonic regression models with the delta method were applied to derive confidence intervals for the peak timing for each year and for the overall study period.
Preliminary results indicate that five infections continue to lead as major causes of outbreaks, exhibiting steady upward trends with annual increases in cases ranging from 2.71% (95%CI: [2.38, 3.05]) in Campylobacter, 4.78% (95%CI: [4.14, 5.41]) in Salmonella, 7.09% (95%CI: [6.38, 7.82]) in E.Coli, 7.71% (95%CI: [6.94, 8.49]) in Cryptosporidium, to 8.67% (95%CI: [7.55, 9.80]) in Vibrio. Strong synchronization of summer outbreaks was observed for Campylobacter, Vibrio, E.Coli, and Salmonella, peaking at 7.57 ± 0.33, 7.84 ± 0.47, 7.85 ± 0.37, and 7.82 ± 0.14 calendar months, respectively, with serial cross-correlations ranging from 0.81 to 0.88 (p < 0.001). Over the 21 years, Listeria and Cryptosporidium peaks (8.43 ± 0.77 and 8.52 ± 0.45 months, respectively) have tended to arrive 1-2 weeks earlier, while Vibrio peaks (7.8 ± 0.47) have been delayed by 2-3 weeks. These findings will be incorporated in forecast models to predict common paths of the spread, long-term trends, and the synchronization of outbreaks across etiological agents. The predictive modeling of foodborne outbreaks should consider long-term changes in seasonal timing, spatiotemporal trends, and sources of contamination.
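The peak-timing estimation can be sketched with a single-harmonic regression on log monthly counts; the synthetic series below peaks at calendar month 8, mimicking the late-summer peaks reported. The counts and amplitude are invented for illustration, and the study itself used Negative Binomial harmonic regression with the delta method rather than this ordinary least-squares shortcut:

```python
import numpy as np

months = np.arange(1, 13)
omega = 2 * np.pi / 12
# synthetic monthly counts peaking at calendar month 8 (late summer)
true_peak = 8.0
counts = np.exp(3.0 + 1.2 * np.cos(omega * (months - true_peak)))

# harmonic regression on log-counts: log y = b0 + b1*cos(wt) + b2*sin(wt)
X = np.column_stack([np.ones(12),
                     np.cos(omega * months),
                     np.sin(omega * months)])
b0, b1, b2 = np.linalg.lstsq(X, np.log(counts), rcond=None)[0]

# the phase of the fitted harmonic gives the month of the seasonal peak
peak = (np.arctan2(b2, b1) / omega) % 12
```

On this noise-free example the fitted phase recovers the simulated peak month exactly; with real counts the delta method would supply the confidence intervals quoted in the abstract.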

Keywords: foodborne outbreak, national outbreak reporting system, predictive modeling, seasonality

Procedia PDF Downloads 109
24088 A Multilevel Approach of Reproductive Preferences and Subsequent Behavior in India

Authors: Anjali Bansal

Abstract:

Reproductive preferences mainly deal with two questions: when a couple wants children and how many they want. Questions related to these desires are often included in fertility surveys, as they can provide relevant information on subsequent behavior. The aim of the study is to observe whether respondents' responses to these questions changed over time or not. We also tried to identify the socio-economic and demographic factors associated with the stability (or instability) of fertility preferences. For this purpose, we used IHDS-1 (2004-05) and follow-up survey IHDS-2 (2011-12) data and applied bivariate, multivariate, and multilevel repeated-measures analyses to find the consistency between responses. From the analysis, we found that women's preferences change over the course of time: the bivariate analysis showed that 52% of women are not consistent in their desired family size, and large inconsistencies were found in the desire to continue childbearing. To get a better overview of these inconsistencies, we computed the Intra-Class Correlation (ICC), which quantifies the consistency of individuals' fertility responses at the two time periods. We also found that the husband's desire for an additional child, specifically a male offspring, contributes to these variations. Our findings lead us to the conclusion that, in India, individuals' fertility preferences changed over the seven-year period, as the Intra-Class Correlation came out to be very small, reflecting the variation within individuals. Concerted efforts should be made, therefore, to educate people and conduct motivational programs to promote family planning for family welfare.
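The ICC used to judge response stability can be sketched as the one-way random-effects ICC(1) computed from between- and within-subject mean squares; the two-wave answers below are hypothetical, not IHDS records:

```python
import numpy as np

def icc_oneway(data):
    """One-way random-effects ICC(1) for n subjects x k repeated measures,
    from the between- and within-subject mean squares."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    subj_means = data.mean(axis=1)
    msb = k * ((subj_means - data.mean()) ** 2).sum() / (n - 1)   # between
    msw = ((data - subj_means[:, None]) ** 2).sum() / (n * (k - 1))  # within
    return (msb - msw) / (msb + (k - 1) * msw)

# hypothetical desired-family-size answers at two survey waves
stable = [[2, 2], [3, 3], [2, 2], [4, 4], [1, 1]]     # perfectly consistent
unstable = [[2, 4], [3, 1], [2, 3], [4, 2], [1, 3]]   # answers drift
icc_stable, icc_unstable = icc_oneway(stable), icc_oneway(unstable)
```

Perfectly repeated answers give an ICC of 1, while drifting answers drive it toward (or below) zero, which is the "very small ICC" pattern the abstract reports.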

Keywords: change, consistency, preferences, over time

Procedia PDF Downloads 149
24087 Scheduling Algorithm Based on Load-Aware Queue Partitioning in Heterogeneous Multi-Core Systems

Authors: Hong Kai, Zhong Jun Jie, Chen Lin Qi, Wang Chen Guang

Abstract:

Current scheduling algorithms suffer from inefficient global scheduling parallelism and from local scheduling parallelism that is prone to processor starvation. Regarding this issue, this paper proposes a load-aware queue partitioning scheduling strategy: it first allocates the queues according to the number of processor cores, then calculates a load factor to specify each load queue's capacity, and finally assigns the waiting nodes to the appropriate perceptual queues based on their predecessor nodes and the communication and computation overhead. At the same time, real-time computation of the load factor can effectively prevent a processor from being starved for a long time. Experimental comparison with two classical algorithms shows improvement in both performance metrics: scheduling length and task speedup ratio.
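A minimal sketch of the partitioning idea, assuming the load factor is simply each core's share of total processing speed and ignoring the precursor-node and communication-overhead terms; the routing heuristic and names are ours, not the paper's:

```python
def partition_queues(tasks, core_speeds):
    """Load-aware partitioning sketch: one queue per core, queue capacity
    set by a load factor (the core's share of total speed), and each task
    routed to the queue with the lowest relative load."""
    total_cost = float(sum(tasks))
    total_speed = float(sum(core_speeds))
    capacity = [total_cost * s / total_speed for s in core_speeds]
    queues = [[] for _ in core_speeds]
    load = [0.0] * len(core_speeds)
    for cost in sorted(tasks, reverse=True):        # largest tasks first
        i = min(range(len(core_speeds)),
                key=lambda c: load[c] / capacity[c])
        queues[i].append(cost)
        load[i] += cost
    return queues, load

# two cores, one twice as fast: capacities 8 and 4 for 12 units of work
queues, load = partition_queues([5, 3, 2, 2], core_speeds=[2, 1])
```

Because assignment always targets the queue furthest below its capacity, no queue (and hence no core) is left idle while others overflow, which is the starvation-avoidance property the strategy aims at.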

Keywords: load-aware, scheduling algorithm, perceptual queue, heterogeneous multi-core

Procedia PDF Downloads 120
24086 Analysis of Real Time Seismic Signal Dataset Using Machine Learning

Authors: Sujata Kulkarni, Udhav Bhosle, Vijaykumar T.

Abstract:

Because of the closeness between seismic signals and non-seismic signals, it is difficult to detect earthquakes using conventional methods. In order to distinguish between seismic and non-seismic events depending on their amplitude, our study processes data that come from seismic sensors. The authors suggest a robust noise suppression technique that makes use of a bandpass filter, an IIR Wiener filter, recursive short-term average/long-term average (STA/LTA), and Carl STA/LTA for event identification. The trigger ratio used in the proposed study to differentiate between seismic and non-seismic activity is determined. The proposed work focuses on significant feature extraction for machine learning-based seismic event detection. This serves as motivation for compiling a dataset of all features for the identification and forecasting of seismic signals. We place a focus on feature vector dimension reduction techniques due to the temporal complexity. The proposed notable features were experimentally tested using a machine learning model, and the results on unseen data are strong. Finally, a presentation using a hybrid dataset (captured by different sensors) demonstrates how this model may also be employed in a real-time setting while lowering false alarm rates. The proposed study is based on the examination of seismic signals obtained from both individual sensors and sensor networks (SN). A wideband seismic signal from the BSVK and CUKG station sensors, located near Basavakalyan, Karnataka, and at the Central University of Karnataka, respectively, makes up the experimental dataset.
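The recursive STA/LTA trigger at the heart of the event-identification step can be sketched as two exponential moving averages of sample energy; the window lengths and the synthetic noise-plus-burst signal are illustrative choices, not the BSVK/CUKG data:

```python
import random

def recursive_sta_lta(signal, sta_window, lta_window):
    """Recursive STA/LTA: two exponential moving averages of sample energy.
    The ratio spikes when a transient (candidate seismic event) arrives.
    The first ~lta_window samples are warm-up and should be ignored."""
    csta, clta = 1.0 / sta_window, 1.0 / lta_window
    sta = lta = 1e-30
    ratio = []
    for x in signal:
        e = x * x                   # instantaneous energy
        sta += csta * (e - sta)     # short-term average (adapts fast)
        lta += clta * (e - lta)     # long-term average (adapts slowly)
        ratio.append(sta / lta)
    return ratio

# quiet background noise followed by a high-amplitude burst at sample 400
random.seed(1)
sig = ([random.gauss(0, 1) for _ in range(400)]
       + [random.gauss(0, 8) for _ in range(100)])
r = recursive_sta_lta(sig, sta_window=5, lta_window=100)
```

Comparing the ratio against a trigger threshold is exactly the seismic/non-seismic discrimination step the abstract describes; the Carl STA/LTA variant adds a noise-adaptive correction on top of this.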

Keywords: Carl STA/LTA, features extraction, real time, dataset, machine learning, seismic detection

Procedia PDF Downloads 95
24085 Language Switching Errors of Bilinguals: Role of Top-Down and Bottom-Up Processes

Authors: Numra Qayyum, Samina Sarwat, Noor ul Ain

Abstract:

Bilingual speakers can generally speak both languages with the same competency, without mixing them unintentionally or making mistakes, but sometimes errors occur in language selection. This quantitative study deals with the language errors made by Urdu-English bilinguals. In this research, special attention was given to the parts played by bottom-up priming and top-down cognitive control in these errors. Unbalanced Urdu-English bilingual participants named pictures and were prompted to shift from one language to another under time pressure. Different situations were given to manipulate the participants. Long and short runs of trials in the same language were also given before switching to the other language. The study concludes that bilinguals made more errors when switching from their second language to their first, and these errors were especially numerous when a speaker switched from L2 (second language) to L1 (first language) after a long run. When the switching was reversed, i.e., from L1 to L2, run length had no effect at all. These results attribute the responsibility for these errors to top-down cognitive control.

Keywords: bottom-up priming, language error, language switching, top-down cognitive control

Procedia PDF Downloads 114
24084 Times2D: A Time-Frequency Method for Time Series Forecasting

Authors: Reza Nematirad, Anil Pahwa, Balasubramaniam Natarajan

Abstract:

Time series data consist of successive data points collected over a period of time. Accurate prediction of future values is essential for informed decision-making in several real-world applications, including electricity load demand forecasting, lifetime estimation of industrial machinery, traffic planning, weather prediction, and the stock market. Due to their critical relevance and wide application, there has been considerable interest in time series forecasting in recent years. However, the proliferation of sensors and IoT devices, real-time monitoring systems, and high-frequency trading data introduces significant intricate temporal variations, rapid changes, noise, and non-linearities, making time series forecasting more challenging. Classical methods such as autoregressive integrated moving average (ARIMA) and exponential smoothing aim to extract pre-defined temporal variations, such as trends and seasonality. While these methods are effective for capturing well-defined seasonal patterns and trends, they often struggle with the more complex, non-linear patterns present in real-world time series data. In recent years, deep learning has made significant contributions to time series forecasting. Recurrent Neural Networks (RNNs) and their variants, such as long short-term memory (LSTM) and Gated Recurrent Unit (GRU) networks, have been widely adopted for modeling sequential data. However, they often struggle to capture local trends and rapid fluctuations. Convolutional Neural Networks (CNNs), particularly Temporal Convolutional Networks (TCNs), leverage convolutional layers to capture temporal dependencies by applying convolutional filters along the temporal dimension. Despite their advantages, TCNs struggle to capture relationships between distant time points due to the locality of one-dimensional convolution kernels.
Transformers have revolutionized time series forecasting with their powerful attention mechanisms, effectively capturing long-term dependencies and relationships between distant time points. However, the attention mechanism may struggle to discern dependencies directly from scattered time points due to intricate temporal patterns. Lastly, Multi-Layer Perceptrons (MLPs) have also been employed, with models like N-BEATS and LightTS demonstrating success. Despite this, MLPs often face high-volatility and computational-complexity challenges in long-horizon forecasting. To address intricate temporal variations in time series data, this study introduces Times2D, a novel framework that integrates, in parallel, 2D spectrogram and derivative heatmap techniques. The spectrogram focuses on the frequency domain, capturing periodicity, while the derivative patterns emphasize the time domain, highlighting sharp fluctuations and turning points. This 2D transformation enables the use of powerful computer vision techniques to capture various intricate temporal variations. To evaluate the performance of Times2D, extensive experiments were conducted on standard time series datasets and compared with various state-of-the-art algorithms, including DLinear (2023), TimesNet (2023), Non-stationary Transformer (2022), PatchTST (2023), N-HiTS (2023), Crossformer (2023), MICN (2023), LightTS (2022), FEDformer (2022), FiLM (2022), SCINet (2022a), Autoformer (2021), and Informer (2021), under the same modeling conditions. The initial results demonstrate that Times2D achieves consistent state-of-the-art performance in both short-term and long-term forecasting tasks. Furthermore, the generality of the Times2D framework allows it to be applied to tasks such as time series imputation, clustering, classification, and anomaly detection, offering potential benefits in any domain that involves sequential data analysis.
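The core transformation, viewing one series as two 2D images, can be sketched as a magnitude spectrogram (frequency view) plus a stacked derivative map (time view). The window sizes, hop length, and test series are our assumptions, not the paper's configuration:

```python
import numpy as np

def spectrogram(x, win=32, hop=16):
    """Frequency-domain 2D view: magnitude of a windowed short-time FFT."""
    frames = [x[i:i + win] * np.hanning(win)
              for i in range(0, len(x) - win + 1, hop)]
    return np.abs(np.fft.rfft(frames, axis=1)).T    # shape: (freq bins, frames)

def derivative_heatmap(x):
    """Time-domain 2D view: the series stacked with its first and second
    differences, exposing sharp fluctuations and turning points."""
    d1 = np.diff(x, n=1, prepend=x[0])
    d2 = np.diff(x, n=2, prepend=[x[0], x[0]])
    return np.stack([x, d1, d2])                    # shape: (3, len(x))

t = np.arange(256)
series = (np.sin(2 * np.pi * t / 16)                # 16-sample period
          + 0.1 * np.random.default_rng(0).normal(size=256))
S = spectrogram(series)       # periodic energy concentrates in bin 32/16 = 2
H = derivative_heatmap(series)
```

Both outputs are ordinary 2D arrays, which is what makes the subsequent application of computer-vision machinery possible.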

Keywords: derivative patterns, spectrogram, time series forecasting, times2D, 2D representation

Procedia PDF Downloads 21
24083 Spectroscopic Study of Eu³⁺ Ions Doped Potassium Lead Alumino Borate Glasses for Photonic Device Application

Authors: Nisha Deopa, Allam Srinivasa Rao

Abstract:

Quaternary potassium lead alumino borate (KPbAlB) glasses doped with different concentrations of Eu³⁺ ions have been synthesized by the melt-quench technique and characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM), photoluminescence (PL), time-resolved photoluminescence (TRPL), and CIE chromaticity coordinates to study their luminescence behavior. A broad hump observed in the XRD spectrum confirms the glassy nature of the as-prepared glasses. Using Judd-Ofelt (J-O) theory, various radiative parameters for the prominent fluorescent levels of Eu³⁺ have been investigated. The most intense emission peak was observed at 613 nm (⁵D₀→⁷F₂) under 393 nm excitation, which matches well with the emission of n-UV LED chips. The decay profiles observed for the ⁵D₀ level were exponential for lower Eu³⁺ ion concentrations and non-exponential for higher concentrations, which may be due to efficient Eu³⁺-Eu³⁺ energy transfer through cross-relaxation and the subsequent quenching observed. From the emission cross-sections, branching ratios, quantum efficiency, and CIE coordinates, it was concluded that 7 mol% Eu³⁺ (glass B) is the optimum concentration in KPbAlB glasses for photonic device application.
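In the single-exponential case, the decay analysis reduces to a log-linear lifetime fit, and quantum efficiency then follows as η = τ_obs/τ_rad. The lifetimes below are illustrative placeholders, not the measured KPbAlB values:

```python
import math

def lifetime_from_decay(times, intensities):
    """Lifetime from a single-exponential decay I(t) = I0*exp(-t/tau),
    via least-squares on the log-linearized data."""
    n = len(times)
    logs = [math.log(i) for i in intensities]
    tm, ym = sum(times) / n, sum(logs) / n
    slope = (sum((t - tm) * (y - ym) for t, y in zip(times, logs))
             / sum((t - tm) ** 2 for t in times))
    return -1.0 / slope

tau_true = 1.8                                 # ms, hypothetical 5D0 lifetime
ts = [0.2 * k for k in range(20)]
Is = [math.exp(-t / tau_true) for t in ts]
tau_fit = lifetime_from_decay(ts, Is)
eta = tau_fit / 2.4 * 100.0   # quantum efficiency (%), if tau_rad were 2.4 ms
```

A non-exponential decay, like the one seen at high Eu³⁺ concentration, would show systematic residuals against this fit, which is precisely how concentration quenching is diagnosed.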

Keywords: energy transfer, glasses, J-O parameters, photoluminescence

Procedia PDF Downloads 140
24082 Coupled Space and Time Homogenization of Viscoelastic-Viscoplastic Composites

Authors: Sarra Haouala, Issam Doghri

Abstract:

In this work, a multiscale computational strategy is proposed for the analysis of structures which are described at a refined level both in space and in time. The proposal is applied to two-phase viscoelastic-viscoplastic (VE-VP) reinforced thermoplastics subjected to large numbers of cycles. The main aim is to predict the effective long-time response while reducing the computational cost considerably. The proposed computational framework is a combination of mean-field space homogenization, based on the generalized incrementally affine formulation for VE-VP composites, and the asymptotic time homogenization approach for coupled isotropic VE-VP homogeneous solids under large numbers of cycles. The time homogenization method is based on the definition of micro- and macro-chronological time scales and on asymptotic expansions of the unknown variables. First, the original anisotropic VE-VP initial-boundary value problem of the composite material is decomposed into coupled micro-chronological (fast time scale) and macro-chronological (slow time scale) problems. The former is purely VE and solved once for each macro time step, whereas the latter problem is nonlinear and solved iteratively using fully implicit time integration. Second, mean-field space homogenization is used for both the micro- and macro-chronological problems to determine the micro- and macro-chronological effective behavior of the composite material. The response of the matrix material is VE-VP with J2 flow theory, assuming small strains. The formulation exploits the return-mapping algorithm for the J2 model, with its two steps: a viscoelastic predictor and a plastic corrector. The proposal is implemented for an extended Mori-Tanaka scheme and verified against finite element simulations of representative volume elements for a number of polymer composite materials subjected to large numbers of cycles.
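The return-mapping step named above can be sketched in a 1-D elastoplastic idealization, with the viscous parts of the VE-VP model omitted and the moduli and yield stress chosen for illustration: an elastic predictor followed, if the yield function is violated, by a plastic corrector back onto the yield surface.

```python
def radial_return(strain_inc, state, E=200e3, H=10e3, sigma_y=250.0):
    """One strain increment of a J2-style return mapping, reduced to a
    1-D elastoplastic bar with linear isotropic hardening (illustrative)."""
    sigma_trial = state["sigma"] + E * strain_inc           # elastic predictor
    f = abs(sigma_trial) - (sigma_y + H * state["ep"])      # trial yield function
    if f <= 0.0:
        state["sigma"] = sigma_trial                        # step stays elastic
    else:
        dgamma = f / (E + H)                                # plastic multiplier
        sign = 1.0 if sigma_trial >= 0.0 else -1.0
        state["sigma"] = sigma_trial - E * dgamma * sign    # plastic corrector
        state["ep"] += dgamma                               # hardening update
    return state["sigma"]

state = {"sigma": 0.0, "ep": 0.0}
s1 = radial_return(0.001, state)   # 0.1% strain: stays elastic (|200| < 250 MPa)
s2 = radial_return(0.001, state)   # next increment crosses the yield surface
```

After the corrector step, the stress sits exactly on the updated yield surface (|σ| = σ_y + H·εᵖ), the consistency condition that the full VE-VP return mapping enforces as well.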

Keywords: asymptotic expansions, cyclic loadings, inclusion-reinforced thermoplastics, mean-field homogenization, time homogenization

Procedia PDF Downloads 345
24081 Heuristic Algorithms for Time Based Weapon-Target Assignment Problem

Authors: Hyun Seop Uhm, Yong Ho Choi, Ji Eun Kim, Young Hoon Lee

Abstract:

Weapon-target assignment (WTA) is the problem of assigning available launchers to appropriate targets in order to defend assets. Various algorithms for WTA have been developed over the past years for both the static and the dynamic environment (denoted SWTA and DWTA, respectively). Because the problem must be solved within an operationally relevant computation time, WTA has suffered from solution-efficiency issues, and SWTA and DWTA problems have been solved only for limited battlefield situations. In this paper, the general situation under continuous time is considered as the Time-based Weapon-Target Assignment (TWTA) problem. TWTA is studied using a mixed integer programming model, and three heuristic algorithms are suggested: decomposed opt-opt, decomposed opt-greedy, and greedy algorithms. Although the TWTA optimization model works inefficiently for large problem instances, the decomposed opt-opt algorithm, based on linearization and decomposition, extracted efficient solutions in a reasonable computation time. Because the computation time of the scheduling part is too long for the optimization model, several greedy-based algorithms are proposed; these yield lower performance values than the decomposed opt-opt algorithm but require very little computation time. Hence, this paper proposes an improved method by applying decomposition to TWTA, and more practical and effective methods can be developed for using TWTA on the battlefield.
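Of the three heuristics, the greedy rule is the simplest to sketch: repeatedly commit the launcher-target pair with the largest expected destroyed value. The kill-probability matrix below is hypothetical, and this sketch ignores the timing dimension that distinguishes TWTA from static WTA:

```python
def greedy_wta(p_kill, values):
    """Greedy WTA sketch: repeatedly commit the launcher-target pair with
    the largest expected destroyed value; each launcher fires once."""
    n_targets = len(values)
    survival = [float(v) for v in values]       # expected surviving value
    assignment = [None] * len(p_kill)
    free = set(range(len(p_kill)))
    while free:
        gain, l, t = max((survival[t] * p_kill[l][t], l, t)
                         for l in free for t in range(n_targets))
        assignment[l] = t
        survival[t] *= 1.0 - p_kill[l][t]       # value expected to survive shot
        free.remove(l)
    return assignment, sum(survival)

# two launchers, two targets; kill probabilities are hypothetical
assignment, remaining = greedy_wta([[0.9, 0.2], [0.3, 0.7]], [10.0, 8.0])
```

Each iteration is O(launchers x targets), which is why greedy variants stay fast enough for the scheduling part even when the exact optimization model does not.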

Keywords: air and missile defense, weapon target assignment, mixed integer programming, piecewise linearization, decomposition algorithm, military operations research

Procedia PDF Downloads 321
24080 Optimization of Synergism Extraction of Toxic Metals (Lead, Copper) from Chlorides Solutions with Mixture of Cationic and Solvating Extractants

Authors: F. Hassaine-Sadi, S. Chelouaou

Abstract:

In recent years, environmental contamination by toxic metals such as Pb, Cu, Ni, Zn, etc., has become a crucial worldwide problem, particularly in some areas where the population depends on groundwater for daily drinking consumption. The sources of metal ions include the metal manufacturing industry, fertilizers, batteries, paints, pigments, and so on. Solvent extraction of metal ions has played an important role in the development of metal purification processes, such as the synergistic extraction of divalent metal cations (M²⁺) from various sources. This work presents a water purification technique involving the lead and copper systems Pb²⁺/H₃O⁺/Cl⁻ and Cu²⁺/H₃O⁺/Cl⁻ for dilute solutions, using a mixture of tri-n-octylphosphine oxide (TOPO) or tri-n-butyl phosphate (TBP) and di(2-ethylhexyl) phosphoric acid (HDEHP) dissolved in kerosene. The fundamental parameters influencing the cation-exchange/solvating-extractant synergism have been examined.

Keywords: synergistic extraction, lead, copper, environment

Procedia PDF Downloads 421
24079 Microglia Activation in Animal Model of Schizophrenia

Authors: Esshili Awatef, Manitz Marie-Pierre, Eßlinger Manuela, Gerhardt Alexandra, Plümper Jennifer, Wachholz Simone, Friebe Astrid, Juckel Georg

Abstract:

Maternal immune activation (MIA) resulting from maternal viral infection during pregnancy is a known risk factor for schizophrenia. The neural mechanisms by which maternal infections increase the risk for schizophrenia remain unknown, although the prevailing hypothesis argues that an activation of the maternal immune system induces changes in the maternal-fetal environment that might interact with fetal brain development. It may lead to an activation of fetal microglia, inducing long-lasting functional changes in these cells. Based on post-mortem analyses showing an increased number of activated microglial cells in patients with schizophrenia, it can be hypothesized that these cells contribute to disease pathogenesis and may be actively involved in the gray matter loss observed in such patients. In the present study, we hypothesize that prenatal treatment with the inflammatory agent Poly(I:C) during embryogenesis contributes to microglial activation in the offspring, which may therefore represent a contributing factor to the pathogenesis of schizophrenia and underlines the need for new pharmacological treatment options. Pregnant rats were treated with a single intraperitoneal injection of Poly(I:C) or saline on gestation day 17. Brains of control and Poly(I:C) offspring were removed and cut into 20-μm-thick coronal sections using a cryostat. Brain slices were fixed and immunostained with an Iba1 antibody. Subsequently, Iba1 immunoreactivity was detected using a secondary antibody, goat anti-rabbit. The sections were viewed and photographed under a microscope. The immunohistochemical analysis revealed an increase in microglia cell number in the prefrontal cortex of the offspring of Poly(I:C)-treated rats as compared to the controls injected with NaCl. However, no significant differences in microglia activation were observed in the cerebellum among the groups.
Prenatal immune challenge with Poly(I:C) was thus able to induce long-lasting changes in the offspring brains. It led to higher activation of microglia cells in the prefrontal cortex, a brain region critical for many higher brain functions, including working memory and cognitive flexibility, which might be implicated in possible changes in cortical neuropil architecture in schizophrenia. Further studies will be needed to clarify the association between microglial cell activation and schizophrenia-related behavioral alterations.

Keywords: Microglia, neuroinflammation, PolyI:C, schizophrenia

Procedia PDF Downloads 401
24078 Determination of Surface Deformations with Global Navigation Satellite System Time Series

Authors: Ibrahim Tiryakioglu, Mehmet Ali Ugur, Caglar Ozkaymak

Abstract:

The development of GNSS technology has led to increasingly widespread and successful applications of GNSS surveys for monitoring crustal movements. However, multi-period GPS survey solutions have not been applied to monitoring vertical surface deformation. This study uses the long-term GNSS time series that are required to determine vertical deformations. In recent years, surface deformations parallel and semi-parallel to the Bolvadin fault have occurred in Western Anatolia. These surface deformations have continued to occur in the Bolvadin settlement area, which is located mostly on alluvial ground. Due to these surface deformations, a number of cracks in buildings in the residential areas and breaks in underground water and sewage systems have been observed. In order to determine the amount of vertical surface deformation, two continuous GNSS stations have been established in the region, operating since 2015 and 2017, respectively. In this study, GNSS observations from these two stations were processed with the GAMIT/GLOBK (GNSS Analysis at Massachusetts Institute of Technology/GLObal Kalman filter) program package to create coordinate time series. With the time series analyses, the stations' behavior models (linear, periodic, etc.), the causes of these behaviors, and mathematical models were determined. The results of the time series analysis of the two GNSS stations show approximately 50-80 mm/yr of vertical movement.
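
The trend and periodic terms of such a station behavior model can be estimated by ordinary least squares. The sketch below, using synthetic data rather than the Bolvadin observations, fits an offset, a linear velocity, and an annual harmonic to a daily vertical ("up") coordinate series; this is the functional form typically fitted to GNSS coordinate time series, though GAMIT/GLOBK itself performs a far more complete analysis (ambiguity resolution, Kalman filtering, noise modeling):

```python
import numpy as np

# Synthetic daily "up" component (mm): linear subsidence plus an annual cycle.
rng = np.random.default_rng(0)
t = np.arange(0, 4 * 365) / 365.25            # time in years
up = -65.0 * t + 3.0 * np.sin(2 * np.pi * t) + rng.normal(0, 2.0, t.size)

# Design matrix for the usual station motion model:
# offset + velocity*t + annual sine/cosine terms.
A = np.column_stack([
    np.ones_like(t), t,
    np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),
])
coef, *_ = np.linalg.lstsq(A, up, rcond=None)
offset, velocity, s1, c1 = coef
print(f"estimated vertical velocity: {velocity:.1f} mm/yr")
print(f"annual amplitude: {np.hypot(s1, c1):.1f} mm")
```

With four years of daily data, the recovered velocity is close to the simulated -65 mm/yr subsidence rate despite the 2 mm observation noise.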

Keywords: Bolvadin fault, GAMIT, GNSS time series, surface deformations

Procedia PDF Downloads 148
24077 Empirical Acceleration Functions and Fuzzy Information

Authors: Muhammad Shafiq

Abstract:

In accelerated life testing, lifetime data are obtained under conditions considered more severe than usual operating conditions. Classical techniques are based on precise measurements and model only the variation among the observations. In fact, there are two types of uncertainty in data: variation among the observations and fuzziness. Analysis techniques that do not consider fuzziness and are based only on precise lifetime observations lead to pseudo results. This study examined the behavior of empirical acceleration functions using fuzzy lifetime data. The results showed increased fuzziness in the transformed lifetimes compared with the input data.
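
As a toy illustration of how fuzziness grows under a time transformation (this is not the paper's exact procedure), consider a lifetime represented as a triangular fuzzy number that is mapped back to use-condition time by a linear acceleration function with an assumed acceleration factor:

```python
# Toy illustration only: a fuzzy lifetime as a triangular fuzzy number
# (left, peak, right), transformed to use-condition time by a linear
# acceleration function t_use = a * t_stress (a is an assumed factor).
def transform(tfn, a):
    left, peak, right = tfn
    return (a * left, a * peak, a * right)

def spread(tfn):
    """Width of the support: a simple measure of fuzziness."""
    return tfn[2] - tfn[0]

observed = (90.0, 100.0, 115.0)   # fuzzy lifetime (hours) at elevated stress
a = 4.0                           # assumed acceleration factor
use_time = transform(observed, a)

print(use_time)                          # transformed fuzzy lifetime
print(spread(observed), spread(use_time))  # fuzziness before and after
```

The support width scales with the acceleration factor, so the transformed lifetime is fuzzier than the input observation, mirroring the behavior reported in the study.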

Keywords: acceleration function, accelerated life testing, fuzzy number, non-precise data

Procedia PDF Downloads 277
24076 Design of a Portable Shielding System for a Newly Installed NaI(Tl) Detector

Authors: Mayesha Tahsin, A.S. Mollah

Abstract:

Recently, a 1.5 x 1.5 inch NaI(Tl) detector-based gamma-ray spectroscopy system has been installed in the laboratory of the Nuclear Science and Engineering Department of the Military Institute of Science and Technology for radioactivity detection purposes. The newly installed NaI(Tl) detector has a circular lead shield 22 mm thick. An important consideration in any gamma-ray spectroscopy is the minimization of natural background radiation not originating from the radioactive sample being measured. Natural background gamma-ray radiation comes from naturally occurring or man-made radionuclides in the environment or from cosmic sources. Moreover, the main problem with this system is that it is not suitable for measurements of radioactivity with a large sample container such as a Petri dish or a Marinelli beaker. When a laboratory installs a new detector and/or a new shield, it "must" first carry out quality and performance tests for the detector and shield. This paper describes a new portable lead shielding system that can reduce the background radiation. The intensity of gamma radiation after passing through the shielding will be calculated using the shielding equation I = I₀e^(-µx), where I₀ is the initial intensity of the gamma source, I is the intensity after passing through the shield, µ is the linear attenuation coefficient of the shielding material, and x is the thickness of the shielding material. The height and width of the shielding will be selected in order to accommodate the large sample container. The detector will be surrounded by a 4π-geometry low-activity lead shield. An additional 1.5 mm thick shield of tin and a 1 mm thick shield of copper covering the inner part of the lead shielding will be added in order to suppress the characteristic X-rays from the lead shield.
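
The attenuation equation extends multiplicatively to a graded shield: the transmitted fraction through several layers is exp(-Σ µᵢxᵢ). The sketch below applies this to the lead/tin/copper geometry described above; the attenuation coefficients are placeholder assumptions (µ is strongly energy-dependent, so tabulated values for the gamma energies of interest must be substituted):

```python
import math

def transmitted_fraction(layers):
    """I/I0 after a beam passes through layers of (mu_per_cm, thickness_cm)."""
    return math.exp(-sum(mu * x for mu, x in layers))

# Placeholder coefficients: mu depends strongly on gamma energy, so these
# rough order-of-magnitude figures are assumptions for illustration only.
layers = [
    (1.2, 2.2),     # 22 mm lead
    (0.66, 0.15),   # 1.5 mm tin (graded-shield layer)
    (0.64, 0.10),   # 1 mm copper lining
]
frac = transmitted_fraction(layers)
print(f"transmitted fraction I/I0 = {frac:.4f}")
```

For these assumed coefficients, the graded shield passes only a few percent of the incident intensity; the same function can be used to size the wider portable shield for the large sample containers.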

Keywords: shield, NaI(Tl) detector, gamma radiation, intensity, linear attenuation coefficient

Procedia PDF Downloads 135
24075 Oncogenic Functions of Long Non-Coding RNA XIST in Human Nasopharyngeal Carcinoma by Targeting MiR-34a-5p

Authors: Cheng-Cao Sun, Shu-Jun Li, De-Jia Li

Abstract:

Long non-coding RNA (lncRNA) X-inactive specific transcript (XIST) has been verified as an oncogene in several human malignant tumors, and its dysregulation is closely associated with tumor initiation, development, and progression. Nevertheless, whether the aberrant expression of XIST in human nasopharyngeal carcinoma (NPC) is correlated with malignancy, metastasis, or prognosis has not been elucidated. Here, we discovered that XIST was up-regulated in NPC tissues and that higher expression of XIST was associated with markedly poorer survival. In addition, multivariate analysis demonstrated that XIST was an independent risk factor for prognosis. XIST over-expression enhanced, while XIST silencing hampered, cell growth in NPC. Additionally, mechanistic analysis revealed that XIST up-regulated the expression of the miR-34a-5p target gene E2F3 by acting as a competitive ‘sponge’ for miR-34a-5p. Taken together, we conclude that XIST functions as an oncogene in NPC by up-regulating E2F3, in part through ‘sponging’ miR-34a-5p.

Keywords: X-inactive specific transcript (XIST), miR-34a-5p, E2F3, nasopharyngeal carcinoma, tumorigenesis

Procedia PDF Downloads 224
24074 A Methodology for Characterising the Tail Behaviour of a Distribution

Authors: Serge Provost, Yishan Zang

Abstract:

Following a review of various approaches used for classifying the tail behavior of a distribution, an easily implementable methodology that relies on an arctangent transformation is presented. The classification criterion is based on the difference between two specific quantiles of the transformed distribution. The resulting categories enable one to classify distributional tails as distinctly short, short, nearly medium, medium, extended medium, and somewhat long, provided that at least two moments exist. Distributions possessing a single moment are said to be long-tailed, while those failing to have any finite moments are classified as having an extremely long tail. Several illustrative examples will be presented.
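
A minimal sketch of the quantile-difference idea is given below; the particular quantiles and cutoffs are illustrative assumptions, not the paper's calibrated criterion. Because the arctangent maps the real line into (-π/2, π/2), heavier tails push the upper quantiles of the transformed sample toward π/2 and shrink the gap between them:

```python
import math, random

def tail_index(sample, p_lo=0.90, p_hi=0.99):
    """Difference of two upper quantiles after an arctangent transform.

    The arctan bounds the transformed values in (-pi/2, pi/2), so heavier
    tails compress the upper quantiles toward pi/2 and the difference
    shrinks. The quantile choices here are illustrative, not the paper's.
    """
    y = sorted(math.atan(x) for x in sample)
    q = lambda p: y[int(p * (len(y) - 1))]
    return q(p_hi) - q(p_lo)

random.seed(1)
n = 100_000
light = [random.gauss(0, 1) for _ in range(n)]         # short (normal) tail
heavy = [random.paretovariate(1.0) for _ in range(n)]  # no finite moments

# The heavier-tailed sample yields the smaller transformed-quantile gap.
print(tail_index(light), tail_index(heavy))
```

With these samples the normal data give a gap of roughly 0.26 while the Pareto(1) data give roughly 0.09, so a set of thresholds on this gap can separate the tail categories.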

Keywords: arctangent transformation, tail classification, heavy-tailed distributions, distributional moments

Procedia PDF Downloads 104
24073 A Fluorescent Polymeric Boron Sensor

Authors: Soner Cubuk, Mirgul Kosif, M. Vezir Kahraman, Ece Kok Yetimoglu

Abstract:

Boron is an essential trace element for the completion of the life cycle of organisms. Various methods have been proposed for the determination of boron, including acid-base titrimetry, inductively coupled plasma emission spectroscopy, flame atomic absorption spectrometry, and spectrophotometry. However, these methods have disadvantages such as long analysis times, the need for corrosive media such as concentrated sulphuric acid, and multi-step, time-consuming sample preparation procedures. In this study, a selective and reusable fluorescent sensor for boron based on glycosyloxyethyl methacrylate was prepared by photopolymerization. The response characteristics, such as response time, pH, linear range, and limit of detection, were systematically investigated. The excitation/emission maxima of the membrane were at 378/423 nm, respectively. The response time was approximately 50 s. In addition, the sensor had a very low limit of detection of 0.3 ppb. The sensor was successfully used for the determination of boron in water samples with satisfactory results.

Keywords: boron, fluorescence, photopolymerization, polymeric sensor

Procedia PDF Downloads 262
24072 A Fast Calculation Approach for Position Identification in a Distance Space

Authors: Dawei Cai, Yuya Tokuda

Abstract:

The market for location-based services (LBS) is expanding. The acquisition of physical location is the fundamental basis of LBS. GPS, the de facto standard for outdoor localization, does not work well in indoor environments due to the blocking of signals by walls and ceilings. Many techniques have been developed to achieve highly accurate localization in indoor environments. A triangulation approach is often used to identify the location, but computing the location from the distances between the object and several source points requires heavy, complex computation. This computation is also time- and power-consuming, which is unfavorable for a mobile device that needs a long operating life on battery. To provide a low-power-consumption approach for mobile devices, this paper presents a fast calculation approach that identifies the location of the object without solving simultaneous quadratic equations online. In our approach, we divide location identification into two parts, one offline and one online. In the offline phase, we map the location area into distance space and find a simple formula that can then be used online to identify the location of the object with very light computation. The approach offers a good tradeoff between accuracy and computational load, and can therefore be used in smartphones and other mobile devices that need long operating times. Some simulation results are also provided in the paper to demonstrate the performance.
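
The offline/online split can be illustrated with a toy fingerprinting scheme (the geometry and names are hypothetical, and the paper derives a closed-form formula rather than a table lookup). Offline, candidate positions are mapped into distance space once; online, localization reduces to a cheap table match instead of solving simultaneous quadratic equations:

```python
import math

BEACONS = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]  # assumed anchor positions

def distances(p):
    """Map a position into distance space (one distance per beacon)."""
    return tuple(math.dist(p, b) for b in BEACONS)

# Offline: map a grid of candidate positions into distance space once.
GRID = [(x * 0.5, y * 0.5) for x in range(21) for y in range(21)]
FINGERPRINTS = [(distances(p), p) for p in GRID]

# Online: match measured distances against the table -- no equation solving.
def locate(measured):
    return min(FINGERPRINTS,
               key=lambda fp: sum((m - d) ** 2
                                  for m, d in zip(measured, fp[0])))[1]

true_pos = (3.5, 7.0)
print(locate(distances(true_pos)))
```

The online step is a linear scan over a small precomputed table (which could itself be replaced by a hash or a fitted formula), so the per-query cost on the mobile device stays light.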

Keywords: indoor localization, location based service, triangulation, fast calculation, mobile device

Procedia PDF Downloads 152
24071 Preparation and Characterization of Nanocrystalline Cellulose from Acacia mangium

Authors: Samira Gharehkhani, Seyed Farid Seyed Shirazi, Abdolreza Gharehkhani, Hooman Yarmand, Ahmad Badarudin, Rushdan Ibrahim, Salim Newaz Kazi

Abstract:

Nanocrystalline cellulose (NCC) was prepared by acid hydrolysis and ultrasound treatment of bleached Acacia mangium fibers. The obtained rod-shaped nanocrystals showed a uniform size. The results showed that NCC with high crystallinity can be obtained using 64 wt% sulfuric acid. The effect of synthesis conditions was investigated: different reaction times were examined for producing the NCC, and the results revealed that an optimum reaction time has to be used for preparing the NCC. Morphological investigation was performed using transmission electron microscopy (TEM). Fourier transform infrared (FTIR) spectroscopy and thermogravimetric analysis (TGA) were also performed. X-ray diffraction (XRD) analysis revealed that the crystallinity increased with successive treatments. The NCC suspension was homogeneous and stable, and no sedimentation was observed over a long period.

Keywords: acid hydrolysis, nanocrystalline cellulose, nano material, reaction time

Procedia PDF Downloads 486
24070 MegaProjects and the Governing Processes That Lead to Success and Failure: A Literature Review

Authors: Fangwei Zhu, Wei Tian, Linzhuo Wang, Miao Yu

Abstract:

Megaprojects have long been a critical issue in project governance, owing to their low success rate and large impact on society. Although the extant literature on megaproject governance is vast, to the best of our knowledge no thorough literature review of it exists, which makes it hard to gain a holistic view of the current state of megaproject governance. This study conducts a systematic literature review to analyze the existing literature on megaproject governance. The findings indicate that megaproject governance needs to be handled at the network level, and that network-level governance provides a holistic framework for steering megaprojects towards sustainable development. Theoretical and practical implications, as well as limitations and directions for future studies, are discussed.

Keywords: megaproject, governance, literature review, network

Procedia PDF Downloads 181
24069 The Optimal Order Policy for the Newsvendor Model under Worker Learning

Authors: Sunantha Teyarachakul

Abstract:

We consider the worker-learning Newsvendor Model, under the case of lost sales for unmet demand, with the research objective of proposing the cost-minimizing order policy and lot size, scheduled to arrive at the beginning of the selling period. In general, the Newsvendor Model is used to find the optimal order quantity for perishable items such as fashionable products or those with seasonal demand or short life cycles. Technically, it applies when the product demand is stochastic and available for a single selling season, and when there is only a one-time opportunity for the vendor to purchase, possibly with long ordering lead times. Our work differs from the classical Newsvendor Model in that we incorporate the human factor (specifically worker learning) and its influence on the costs of processing units into the model. We describe this using the well-known Wright's learning curve. Most of the assumptions of the classical Newsvendor Model are maintained in our work, such as the constant per-unit costs of leftover and shortage, zero initial inventory, and continuous time. Our problem is challenging in that the best order quantity in the classical model, which balances the overstocking and understocking costs, is no longer optimal. Specifically, when the cost savings from worker learning are added to the expected total cost, the convexity of the cost function will likely not be maintained. This calls for a new way of determining the optimal order policy. In response to these challenges, we found a number of characteristics of the expected cost function and its derivatives, which we then used in formulating the optimal ordering policy.
Examples of such characteristics are: the optimal order quantity exists and is unique if the demand follows a Uniform Distribution; if the demand follows a Beta Distribution with certain specific properties of its parameters, the second derivative of the expected cost function has at most two roots; and there exists a specific lot size that satisfies the first-order condition. Our research results could be helpful for the analysis of supply chain coordination and of the periodic review system for similar problems.
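
Because convexity may fail, a direct numerical search over the expected cost is a safe way to locate the optimum. The sketch below is an illustrative instance, not the paper's model: all parameter values are assumed, demand is a discretized Uniform distribution, and a Wright's-curve cumulative processing cost is added to the usual overage/underage terms:

```python
import math

# Wright's learning curve: the cost of processing the n-th unit is
# c1 * n**b with b = log2(learning_rate); an 80% curve gives b < 0.
def cum_processing_cost(q, c1=5.0, rate=0.8):
    b = math.log2(rate)
    return sum(c1 * n ** b for n in range(1, q + 1))

def expected_cost(q, price=25.0, cost=10.0, salvage=2.0, lo=50, hi=150):
    """Expected total cost for Uniform(lo, hi) demand; all values assumed."""
    over, under = cost - salvage, price - cost   # leftover / shortage penalties
    m = hi - lo + 1
    exp_leftover = sum(max(q - d, 0) for d in range(lo, hi + 1)) / m
    exp_short = sum(max(d - q, 0) for d in range(lo, hi + 1)) / m
    return over * exp_leftover + under * exp_short + cum_processing_cost(q)

# Exhaustive search over the demand support: robust even if convexity fails.
best_q = min(range(50, 151), key=expected_cost)
print("cost-minimizing order quantity:", best_q)
```

In this instance the positive (though declining) marginal processing cost pulls the optimum below the classical critical-fractile quantity, illustrating why the classical balancing argument alone no longer identifies the optimal lot size.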

Keywords: inventory management, Newsvendor model, order policy, worker learning

Procedia PDF Downloads 394
24068 The Limits to Self-Defense Claims in Case of Domestic Violence Homicides

Authors: Maria Elisabete Costa Ferreira

Abstract:

Domestic violence is a serious social issue whose victims are mostly women. Domestic violence develops in cycles, starting with the building of tension, passing through the incident of abuse, and ending with reconciliation, also known as the honeymoon phase. As time goes by, these phases become shorter and the attacks greater and more severe, though rarely leading to the death of the victim of abuse. Sometimes, the victim stops the abuse by killing the aggressor, usually after the immediate aggression has taken place. This poses an important obstacle to the claim of self-defense by a victim of domestic violence standing trial for the homicide of her long-time abuser. The main problem with self-defense claims in such cases is that the law requires the act of aggression to be present or imminent (imminent threat or immediate danger) before it permits the victim to take her defense into her own hands. If the episode of aggression has already taken place, this general requirement for the admissibility of self-defense is not satisfied. This paper sheds new light on the concept of the actuality of the aggression, arguing that, since domestic violence is a permanent offense, an imminent threat is present for as long as the victim remains under the dominion of the aggressor, which allows the self-defense claim of a woman who kills her abuser in such circumstances to be admissible. Under an actualist interpretation, the requirement of the necessity of the means used in self-defense will be satisfied when it is evaluated from the subjective perspective of the intimate-partner victim: necessity is satisfied if it is reasonable for the victim to perceive the use of lethal force as the only means to free herself from the abuser.

Keywords: domestic violence, homicide, self-defense, imminent threat, necessity of lethal force

Procedia PDF Downloads 47
24067 The Dynamics of Algeria’s Natural Gas Exports to Europe: Evidence from ARDL Bounds Testing Approach with Breakpoints

Authors: Hicham Benamirouche, Oum Elkheir Moussi

Abstract:

The purpose of this study is to examine the dynamics of Algeria’s natural gas exports through the Autoregressive Distributed Lag (ARDL) bounds testing approach with break points. The analysis was carried out for the period from 1967 to 2015. Based on an imperfect substitution specification, the ARDL approach reveals a long-run equilibrium relationship between Algeria’s natural gas exports and their determinants (Algeria’s gas reserves, domestic gas consumption, Europe’s GDP per capita, relative prices, European gas production, and the market share of competitors). All the estimated long-run elasticities are statistically significant, with a large impact of the domestic factors, which constitute the supply constraints. In the short term, the elasticities are statistically significant and almost comparable to those of the long term. Furthermore, the speed of adjustment towards long-run equilibrium is less than one year, owing to the limited flexibility of long-term export contracts. Two break points, 1984 and 2010, were estimated when domestic gas consumption was employed as the break variable; they reflect the policy of arbitration between the domestic gas market and gas exports.
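
The long-run elasticities in an ARDL model are recovered from the estimated short-run coefficients. The minimal sketch below, on synthetic data rather than the Algerian series, estimates an ARDL(1,1) in logs by OLS and recovers the long-run elasticity as (b0 + b1)/(1 - a); full bounds testing (critical-value bands, break-point search) is omitted:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
x = np.cumsum(rng.normal(0, 0.1, n))   # log of a driver, e.g. log GDP per capita

# Generate y from an ARDL(1,1): y_t = a*y_{t-1} + b0*x_t + b1*x_{t-1} + e_t,
# with a known long-run elasticity (0.4 + 0.2) / (1 - 0.6) = 1.5.
y = np.empty(n)
y[0] = 1.5 * x[0]
for t in range(1, n):
    y[t] = 0.6 * y[t - 1] + 0.4 * x[t] + 0.2 * x[t - 1] + rng.normal(0, 0.02)

# Estimate the ARDL(1,1) by OLS and recover the long-run elasticity.
Y = y[1:]
X = np.column_stack([y[:-1], x[1:], x[:-1], np.ones(n - 1)])
a, b0, b1, const = np.linalg.lstsq(X, Y, rcond=None)[0]
print("long-run elasticity:", (b0 + b1) / (1 - a))   # true value: 1.5
```

The same algebra underlies the reported elasticities: short-run effects are the distributed-lag coefficients, while the long-run effect rescales their sum by one minus the adjustment coefficient.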

Keywords: natural gas exports, elasticity, ARDL bounds testing, break points, Algeria

Procedia PDF Downloads 175
24066 Reducing Friction Associated with Commercial Use of Biomimetics While Increasing the Potential for Using Eco Materials and Design in Industry

Authors: John P. Ulhøi

Abstract:

Firms face pressure to stay innovative and entrepreneurial while at the same time leaving lighter ecological footprints. Traditionally, inspiration for new product development (NPD) has come from creative in-house staff and from the marketplace. NPD produced by this approach has often proven far from optimal for its purpose and far from resource- and energy-efficient. More recently, a bio-inspired NPD approach has surfaced under the banner of biomimetics. Biomimetics refers to inspiration from, and translations of, designs, systems, processes, and/or specific properties that exist in nature. The principles and structures at work in nature have evolved over a long period of time, which enables them to be optimized for their purpose and to be resource- and energy-efficient. These characteristics reflect the raison d'être of the field of biomimetics. While biological expertise is required to understand and explain such natural and biological principles and structures, engineers are needed to translate biological designs and processes into synthetic applications. It can therefore hardly be surprising that biomimetics has long had a solid foothold in both biology and engineering. The commercial adoption of biomimetic applications in new product development in industry, however, does not reflect a similar growth. Put differently, this situation suggests that something is missing in the biomimetic-NPD equation, acting as a brake on the wider commercial application of biomimetics and thus on the use of eco-materials and design in industry. This paper closes some of that gap. Before concluding, avenues for future research and implications for practice are briefly sketched out.

Keywords: biomimetics, eco-materials, NPD, commercialization

Procedia PDF Downloads 146