Search results for: real time expression
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 21824

17984 Induction Motor Eccentricity Fault Recognition Using Rotor Slot Harmonic with Stator Current Technique

Authors: Nouredine Benouzza, Ahmed Hamida Boudinar, Azeddine Bendiabdellah

Abstract:

This paper proposes an algorithm for Eccentricity Fault Detection (EFD) applied to a squirrel cage induction machine. The algorithm relies on spectral analysis of the stator current and on locating the Rotor Slot Harmonic (RSH) frequency to detect eccentricity faults in three-phase induction machines. Once located, the RSH frequency is used as a key parameter in a simple derived expression to directly compute the eccentricity fault frequencies of the machine. Experimental tests performed on both a healthy motor and a faulty motor with different eccentricity fault severities illustrate the effectiveness and merits of the proposed EFD algorithm.
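The abstract does not state the authors' expression explicitly; the sketch below only illustrates the idea, assuming the classical textbook relations between the RSH frequency, slip, rotor frequency and mixed-eccentricity sidebands. The machine parameters in the example are hypothetical.

```python
import numpy as np

def eccentricity_sidebands(f_rsh, f_s, rotor_slots, pole_pairs, k_max=2):
    """Hypothetical sketch: derive eccentricity fault frequencies from the
    located Rotor Slot Harmonic (RSH) frequency. It assumes the classical
    textbook relations (not necessarily the authors' exact expression):
        f_rsh = f_s * (N_r * (1 - s) / p + 1)   -> recovers the slip s
        f_ecc = f_s +/- k * f_r                 -> mixed-eccentricity sidebands
    where f_r = f_s * (1 - s) / p is the rotor mechanical frequency."""
    slip = 1.0 - (f_rsh / f_s - 1.0) * pole_pairs / rotor_slots
    f_rot = f_s * (1.0 - slip) / pole_pairs        # rotor mechanical frequency
    ks = np.arange(1, k_max + 1)
    freqs = np.concatenate([f_s - ks * f_rot, f_s + ks * f_rot])
    return np.sort(freqs[freqs > 0])

# Illustrative example only: 50 Hz supply, 28 rotor slots, 2 pole pairs,
# RSH located at 744 Hz in the stator current spectrum
print(eccentricity_sidebands(f_rsh=744.0, f_s=50.0, rotor_slots=28, pole_pairs=2))
```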

Keywords: squirrel cage motor, diagnosis, eccentricity faults, current spectral analysis, rotor slot harmonic

Procedia PDF Downloads 465
17983 New Machine Learning Optimization Approach Based on Input Variables Disposition Applied for Time Series Prediction

Authors: Hervice Roméo Fogno Fotsoa, Germaine Djuidje Kenmoe, Claude Vidal Aloyem Kazé

Abstract:

One of the main applications of machine learning is the prediction of time series, but more accurate prediction requires a better-optimized machine learning model. Several optimization techniques have been developed, yet none of them considers the disposition (ordering) of the system's input variables. This work therefore presents a new machine learning architecture optimization technique based on the optimal disposition of the input variables. Validation is carried out on the prediction of wind time series, using data collected in Cameroon. With four input variables, the number of possible dispositions is twenty-four. Each disposition is used to perform the prediction, the main criteria being the training and prediction performances. The results obtained for a static and a dynamic neural network architecture show that these performances depend on the input variable disposition, and in a different way for each architecture. This analysis reveals that the input variable disposition must be taken into account when developing a better-optimized neural network model. A new neural network training algorithm is therefore proposed, introducing the search for the optimal input variable disposition into the traditional back-propagation algorithm. The results of applying this new optimization approach to the two single neural network architectures are compared step by step with those obtained previously. Moreover, the proposed approach is validated in a collaborative optimization scheme with a single-objective optimization technique, namely genetic algorithm back-propagation neural networks. These comparisons show that each proposed model outperforms its traditional counterpart in terms of training and prediction performance, demonstrating that the proposed optimization approach can improve the accuracy of machine learning based time series prediction.
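A minimal sketch of the exhaustive search over the 24 input-variable dispositions described above (not the authors' implementation; the wind data, network size and error metric are placeholders):

```python
# Evaluate every disposition (ordering) of four input variables and keep the
# one with the best validation error. Data and model settings are illustrative.
from itertools import permutations

import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))          # placeholder for 4 lagged wind-speed inputs
y = X @ np.array([0.5, -0.2, 0.8, 0.1]) + 0.05 * rng.normal(size=500)
X_train, X_val = X[:400], X[400:]
y_train, y_val = y[:400], y[400:]

best = None
for disposition in permutations(range(4)):          # 4! = 24 dispositions
    cols = list(disposition)
    model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
    model.fit(X_train[:, cols], y_train)
    err = mean_squared_error(y_val, model.predict(X_val[:, cols]))
    if best is None or err < best[1]:
        best = (disposition, err)

print("best disposition:", best[0], "validation MSE:", round(best[1], 4))
```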

Keywords: input variable disposition, machine learning, optimization, performance, time series prediction

Procedia PDF Downloads 86
17982 The First Import of Yellow Fever Cases in China and Its Revealing Suggestions for the Control and Prevention of Imported Emerging Diseases

Authors: Chao Li, Lei Zhou, Ruiqi Ren, Dan Li, Yali Wang, Daxin Ni, Zijian Feng, Qun Li

Abstract:

Background: In 2016, yellow fever was detected in China for the first time, soon after the yellow fever epidemic occurred in Angola. After the discovery, China promptly issued a national protocol for control and prevention and strengthened surveillance of passengers and vectors. In this study, a descriptive analysis was conducted to summarize China's experience in responding to this imported epidemic, in the hope of providing lessons for the prevention and control of yellow fever and other similar imported infectious diseases in the future. Methods: The imported cases were discovered and reported by the General Administration of Quality Supervision, Inspection and Quarantine (AQSIQ) and several hospitals. Each clinically diagnosed yellow fever case was confirmed by real-time reverse transcriptase polymerase chain reaction (RT-PCR). The data on the imported yellow fever cases were collected by local Centers for Disease Control and Prevention (CDC) through field investigations soon after the reports were received. Results: A total of 11 imported cases from Angola were reported in China during Angola's yellow fever outbreak. Six cases were discovered by the AQSIQ, among which two with mild symptoms declared themselves at the time of entry. Except for one death, the remaining 10 cases all recovered after timely and proper treatment. All cases were Chinese nationals living in Luanda, the capital of Angola. Eight of the 11 (73%) were retailers from Fuqing city in Fujian province, and the other three were laborers sent by companies. Ten cases sought medical treatment in Luanda after onset, among which eight visited the same local Chinese medicine hospital (China Railway Four Bureau Hospital). Among the 11 cases, only one had an effective vaccination. Emergency surveillance of mosquito density showed that only 14 containers of water around the residences of three cases were found positive, and the Breteau Index was 15. Conclusions: An effective response was mounted to control and prevent an outbreak of yellow fever in China after the imported cases were discovered. However, although the shared origin of the Chinese community in Angola provides easy access for disease detection, information sharing, health education and vaccination against yellow fever, these advantages were overlooked in previous disease prevention efforts. Moreover, the fact that only one case had an effective vaccination reveals the inadequate capacity of immunization services in China. These findings provide suggestions to improve China's capacity to deal not only with yellow fever but also with other similar imported diseases.

Keywords: yellow fever, first import, China, suggestion

Procedia PDF Downloads 175
17981 Hybrid Thresholding Lifting Dual Tree Complex Wavelet Transform with Wiener Filter for Quality Assurance of Medical Image

Authors: Hilal Naimi, Amelbahahouda Adamou-Mitiche, Lahcene Mitiche

Abstract:

Image denoising has been a central problem in medical imaging. The main challenge in image denoising is to preserve data-carrying structures such as surfaces and edges in order to achieve good visual quality. Algorithms with different denoising performances have been proposed over previous decades. More recently, models based on deep learning have shown great promise in outperforming all traditional approaches. However, these techniques are limited by the need for large training sample sizes and high computational costs. This research proposes a denoising approach based on the LDTCWT (Lifting Dual Tree Complex Wavelet Transform) using hybrid thresholding with a Wiener filter to enhance image quality. The LDTCWT is a lifting-based wavelet transform that produces complex coefficients by employing a dual tree of lifting wavelet filters to obtain the real and imaginary parts. This allows the transform to achieve approximate shift invariance and directionally selective filters, and reduces computation time (properties lacking in the classical wavelet transform). To develop this approach, a hybrid thresholding function is modeled by integrating the Wiener filter into the thresholding function.
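The abstract does not give the exact hybrid function; the following is only a hedged sketch of one plausible combination of soft thresholding with an empirical Wiener-type gain, applied to generic (possibly complex, dual-tree) wavelet coefficients:

```python
import numpy as np

def hybrid_wiener_threshold(coeffs, noise_sigma):
    """Hedged sketch of a hybrid thresholding function (not the authors' exact
    formula): coefficients are first soft-thresholded with a universal
    threshold, then attenuated by an empirical Wiener-type gain
    g = max(|c|^2 - sigma^2, 0) / |c|^2.
    Works on real or complex (e.g., dual-tree) wavelet coefficients."""
    coeffs = np.asarray(coeffs)
    mag = np.abs(coeffs)
    thr = noise_sigma * np.sqrt(2.0 * np.log(coeffs.size))   # universal threshold
    soft = np.where(mag > thr, (1.0 - thr / np.maximum(mag, 1e-12)) * coeffs, 0.0)
    power = np.abs(soft) ** 2
    gain = np.maximum(power - noise_sigma ** 2, 0.0) / np.maximum(power, 1e-12)
    return gain * soft

# Example on synthetic noisy coefficients
rng = np.random.default_rng(1)
clean = np.zeros(256)
clean[::32] = 5.0
noisy = clean + 0.5 * rng.normal(size=256)
denoised = hybrid_wiener_threshold(noisy, noise_sigma=0.5)
```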

Keywords: lifting wavelet transform, image denoising, dual tree complex wavelet transform, wavelet shrinkage, wiener filter

Procedia PDF Downloads 147
17980 A Multi-Criteria Model for Scheduling of Stochastic Single Machine Problem with Outsourcing and Solving It through Application of Chance Constrained

Authors: Homa Ghave, Parmis Shahmaleki

Abstract:

This paper presents a new multi-criteria stochastic mathematical model for single machine scheduling with outsourcing allowed. Multiple jobs are processed in batches; for each batch, all of a job or a part of it can be outsourced. The jobs have stochastic processing times and lead times, deterministic due dates, and arrive randomly. Because of the stochastic nature of the processing and lead times, chance constrained programming is used to model the problem. The problem is first formulated as a stochastic program and then converted into a deterministic mixed integer linear program. The objectives of the model are to minimize the maximum tardiness and the outsourcing cost simultaneously. Several procedures have been developed to deal with multi-criteria problems; in this paper, we use the concept of satisfaction functions to incorporate the manager's preferences. The proposed approach is tested on instances where the random variables are normally distributed.
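As a hedged illustration of the modeling step (symbols are illustrative, not the paper's exact notation), a chance constraint on a normally distributed quantity has a standard deterministic equivalent that fits directly into a mixed integer linear program:

```latex
% Deterministic equivalent of a single chance constraint, assuming a normally
% distributed completion-related quantity (illustrative notation).
\begin{align*}
&\Pr\!\left( C_j \le d_j + T_{\max} \right) \ge \alpha,
\qquad C_j \sim \mathcal{N}(\mu_j, \sigma_j^2) \\
&\Longrightarrow\quad \mu_j + z_{\alpha}\,\sigma_j \le d_j + T_{\max},
\qquad z_{\alpha} = \Phi^{-1}(\alpha)
\end{align*}
```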

Keywords: single machine scheduling, multi-criteria mathematical model, outsourcing strategy, uncertain lead times and processing times, chance constrained programming, satisfaction function

Procedia PDF Downloads 250
17979 Net Folklore as a Part of Kazakhstani Internet Literature

Authors: Dina Sabirova, Madina Moldagali

Abstract:

The rapid development of new media, especially the Internet, has led to major changes in folk culture. The net space is increasingly becoming a creation of the 'folk' imagination, saturated with multimedia stories with collective authorship, like traditional folklore. Moreover, the Internet picks up and changes old folklore traditions, such as the form of publication and the way of storytelling, or gives a new morality to the 'old tales'. In this article, the similarities and differences between Internet folklore (cyber-folklore, digital folklore) and oral folk art are examined using the material of modern Kazakh authors. The relationship between tradition and innovation is studied in order to interpret the sequence of the authors' work, taking current realities into account. The material of the article comprises prose texts by Kazakh writers published in internet magazines and on social networks. An immanent and intertextual analysis of the texts was carried out. Thus, the new forms of Internet folklore lead to new forms of expression and social morality in society.

Keywords: internet literature, modern Kazakhstani authors, net folklore, oral folk art

Procedia PDF Downloads 85
17978 Method of Complex Estimation of Text Perusal and Indicators of Reading Quality in Different Types of Commercials

Authors: Victor N. Anisimov, Lyubov A. Boyko, Yazgul R. Almukhametova, Natalia V. Galkina, Alexander V. Latanov

Abstract:

Modern commercials presented on billboards, on TV and on the Internet contain a lot of information about the product or service in text form. However, this information cannot always be perceived and understood by consumers. Typical sociological focus group studies often cannot reveal important features of how information read in text messages is interpreted and understood. In addition, there is no reliable method to determine the degree to which the information contained in a text is understood. The mere fact that a text has been viewed does not mean that the consumer has perceived and understood its meaning. At the same time, tools based on marketing analysis allow only indirect estimation of the process of reading and understanding a text. Therefore, the aim of this work is to develop a valid method for recording objective indicators in real time to assess both the fact of reading and the degree of text comprehension. Psychophysiological parameters recorded during text reading can form the basis for this objective method. We studied the relationship between multimodal psychophysiological parameters and the process of text comprehension during reading using correlation analysis. We used eye-tracking technology to record eye movement parameters as a measure of visual attention, electroencephalography (EEG) to assess cognitive load, and polygraphic indicators (skin galvanic reaction, SGR) reflecting the emotional state of the respondent during text reading. We revealed reliable interrelations between perceiving the information and the dynamics of psychophysiological parameters while reading the text in commercials. Eye movement parameters reflected the difficulties respondents experienced in perceiving ambiguous parts of the text. EEG dynamics in the alpha band were related to the cumulative effect of cognitive load. SGR dynamics were related to the emotional state of the respondent and to the meaning of the text and the type of commercial. EEG and polygraph parameters together also reflected the mental difficulties of respondents in understanding the text and showed significant differences between cases of low and high text comprehension. We also revealed differences in psychophysiological parameters for different types of commercials (static vs. video; financial vs. cinema vs. pharmaceutics vs. mobile communication, etc.). Conclusions: Our methodology allows multimodal evaluation of text perusal and of the quality of text reading in commercials. In general, our results indicate the possibility of designing an integral model that estimates comprehension of the commercial text on a percentage scale based on all of the observed markers.

Keywords: reading, commercials, eye movements, EEG, polygraphic indicators

Procedia PDF Downloads 150
17977 An Assessment of Different Blade Tip Timing (BTT) Algorithms Using an Experimentally Validated Finite Element Model Simulator

Authors: Mohamed Mohamed, Philip Bonello, Peter Russhard

Abstract:

Blade Tip Timing (BTT) is a technology concerned with estimating both the frequency and the amplitude of vibration of rotating blades. A BTT system comprises two main parts: (a) the arrival time measurement system, and (b) the analysis algorithms. Simulators play an important role in the development of the analysis algorithms since they generate blade tip displacement data from simulated blade vibration under controlled conditions. This enables an assessment of the performance of the different algorithms with respect to their ability to accurately reproduce the original simulated vibration. Such an assessment is usually not possible with real engine data since there is no practical alternative to BTT for blade vibration measurement. Most simulators used in the literature are based on a simple spring-mass-damper model to determine the vibration. In this work, a more realistic, experimentally validated simulator based on the Finite Element (FE) model of a bladed disc (blisk) is first presented. It is then used to generate the data needed for the assessment of different BTT algorithms. The FE modelling is validated using both a hammer test and two firewire cameras for the mode shapes. A number of autoregressive methods, fitting methods and state-of-the-art inverse methods (e.g., the Russhard method) are compared. All methods are compared with respect to both synchronous and asynchronous excitations, with both single and simultaneous frequencies. The study assesses the applicability of each method for different conditions of vibration, amounts of sampling data, and testing facilities, according to its performance and efficiency under these conditions.
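For context, the arrival-time data a BTT simulator produces are usually turned into tip deflections with the standard conversion below; this is a hedged sketch of that textbook relation (constant rotor speed assumed), not the authors' FE-based simulator:

```python
import numpy as np

def tip_displacement(t_measured, probe_angle, blade_angle, omega, radius):
    """Hedged sketch of the standard blade-tip-timing conversion: tip
    deflection is proportional to the difference between measured and expected
    arrival times at a casing probe.

    t_measured  : measured arrival times over successive revolutions [s]
    probe_angle : angular position of the probe [rad]
    blade_angle : nominal angular position of the blade [rad]
    omega       : rotor speed [rad/s] (assumed constant here)
    radius      : blade tip radius [m]
    """
    revs = np.arange(len(t_measured))
    t_expected = (2.0 * np.pi * revs + (probe_angle - blade_angle)) / omega
    return radius * omega * (np.asarray(t_measured) - t_expected)

# Illustrative example: 100 Hz rotor, 0.25 m tip radius, small synthetic vibration
omega = 2 * np.pi * 100.0
revs = np.arange(50)
t_true = (2 * np.pi * revs + 0.4) / omega + 2e-6 * np.sin(0.3 * revs)
print(tip_displacement(t_true, probe_angle=0.4, blade_angle=0.0,
                       omega=omega, radius=0.25)[:5])
```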

Keywords: blade tip timing, blisk, finite element, vibration measurement

Procedia PDF Downloads 296
17976 The Effect of Extremely Low Frequency Magnetic Field on Rats Brain

Authors: Omar Abdalla, Abdelfatah Ahmed, Ahmed Mustafa, Abdelazem Eldouma

Abstract:

The purpose of this study is to evaluate the effect of an extremely low frequency magnetic field (ELF-MF) on the brain of Wistar rats. Twenty-five rats were used, divided into five groups of five rats each, as follows: Group 1: the control group, which was not exposed to the field; Group 2: rats exposed to a magnetic field of 0.6 mT (2 hours/day); Group 3: rats exposed to a magnetic field of 1.2 mT (2 hours/day); Group 4: rats exposed to a magnetic field of 1.8 mT (2 hours/day); Group 5: rats exposed to a magnetic field of 2.4 mT (2 hours/day). All groups were exposed for seven days. A maze was designed and the average time to reach the decoy was measured under set conditions. Before exposure, the average times for the groups were G2=330 s, G3=172 s, G4=500 s and G5=174 s, respectively. After exposing all groups to the ELF-MF, the measured times were G2=465 s, G3=388 s, G4=501 s and G5=442 s. The average time was observed to increase with field strength. Histological samples of the frontal lobe of the brain were taken for all groups, revealing lesions, atrophy, empty vacuoles and a disordered choroid plexus in the frontal lobe. Finally, the disorder of the choroid plexus seen in the histological results and the Alzheimer's-like symptoms increased as the magnetic field increased.

Keywords: nonionizing radiation, biophysics, magnetic field, shrinkage

Procedia PDF Downloads 528
17975 Effect of Preloading on Long-Term Settlement of Closed Landfills: A Numerical Analysis

Authors: Mehrnaz Alibeikloo, Hajar Share Isfahani, Hadi Khabbaz

Abstract:

In recent years, with developing cities and increasing populations, reconstruction on closed landfill sites has become unavoidable in some regions. Long-term settlement is one of the major concerns associated with reconstruction on landfills after closure. The purpose of this research is to evaluate the effect of preloading, in various patterns of height and time, on the long-term settlement of closed landfills. In this regard, five surcharge scenarios, from 1 to 3 m high with 3, 4.5 and 6 months of preloading time, have been modeled using the PLAXIS 2D software. Moreover, the numerical results have been compared to those obtained from analytical methods, and good agreement has been achieved. The findings indicate a linear relationship between settlement and surcharge height. Although long-term settlement decreased with longer and higher preloading, preloading time was found to be a more effective factor than preloading height.

Keywords: preloading, long-term settlement, landfill, PLAXIS 2D

Procedia PDF Downloads 174
17974 Kemmer Oscillator in Cosmic String Background

Authors: N. Messai, A. Boumali

Abstract:

In this work, we solve the two-dimensional Kemmer equation, including a Dirac oscillator interaction term, in the background space-time generated by a cosmic string subjected to a uniform magnetic field. The eigenfunctions and eigenvalues of the problem are found, and the influence of the cosmic string space-time on the energy spectrum is analyzed.

Keywords: Kemmer oscillator, cosmic string, Dirac oscillator, eigenfunctions

Procedia PDF Downloads 570
17973 An Insight into the Paddy Soil Denitrifying Bacteria and Their Relation with Soil Phospholipid Fatty Acid Profile

Authors: Meenakshi Srivastava, A. K. Mishra

Abstract:

This study characterizes the metabolic versatility of denitrifying bacterial communities residing in paddy soil using GC-MS based Phospholipid Fatty Acid (PLFA) analysis together with nosZ gene based PCR-DGGE (Polymerase Chain Reaction-Denaturing Gradient Gel Electrophoresis) and real-time Q-PCR analysis. We analyzed the abundance of the nitrous oxide reductase (nosZ) gene, which was subsequently related to the soil PLFA profile and to the DGGE-based denitrifier community structure. The soil denitrifying bacterial community was dominated by Ochrobactrum sp., followed by Cupriavidus and uncultured bacterial strains, in the paddy soils of the selected sites. Chandauli, in eastern Uttar Pradesh, India, showed the greatest lipid content (C18-C20) and denitrifier diversity. The study suggests a positive correlation among soil PLFA profiles, DGGE, and Q-PCR data. Thus, a close link exists between the metabolic abilities and the taxonomic composition of soil microbial communities, and such work, extended to a larger scale, could help in managing both the nutrient dynamics and the microbial dynamics of the paddy soil ecosystem.

Keywords: denaturing gradient gel electrophoresis, DGGE, nitrifying and denitrifying bacteria, PLFA, Q-PCR

Procedia PDF Downloads 111
17972 Effects of Boiling Temperature and Time on Colour, Texture and Sensory Properties of Volutharpa ampullacea perryi Meat

Authors: Xianbao Sun, Jinlong Zhao, Shudong He, Jing Li

Abstract:

Volutharpa ampullacea perryi is a high-protein marine shellfish. However, few data are available on the effects of boiling temperature and time on the quality of its meat. In this study, the colour, texture and sensory characteristics of Volutharpa ampullacea perryi meat during boiling (75-100 °C, 5-60 min) were investigated by colour analysis, texture profile analysis (TPA), scanning electron microscopy (SEM) and sensory evaluation. Cooking loss gradually increased with increasing temperature and time. The colour of the meat became lighter and yellower from 85 °C to 95 °C within a short time (5-20 min), but turned brown after a 30 min treatment. TPA results showed that the meat was firmer and less cohesive after higher-temperature (95-100 °C) treatment, even for a short period (5-15 min). SEM analysis clearly showed that the myofibril structure was destroyed at higher temperatures (85-100 °C). Sensory data revealed that meat cooked at 85-90 °C for 10-20 min scored higher in overall acceptance, as well as in colour, hardness and taste. Based on these results, it can be concluded that Volutharpa ampullacea perryi meat should be boiled under suitable conditions (such as 85 °C for 15 min or 90 °C for 10 min) to ensure better acceptability.

Keywords: Volutharpa ampullacea perryi meat, boiling cooking, colour, sensory, texture

Procedia PDF Downloads 263
17971 Engineered Control of Bacterial Cell-to-Cell Signaling Using Cyclodextrin

Authors: Yuriko Takayama, Norihiro Kato

Abstract:

Quorum sensing (QS) is a cell-to-cell communication system that bacteria use to regulate the expression of target genes. In gram-negative bacteria, QS activation is controlled by an increase in the concentration of N-acylhomoserine lactone (AHL), which can diffuse in and out of the cell. Effective control of QS is expected to prevent virulence factor production in infectious pathogens, biofilm formation, and antibiotic production, because various cell functions in gram-negative bacteria are controlled by AHL-mediated QS. In this research, we applied cyclodextrins (CDs) as artificial hosts for the AHL signal to reduce the AHL concentration in the culture broth below its threshold for QS activation. The AHL-receptor complex induced at high AHL concentration activates transcription of the QS target genes; accordingly, artificial reduction of the AHL concentration is an effective strategy to inhibit QS. The hydrophobic cavity of the CD can interact with the acyl chain of the AHL through hydrophobic interaction in aqueous media. We studied N-hexanoylhomoserine lactone (C6HSL)-mediated QS in Serratia marcescens; accumulation of C6HSL is responsible for regulating the expression of the pig cluster. The inhibitory effects of added CDs on QS were demonstrated by determining the amount of prodigiosin inside cells after the stationary phase was reached, because prodigiosin production depends on C6HSL-mediated QS. By adding approximately 6 wt% hydroxypropyl-β-CD (HP-β-CD) to Luria-Bertani (LB) medium prior to inoculation of S. marcescens AS-1, the intracellularly accumulated prodigiosin was drastically reduced to 7-10%, as determined after extraction of prodigiosin in acidified ethanol. The AHL retention ability of HP-β-CD was also demonstrated by a Chromobacterium violaceum CV026 bioassay. The CV026 strain is an AHL-synthase-defective mutant that activates QS only when AHLs are added from outside the cells; the purple pigment violacein is induced by activation of AHL-mediated QS. We demonstrated that violacein production was effectively suppressed when the C6HSL standard solution was spotted on an LB agar plate containing dispersed CV026 cells and HP-β-CD. Physico-chemical analysis was performed to study the affinity between immobilized CD and added C6HSL using a quartz crystal microbalance (QCM) sensor. A COOH-terminated self-assembled monolayer was prepared on the gold electrode of a 27-MHz AT-cut quartz crystal, and mono(6-deoxy-6-N,N-diethylamino)-β-CD was immobilized on the electrode using water-soluble carbodiimide. The C6HSL interaction with the β-CD cavity was studied by injecting the C6HSL solution into a cup-type sensor cell filled with buffer solution. The decrease in resonant frequency (ΔFs) clearly showed effective C6HSL complexation with the immobilized β-CD, and the stability constant for the MBP-SpnR-C6HSL complex was on the order of 10² M⁻¹. CDs therefore have high potential for the engineered control of QS, as they are also safe for human use.

Keywords: acylhomoserine lactone, cyclodextrin, intracellular signaling, quorum sensing

Procedia PDF Downloads 221
17970 Improving 99mTc-tetrofosmin Myocardial Perfusion Images by Time Subtraction Technique

Authors: Yasuyuki Takahashi, Hayato Ishimura, Masao Miyagawa, Teruhito Mochizuki

Abstract:

Quantitative measurement of myocardial perfusion is possible with single photon emission computed tomography (SPECT) using a semiconductor detector. However, accumulation of 99mTc-tetrofosmin in the liver may make it difficult to assess perfusion accurately in the inferior myocardium. Our idea is to reduce the high liver accumulation by using dynamic SPECT imaging and a technique called time subtraction. We evaluated the performance of a new SPECT system with a cadmium-zinc-telluride solid-state semiconductor detector (Discovery NM 530c; GE Healthcare). Our system acquired list-mode raw data over 10 minutes for a typical patient. From these data, ten SPECT images were reconstructed, one for every minute of acquired data. Reconstruction with the semiconductor detector was based on an implementation of a 3-D iterative Bayesian reconstruction algorithm. We studied 20 patients with coronary artery disease (mean age 75.4 ± 12.1 years; range 42-86; 16 males and 4 females). In each subject, 259 MBq of 99mTc-tetrofosmin was injected intravenously. We performed both a phantom and a clinical study using dynamic SPECT. An approximation of a liver-only image is obtained by reconstructing an image from the early projections, during which the liver accumulation dominates (0.5-2.5 minute SPECT image minus 5-10 minute SPECT image). The extracted liver-only image is then subtracted from a later SPECT image that shows both the liver and the myocardial uptake (5-10 minute SPECT image minus liver-only image). The time subtraction of the liver was possible in both the phantom and the clinical study, and visualization of the inferior myocardium was improved. In past reports, myocardial uptake overlapped by high liver accumulation was not diagnosable. Using our time subtraction method, the image quality of the 99mTc-tetrofosmin myocardial SPECT image is considerably improved.
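A hedged sketch of the subtraction arithmetic on the ten one-minute reconstructions (array shapes, normalization and clipping are illustrative assumptions, not the paper's exact processing):

```python
import numpy as np

def time_subtraction(frames):
    """Hedged sketch of the time-subtraction arithmetic in the abstract.
    `frames` is a stack of ten one-minute SPECT reconstructions (frames[t]
    covers minute t to t+1). Normalizing each summed image to counts per
    minute before subtracting is an added assumption, not stated in the text."""
    frames = np.asarray(frames, dtype=float)
    early = frames[0:2].sum(axis=0) / 2.0    # ~0.5-2.5 min: liver dominates
    late = frames[5:10].sum(axis=0) / 5.0    # 5-10 min: liver + myocardium
    liver_only = np.clip(early - late, 0.0, None)
    corrected = np.clip(late - liver_only, 0.0, None)
    return corrected

# Example with a synthetic 10-frame stack of 64x64 images
frames = np.random.default_rng(2).poisson(5.0, size=(10, 64, 64))
myocardium_image = time_subtraction(frames)
```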

Keywords: 99mTc-tetrofosmin, dynamic SPECT, time subtraction, semiconductor detector

Procedia PDF Downloads 316
17969 Bank, Stock Market Efficiency and Economic Growth: Lessons for ASEAN-5

Authors: Tan Swee Liang

Abstract:

This paper estimates the association of bank and stock market efficiency with real per capita GDP growth by examining panel data across three different regions using the Panel-Corrected Standard Errors (PCSE) regression developed by Beck and Katz (1995). Data from five economies in ASEAN (Singapore, Malaysia, Thailand, the Philippines, and Indonesia), five economies in Asia (Japan, China, Hong Kong SAR, South Korea, and India) and seven economies in the OECD (Australia, Canada, Denmark, Norway, Sweden, the United Kingdom, and the United States), between 1990 and 2017, are used. The empirical findings suggest, first, that for Asia-5 a high bank net interest margin means greater bank profitability, hence spurring economic growth. Second, for OECD-7, low bank overhead costs (as a share of total assets) may reflect weak competition and weak investment in providing superior banking services, hence dampening economic growth. Third, the stock market turnover ratio has a negative association with OECD-7 economic growth but a positive association with Asia-5 growth, which suggests that the relationship between liquidity and growth is ambiguous. Lastly, for ASEAN-5, high bank overhead costs (as a share of total assets) may suggest that expenses have not been channelled efficiently into income-generating activities. One practical implication of the findings is that policy makers should take measures toward financial liberalisation policies that boost growth through the efficiency channel, so that funds are efficiently allocated through the financial system between the financial and real sectors.

Keywords: financial development, banking system, capital markets, economic growth

Procedia PDF Downloads 122
17968 Two Efficient Heuristic Algorithms for the Integrated Production Planning and Warehouse Layout Problem

Authors: Mohammad Pourmohammadi Fallah, Maziar Salahi

Abstract:

In the literature, a mixed-integer linear programming model for the integrated production planning and warehouse layout problem has been proposed. To solve the model, the authors proposed a Lagrangian relax-and-fix heuristic that takes a significant amount of time to terminate, with gaps above 5% for large-scale instances. Here, we present two heuristic algorithms to solve the problem. In the first, we use a greedy approach that allocates warehouse locations with lower reservation costs and lower transportation costs (from the production area to the locations and from the locations to the output point) to the items with higher demands; a smaller model is then solved, as sketched below. In the second heuristic, we first sort the items in descending order according to the ratio of the sum of the demands for the item over the time horizon plus its maximum demand over the horizon to the sum of all its demands over the horizon. We then categorize the sorted items into groups of 3, 4, or 5 and solve a small-scale optimization problem for each group, hoping to improve the solution of the first heuristic. Our preliminary numerical results show the effectiveness of the proposed heuristics.
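A hedged sketch of the first (greedy) heuristic as described above; the cost and demand structures, and the one-location-per-item simplification, are illustrative assumptions rather than the authors' exact procedure:

```python
def greedy_allocation(demands, reservation_cost, transport_in, transport_out):
    """Greedy matching sketch:
    demands[i]          : total demand of item i over the horizon
    reservation_cost[l] : cost of reserving location l
    transport_in[l]     : transport cost, production area -> location l
    transport_out[l]    : transport cost, location l -> output point
    Returns a dict {item: location}; the reduced model is then solved with
    these allocations fixed."""
    items = sorted(range(len(demands)), key=lambda i: demands[i], reverse=True)
    locations = sorted(
        range(len(reservation_cost)),
        key=lambda l: reservation_cost[l] + transport_in[l] + transport_out[l],
    )
    return {item: loc for item, loc in zip(items, locations)}

# Example: 3 items, 4 candidate locations (all values illustrative)
print(greedy_allocation(
    demands=[120, 40, 75],
    reservation_cost=[5, 3, 8, 2],
    transport_in=[1.0, 2.5, 0.5, 3.0],
    transport_out=[0.8, 1.2, 0.9, 2.0],
))
```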

Keywords: capacitated lot-sizing, warehouse layout, mixed-integer linear programming, heuristics algorithm

Procedia PDF Downloads 178
17967 Quick Covering Machine for Grain Drying Pavement

Authors: Fatima S. Rodriguez, Victorino T. Taylan, Manolito C. Bulaong, Helen F. Gavino, Vitaliana U. Malamug

Abstract:

In sun drying, grain quality is greatly reduced when paddy grains are caught by the rain unsacked and unstored, resulting in reduced profit. The objectives of this study were to design and fabricate a quick covering machine for a grain drying pavement, and to test and evaluate the operating characteristics of the machine in terms of deployment speed, recovery speed, deployment time, recovery time, power consumption and aesthetics of the laminated sack, together with a partial budget and cost curve analysis. The machine was able to cover the grains on a 12.8 m x 22.5 m grain drying pavement in an average time of 17.13 s. It consumed 0.53 W-hr for the deployment and recovery of the cover. The machine entailed an investment cost of $1,344.40 and an annual cost charge of $647.32. Moreover, the savings per year from using the quick covering machine were $101.83.

Keywords: quick, covering machine, grain, drying pavement

Procedia PDF Downloads 358
17966 Analysis of Silicon Controlled Rectifier-Based Electrostatic Discharge Protection Circuits with Electrical Characteristics for the 5V Power Clamp

Authors: Jun-Geol Park, Kyoung-Il Do, Min-Ju Kwon, Kyung-Hyun Park, Yong-Seo Koo

Abstract:

This paper analyzes SCR (Silicon Controlled Rectifier)-based ESD (Electrostatic Discharge) protection circuits in terms of their turn-on time characteristics. The structures are the LVTSCR (Low Voltage Triggered SCR), the ZTSCR (Zener Triggered SCR) and the PTSCR (P-Substrate Triggered SCR), all intended for the 5 V power clamp. In general, structures with a low trigger voltage have faster turn-on characteristics than other structures. All three ESD protection circuits achieve a low trigger voltage: the LVTSCR through its N+ bridge region, the ZTSCR through its Zener diode structure, and the PTSCR by increasing the trigger current. The turn-on time comparison was simulated with the Synopsys TCAD simulator. According to the simulation results, the LVTSCR has a turn-on time of 2.8 ns, the ZTSCR 2.1 ns and the PTSCR 2.4 ns. The HBM simulation results, however, show that the PTSCR is the most robust structure, reaching 430 K under the 8 kV HBM standard, compared with 450 K for the LVTSCR and 495 K for the ZTSCR. Therefore, the PTSCR is the most effective ESD protection circuit for the 5 V power clamp.

Keywords: ESD, SCR, turn-on time, trigger voltage, power clamp

Procedia PDF Downloads 330
17965 Exergy Analysis of a Green Dimethyl Ether Production Plant

Authors: Marcello De Falco, Gianluca Natrella, Mauro Capocelli

Abstract:

CO₂ capture and utilization (CCU) is a promising approach to reducing GHG (greenhouse gas) emissions, and many technologies in this field have recently been attracting attention. However, since CO₂ is a very stable compound, its utilization as a reagent is energy intensive. As a consequence, it is unclear whether CCU processes allow a net reduction of environmental impacts from a life cycle perspective and whether these solutions are sustainable. Among the tools available for quantifying the real environmental benefits of CCU technologies, exergy analysis is the most rigorous from a scientific point of view. The exergy of a system is the maximum obtainable work during a process that brings the system into equilibrium with its reference environment through a series of reversible processes in which the system can only interact with that environment. In other words, exergy is an 'opportunity for doing work' and, in real processes, it is destroyed by entropy generation. Exergy-based analysis is useful for evaluating the thermodynamic inefficiencies of processes, for understanding and locating the main consumption of fuels or primary energy, for providing an instrument for comparison among different process configurations, and for detecting solutions that reduce the energy penalties of a process. In this work, the exergy analysis of a process for the production of Dimethyl Ether (DME) from green hydrogen generated by an electrolysis unit and pure CO₂ captured from flue gas is performed. The model simulates the behavior of all the units composing the plant (electrolyzer, carbon capture section, DME synthesis reactor, purification step), with the aim of quantifying performance indices based on the Second Law of Thermodynamics and identifying the entropy generation points. A plant optimization strategy is then proposed to maximize the exergy efficiency.
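For reference, the Second-Law relations such an analysis typically relies on are the textbook definitions below (standard forms, not the paper's specific model or notation): specific flow exergy, the steady-state exergy destruction of a unit, and the exergy efficiency used to rank the plant units.

```latex
\begin{align*}
ex &= (h - h_0) - T_0\,(s - s_0) + ex^{\mathrm{ch}} \\
\dot{E}x_{\mathrm{dest}} &= T_0\,\dot{S}_{\mathrm{gen}}
   = \sum_{\mathrm{in}} \dot{m}\,ex - \sum_{\mathrm{out}} \dot{m}\,ex
     + \sum_k \left(1 - \tfrac{T_0}{T_k}\right)\dot{Q}_k - \dot{W} \\
\eta_{\mathrm{ex}} &= \frac{\dot{E}x_{\mathrm{out,useful}}}{\dot{E}x_{\mathrm{in}}}
   = 1 - \frac{\dot{E}x_{\mathrm{dest}} + \dot{E}x_{\mathrm{loss}}}{\dot{E}x_{\mathrm{in}}}
\end{align*}
```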

Keywords: green DME production, exergy analysis, energy penalties, exergy efficiency

Procedia PDF Downloads 228
17964 Geo-Collaboration Model between a City and Its Inhabitants to Develop Complementary Solutions for Better Household Waste Collection

Authors: Abdessalam Hijab, Hafida Boulekbache, Eric Henry

Abstract:

According to several research studies, the city as a whole is a complex, spatially organized system; its modeling must take into account several factors (socio-economic, political and geographical) acting at multiple scales of observation and over varied time frames. Sustainable management and protection of the environment in this complex system require significant human and technical investment, particularly for monitoring and maintenance. The objective of this paper is to propose an intelligent approach based on the coupling of Geographic Information System (GIS) and Information and Communications Technology (ICT) tools in order to integrate inhabitants into the processes of sustainable management and protection of the urban environment, specifically in the processes of household waste collection in urban areas. We are discussing a collaborative 'city/inhabitant' space. Indeed, it is a geo-collaborative approach based on the spatialization and real-time geo-localization of topological and multimedia data captured by the 'active' inhabitant, in the form of geo-localized alerts related to household waste issues in their city. Our proposal provides a good understanding of the extent to which civil society (the inhabitants) can help and contribute to the development of complementary solutions for the collection of household waste and the protection of the urban environment. Moreover, it allows the inhabitant to contribute to the enrichment of a data bank for future uses. Our geo-collaborative model will be tested in the Lamkansa sampling district of the city of Casablanca in Morocco.

Keywords: geographic information system, GIS, information and communications technology, ICT, geo-collaboration, inhabitants, city

Procedia PDF Downloads 98
17963 Exploring Digital Media’s Impact on Sports Sponsorship: A Global Perspective

Authors: Sylvia Chan-Olmsted, Lisa-Charlotte Wolter

Abstract:

With the continuous proliferation of media platforms, there have been tremendous changes in media consumption behaviors. From the perspective of sports sponsorship, while there is now a multitude of platforms on which to create brand associations, the changing media landscape and the shift of message control also mean that sports sponsors have to take into account the nature of, and consumer responses toward, these emerging digital media to devise effective marketing strategies. Utilizing a personal interview methodology, this study is qualitative and exploratory in nature. A total of 18 experts from European and American academia, the sports marketing industry, and sports leagues/teams were interviewed to address three main research questions: 1) What are the major changes in digital technologies that are relevant to sports sponsorship? 2) How have digital media influenced the channels and platforms of sports sponsorship? and 3) How have these technologies affected the goals, strategies, and measurement of sports sponsorship? The study found that sports sponsorship has moved from consumer engagement, engagement measurement, and the consequences of engagement on brand behaviors toward one-on-one micro-targeting; engagement by context, time, and space; and activation and leveraging based on tracking and databases. From the perspective of platforms and channels, the use of mobile devices is prominent during sports content consumption. Increasing multiscreen media consumption means that sports sponsors need to optimize their investment decisions in leagues, teams, or game-related content sources, as they need to go where the fans are most engaged. The study observed an imbalanced strategic leveraging of technology and digital infrastructure: while sports leagues have placed less emphasis on brand value management via technology, sports sponsors have been much more active in utilizing technologies such as mobile/LBS tools, big data/user information, real-time and programmatic marketing, and social media activation. Regardless of the new media/platforms, the study found that integration and contextualization are the two essential means of improving sports sponsorship effectiveness through technology; that is, sponsors must effectively integrate social media, mobile, and second-screen channels into their existing legacy media sponsorship plans so that technology works for the experience and message instead of distracting fans. Additionally, technological advancement and the attention economy amplify the importance of consumer data gathering, but sports consumer data do not, by themselves, mean loyalty or engagement. This study also affirms the benefit of digital media in offering viral and pre-event activations through storytelling well before the actual event, which is critical for leveraging brand associations before and after it; sponsors now have multiple opportunities and platforms to tell stories about their brands over a longer time period. In summary, digital media facilitate the fan experience, access to the brand message, multiplatform/channel presentations, storytelling, and content sharing. Nevertheless, rather than focusing on technology and media, today's sponsors need to define what they want to focus on in terms of content themes that connect with their brands and then identify the channels/platforms. The big challenge for sponsors is to play to each venue's and medium's specificity and its fit with the target audience, and not to deliver the same message uniformly in the same format on different platforms/channels.

Keywords: digital media, mobile media, social media, technology, sports sponsorship

Procedia PDF Downloads 282
17962 Comparative Effects of Resveratrol and Energy Restriction on Liver Fat Accumulation and Hepatic Fatty Acid Oxidation

Authors: Iñaki Milton-Laskibar, Leixuri Aguirre, Maria P. Portillo

Abstract:

Introduction: Energy restriction is an effective approach to preventing liver steatosis. However, due to social and economic reasons, among others, compliance with this treatment protocol is often very poor, especially in the long term. Resveratrol, a natural polyphenolic compound belonging to the stilbene group, has been widely reported to mimic the effects of energy restriction. Objective: To analyze the effects of resveratrol, under normoenergetic feeding conditions and under a mild energy restriction, on liver fat accumulation and hepatic fatty acid oxidation. Methods: 36 male six-week-old rats were fed a high-fat high-sucrose diet for 6 weeks in order to induce steatosis. The rats were then divided into four groups and fed a standard diet for 6 additional weeks: control group (C), resveratrol group (RSV, resveratrol 30 mg/kg/d), restricted group (R, 15% energy restriction) and combined group (RR, 15% energy restriction and resveratrol 30 mg/kg/d). Liver triacylglycerol (TG) and total cholesterol contents were measured using commercial kits. Carnitine palmitoyltransferase 1a (CPT 1a) and citrate synthase (CS) activities were measured spectrophotometrically. TFAM (mitochondrial transcription factor A) and peroxisome proliferator-activated receptor alpha (PPARα) protein contents, as well as the ratio of acetylated peroxisome proliferator-activated receptor gamma coactivator 1-alpha (PGC1α) to total PGC1α, were analyzed by Western blot. Statistical analysis was performed using one-way ANOVA and the Newman-Keuls post-hoc test. Results: No differences were observed among the four groups regarding liver weight and cholesterol content, but the three treated groups showed reduced TG compared to the control group, with the restricted groups showing the lowest values (no differences between them). Higher CPT 1a and CS activities were observed in the groups supplemented with resveratrol (RSV and RR), with no difference between them. The acetylated PGC1α/total PGC1α ratio was lower in the treated groups (RSV, R and RR) than in the control group, with no differences among them. As far as TFAM protein expression is concerned, only the RR group reached a higher value. Finally, no changes were observed in PPARα protein expression. Conclusions: Resveratrol administration is an effective intervention for reducing liver triacylglycerol content, but a mild energy restriction is even more effective. The mechanisms of action of these two strategies are different: resveratrol, but not energy restriction, seems to act by increasing fatty acid oxidation, although mitochondriogenesis does not seem to be induced. When both treatments (resveratrol administration and a mild energy restriction) were combined, no additive or synergistic effects were observed. Acknowledgements: MINECO-FEDER (AGL2015-65719-R), Basque Government (IT-572-13), University of the Basque Country (ELDUNANOTEK UFI11/32), Institute of Health Carlos III (CIBERobn). Iñaki Milton is a fellowship holder of the Basque Government.

Keywords: energy restriction, fat, liver, oxidation, resveratrol

Procedia PDF Downloads 200
17961 Synthesis and in vitro Evaluation of Quinazolines as Potent EGFR Inhibitors

Authors: Vinaya Kambappa, Chinnadurai Mani, Komaraiah Palle

Abstract:

Non-small cell lung cancer (NSCLC) cells have increased expression of EGFR, which makes it a potential target for cancer therapy. Based on molecular docking and previous reports, we designed and synthesized quinazoline derivatives as potent EGFR inhibitors. Among the derivatives, three compounds showed good antiproliferative activity against A-549 and H-1299 cells. Furthermore, these compounds inhibited EGFR signaling, diminishing p-EGFR and its downstream proteins such as p-Akt, p-Erk1/2, and p-mTOR; however, they did not alter the levels of total EGFR, Akt, Erk1/2 and mTOR proteins. Flow cytometric analysis indicated an accumulation of cells in the G1 phase, suggesting induction of apoptosis, which was further confirmed by annexin V/propidium iodide staining. Our study suggests that the quinazoline scaffold can be developed into novel EGFR kinase inhibitors for cancer therapy.

Keywords: apoptosis, non-small cell-lung cancer cells, EGFR, quinazoline

Procedia PDF Downloads 171
17960 Consequences of Transformation of Modern Monetary Policy during the Global Financial Crisis

Authors: Aleksandra Szunke

Abstract:

Monetary policy is an important pillar of the economy, directly affecting the condition of the banking sector. Depending on the strategy, it may both support the functioning of banking institutions and limit their excessively risky activities. The literature includes a large number of publications characterizing the initiatives implemented by central banks during the global financial crisis and the potential effects of the use of non-standard monetary policy instruments. However, the empirical evidence about their effects and real consequences for financial markets is still not conclusive. Even before the escalation of instability, Bernanke, Reinhart, and Sack (2004) analyzed the effectiveness of various unconventional monetary tools in lowering long-term interest rates in the United States and Japan. The results obtained largely confirmed the effectiveness of the zero-interest-rate policy and Quantitative Easing (QE) in achieving the goal of reducing long-term interest rates. Japan, considered the precursor of QE policy, also conducted research on the consequences of the non-standard instruments implemented to restore the country's financial stability. Although the literature on the effectiveness of Quantitative Easing in Japan is extensive, it does not clearly establish whether it brought permanent effects. The main aim of the study is to identify the implications of the non-standard monetary policy implemented by selected central banks (the Federal Reserve System, the Bank of England and the European Central Bank), paying particular attention to the consequences in three areas: the size of the money supply, financial markets, and the real economy.

Keywords: consequences of modern monetary policy, quantitative easing policy, banking sector instability, global financial crisis

Procedia PDF Downloads 460
17959 A Communication Signal Recognition Algorithm Based on Holder Coefficient Characteristics

Authors: Hui Zhang, Ye Tian, Fang Ye, Ziming Guo

Abstract:

Communication signal modulation recognition is one of the key technologies in the field of modern information warfare. At present, automatic modulation recognition methods for communication signals fall into two major categories: maximum likelihood hypothesis testing based on decision theory, and statistical pattern recognition based on feature extraction. The statistical pattern recognition approach, which comprises feature extraction and classifier design, is now the most commonly used. With the increasingly complex electromagnetic environment of communications, how to effectively extract the features of various signals at low signal-to-noise ratio (SNR) is a hot topic for scholars in various countries. To solve this problem, this paper proposes a feature extraction algorithm for communication signals based on improved Holder cloud features, and an extreme learning machine (ELM) is used to classify the extracted features, addressing the real-time requirements of modern warfare. The algorithm extracts the digital features of the improved cloud model without deterministic information in a low-SNR environment and uses the improved cloud model to obtain more stable Holder cloud features, so the performance of the algorithm is improved. This addresses the problem that a simple feature extraction algorithm based on the Holder coefficient alone is difficult to use for recognition at low SNR, and it also achieves better recognition accuracy. Simulation results show that the approach still classifies well at low SNR; even when the SNR is -15 dB, the recognition accuracy reaches 76%.
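As a hedged illustration of the underlying feature, a basic Holder coefficient between a signal-derived sequence and a reference sequence can be computed as below; the paper's improved cloud-model step is not reproduced, and the reference sequences and BPSK-like example are illustrative assumptions.

```python
import numpy as np

def holder_coefficient(f, s, p=2.0):
    """Basic Holder-coefficient feature (sketch), based on the Holder
    inequality with 1/p + 1/q = 1; `s` is a reference sequence, e.g. a
    rectangular or triangular window."""
    q = p / (p - 1.0)
    f = np.abs(np.asarray(f, dtype=float))
    s = np.abs(np.asarray(s, dtype=float))
    num = np.sum(f * s)
    den = np.sum(f ** p) ** (1.0 / p) * np.sum(s ** q) ** (1.0 / q)
    return num / max(den, 1e-12)

# Example: spectrum of a noisy BPSK-like signal vs. two reference sequences
rng = np.random.default_rng(3)
n = 1024
signal = np.sign(rng.normal(size=n)) + 0.5 * rng.normal(size=n)
spectrum = np.abs(np.fft.fft(signal))[: n // 2]
rect = np.ones_like(spectrum)
tri = np.bartlett(spectrum.size)
features = [holder_coefficient(spectrum, rect), holder_coefficient(spectrum, tri)]
print(features)
```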

Keywords: communication signal, feature extraction, Holder coefficient, improved cloud model

Procedia PDF Downloads 136
17958 Impact of Artificial Intelligence Technologies on Information-Seeking Behaviors and the Need for a New Information Seeking Model

Authors: Mohammed Nasser Al-Suqri

Abstract:

Former information-seeking models were proposed more than two decades ago. These existing models predate the digital information era and Artificial Intelligence (AI) technologies. The lack of current information-seeking models within Library and Information Studies has limited advances in teaching students about information-seeking behaviors and in the design of library tools and services. To address these concerns, this study aims to propose a state-of-the-art model focusing on the information-seeking behavior of library users in the Sultanate of Oman. The study aims to develop, design and contextualize a real-time, user-centric information-seeking model capable of supporting information needs and information use, while incorporating critical insights for digital library practice. Another aim is to establish a far-sighted, state-of-the-art frame of reference covering AI while synthesizing digital resources and information to optimize information-seeking behavior. The proposed study is empirically designed around a mixed-method process flow, technical surveys, in-depth interviews, focus group evaluations and stakeholder investigations. The study data pool consists of users and specialist LIS staff at 4 public libraries and 26 academic libraries in Oman. The designed research model is expected to support LIS by providing multi-dimensional insights, with AI integration, for redefining the information-seeking process and developing a technology-rich model.

Keywords: artificial intelligence, information seeking, information behavior, information seeking models, libraries, Sultanate of Oman

Procedia PDF Downloads 101
17957 Internet of Things for Smart Dedicated Outdoor Air System in Buildings

Authors: Dararat Tongdee, Surapong Chirarattananon, Somchai Maneewan, Chantana Punlek

Abstract:

Recently, the Internet of Things (IoT) has become an important technology that connects devices to the network and gives people access to real-time communication. This technology is used to collect, report, and analyze big data for a given purpose. For a smart building, there are many IoT technologies that enable management and building operators to improve occupant thermal comfort, indoor air quality, and building energy efficiency. In this research, we propose monitoring and controlling the performance of a smart dedicated outdoor air system (SDOAS) based on an IoT platform. The SDOAS was specifically designed with a desiccant unit and a thermoelectric module. The designed system was intended to monitor, notify, and control indoor environmental factors such as temperature, humidity, and carbon dioxide (CO₂) level. The SDOAS was tested against the American Society of Heating, Refrigerating and Air-Conditioning Engineers standard (ASHRAE 62.2) and indoor air quality standards. The system notifies the user through a Blynk notification when the building status is uncomfortable or when the tolerable limits that were set are reached. The user can then control the system via the Blynk application on a smartphone. The experimental results indicate that the temperature and humidity of indoor fresh air in the comfort zone are approximately 26 degrees Celsius and 58%, respectively. Furthermore, the CO₂ level was controlled below 1000 ppm, in line with the indoor air quality standard. Therefore, the proposed system works efficiently and is easy to use in buildings.
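A hedged sketch of the monitoring and notification logic only, with the thresholds taken from the abstract; the sensor reads, Blynk notification and ventilation control are replaced by hypothetical placeholder functions (the real system uses physical sensors and the Blynk service):

```python
CO2_LIMIT_PPM = 1000                     # from the indoor air quality standard
TEMP_COMFORT_C = (24.0, 28.0)            # comfort band around ~26 C (assumed)
RH_COMFORT_PCT = (50.0, 65.0)            # comfort band around ~58% RH (assumed)

def read_sensors():                      # placeholder for real sensor drivers
    return {"temp_c": 26.3, "rh_pct": 58.0, "co2_ppm": 1040}

def notify(message):                     # placeholder for a Blynk push notification
    print("NOTIFY:", message)

def control_ventilation(on):             # placeholder for the SDOAS actuator
    print("ventilation", "ON" if on else "OFF")

def run_once():
    s = read_sensors()
    out_of_comfort = not (TEMP_COMFORT_C[0] <= s["temp_c"] <= TEMP_COMFORT_C[1]
                          and RH_COMFORT_PCT[0] <= s["rh_pct"] <= RH_COMFORT_PCT[1])
    if s["co2_ppm"] > CO2_LIMIT_PPM or out_of_comfort:
        notify(f"Indoor air outside set limits: {s}")
        control_ventilation(True)
    else:
        control_ventilation(False)

if __name__ == "__main__":
    run_once()                           # in practice, looped on a fixed interval
```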

Keywords: internet of things, indoor air quality, smart dedicated outdoor air system, thermal comfort

Procedia PDF Downloads 185
17956 Monitorization of Junction Temperature Using a Thermal-Test-Device

Authors: B. Arzhanov, A. Correia, P. Delgado, J. Meireles

Abstract:

Due to the higher power loss levels in electronic components, the thermal design of the PCBs (Printed Circuit Boards) of an assembled device has become one of the most important quality factors in electronics. Some of the leading causes of microelectronic component failure are high temperatures, leakage and thermo-mechanical stress, which makes the reliability of microelectronic packages a major concern. This article presents an experimental approach to measuring the junction temperature of exposed pad packages. The implemented solution is in a prototype phase, using a temperature-sensitive parameter (TSP) to measure temperature directly on the die and validating the numerical results provided by Mechanical APDL (Ansys Parametric Design Language) under the same conditions. The physical device under test is composed of a Thermal Test Chip (TTC-1002) assembled in a QFN cavity and soldered to a test board according to JEDEC standards. Monitoring the voltage drop across a forward-biased diode is an indirect but accurate method of obtaining the junction temperature of the QFN component over an applied power range of 0.3 W to 1.5 W. The temperature distributions on the PCB test board and on the QFN cavity surface were monitored by an infrared thermal camera (Goby-384), controlled and with images processed by the Xeneth software. The article provides a setup to monitor in real time the junction temperature of ICs, namely devices with an exposed pad package (i.e., QFN). It presents the PCB layout parameters that the designer should use to improve thermal performance and evaluates the impact of voids in the solder interface on the device junction temperature.
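The TSP conversion described above is typically a two-step procedure: calibrate the diode's temperature sensitivity (K-factor) at a small constant sense current, then map the measured forward voltage to junction temperature. Below is a hedged sketch with illustrative calibration values (not the authors' measured data):

```python
import numpy as np

def calibrate_k_factor(temps_c, vf_volts):
    """JESD51-style sketch: at a small constant sense current, the diode
    forward voltage drops roughly linearly with temperature, so a linear fit
    gives K = -dVf/dT (typically around 2 mV/degC)."""
    slope, _ = np.polyfit(np.asarray(temps_c), np.asarray(vf_volts), 1)
    return -slope

def junction_temperature(vf_measured, vf_ref, t_ref_c, k_factor):
    """Tj = T_ref + (Vf_ref - Vf_measured) / K."""
    return t_ref_c + (vf_ref - vf_measured) / k_factor

# Illustrative calibration data (25-100 degC, ~ -2 mV/degC)
temps = [25.0, 50.0, 75.0, 100.0]
vfs = [0.650, 0.600, 0.550, 0.500]
k = calibrate_k_factor(temps, vfs)                       # ~0.002 V/degC
print(junction_temperature(vf_measured=0.584, vf_ref=0.650, t_ref_c=25.0, k_factor=k))
```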

Keywords: quad flat no-Lead packages, exposed pads, junction temperature, thermal management and measurements

Procedia PDF Downloads 274
17955 The Underestimation of Cultural Risk in the Execution of Megaprojects

Authors: Alan Walsh, Peter Walker, Michael Ellis

Abstract:

There is a real danger that both practitioners and researchers considering the risks associated with megaprojects ignore or underestimate the impacts of cultural risk. This paper investigates the potential impacts of a failure to achieve cultural unity between the principal actors executing a megaproject. The principal relationships include those between the principal contractors and the project stakeholders, or between the project stakeholders and their principal advisors, Western consultants. The study confirms that cultural dissonance between these parties can delay or disrupt megaproject execution and examines why cultural issues should be prioritized as a significant risk factor in megaproject delivery. The paper addresses the practical impacts and the potential mitigation measures that may reduce cultural dissonance during a megaproject's delivery. This information is drawn from ongoing case studies of live infrastructure megaprojects in Europe and the Middle East's GCC states, from the Western consultants' perspective. The collaborating researchers each have at least 30 years of construction experience and are engaged in architecture, project management and contract management, dealing with megaprojects in Europe or the GCC. After examining the cultural interfaces they have observed during the execution of megaprojects, they conclude that, globally, culture significantly influences efficient delivery. The study finds that cultural risk is ever-present where different nationalities co-manage megaprojects, and that cultural conflict poses a real threat to the timely delivery of megaprojects. It indicates that the higher the cultural distance between the principal actors, the more pronounced the risk, with the risk of cultural dissonance more prominent in GCC megaprojects. The findings support a more culturally aware and cohesive team approach and recommend cross-cultural training to mitigate the effects of cultural disparity.

Keywords: cultural risk underestimation, cultural distance, megaproject characteristics, megaproject execution

Procedia PDF Downloads 96