Search results for: spare coding
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 616

586 Spare Part Carbon Footprint Reduction with Reman Applications

Authors: Enes Huylu, Sude Erkin, Nur A. Özdemir, Hatice K. Güney, Cemre S. Atılgan, Hüseyin Y. Altıntaş, Aysemin Top, Muammer Yılman, Özak Durmuş

Abstract:

Remanufacturing (reman) applications allow manufacturers to contribute to the circular economy by reintroducing products of almost the same quality, at lower cost and with less environmental impact. The objective of this study is to show that the carbon footprint of automotive spare parts used in vehicles can be reduced by reman applications, based on a Life Cycle Assessment framed by ISO 14040 principles. The study set out to investigate reman applications for 21 parts in total. So far, research and calculations have been completed for the alternator, turbocharger, starter motor, compressor, manual transmission, automatic transmission, and DPF (diesel particulate filter) parts. Since Ford Motor Company and Ford OTOSAN aim to achieve net zero in line with Science-Based Targets (SBT) and the European Green Deal, which sets out to make the EU climate neutral by 2050, the effects of reman applications were researched. First, remanufacturing articles available in the literature were surveyed, prioritised by the yearly volume of spare parts sold. Based on the review results concerning material composition and the emissions released during the original production and remanufacturing phases, a base part was selected as a reference. The data for the selected base part were then used to make an approximate estimate of the carbon footprint reduction of the corresponding part used in Ford OTOSAN. The estimation model is based on the weight and material composition reported in the referenced remanufacturing study. This study shows that remanufacturing is technically and environmentally feasible, since it significantly reduces the emissions released during the production phase of vehicle components. The research and calculations for the total number of targeted products in yearly volume have therefore been completed to a large extent.
Thus, based on the targeted parts whose research has been completed, and in line with the 2050 net zero targets of Ford Motor Company and Ford OTOSAN, a significant share of the greenhouse gas (GHG) emissions associated with vehicle spare parts can be avoided if remanufacturing is preferred over current production methods. Moreover, by reusing automotive components, remanufacturing reduces the waste stream and causes less pollution than making products from raw materials.

Keywords: greenhouse gas emissions, net zero targets, remanufacturing, spare parts, sustainability

Procedia PDF Downloads 47
585 Cognitive STAP for Airborne Radar Based on Slow-Time Coding

Authors: Fanqiang Kong, Jindong Zhang, Daiyin Zhu

Abstract:

Space-time adaptive processing (STAP) techniques have emerged as a key enabling technology for advanced airborne radar applications. In this paper, the notion of cognitive radar is extended to the STAP technique, and cognitive STAP is discussed. The principle for improving the signal-to-clutter-plus-noise ratio (SCNR) based on slow-time coding is given, and the corresponding optimization algorithm, based on cyclic and power-like algorithms, is presented. Numerical examples show the effectiveness of the proposed method.

Keywords: space-time adaptive processing (STAP), airborne radar, signal-to-clutter ratio, slow-time coding

Procedia PDF Downloads 243
584 Meteosat Second Generation Image Compression Based on the Radon Transform and Linear Predictive Coding: Comparison and Performance

Authors: Cherifi Mehdi, Lahdir Mourad, Ameur Soltane

Abstract:

Image compression is used to reduce the number of bits required to represent an image. The Meteosat Second Generation (MSG) satellite acquires 12 image files every 15 minutes, which results in large database sizes. The transform selected for image compression should contribute to reducing the data representing the images. The Radon transform retrieves the Radon points that represent the sum of the pixels along each direction at a given angle. Linear predictive coding (LPC) with filtering provides good decorrelation of the Radon points, using a predictor built from the Symmetric Nearest Neighbour (SNN) filter coefficients, which introduces losses during decompression. Finally, Run Length Coding (RLC) gives a high, fixed compression ratio regardless of the input image. In this paper, a novel image compression method for MSG images based on the Radon transform and LPC is proposed. MSG image compression based on the Radon transform and LPC provides a good compromise between compression and reconstruction quality. Our method is compared with three others, two based on the DCT and one on DWT bi-orthogonal filtering, to show the robustness of the Radon transform against quantization noise and to evaluate the performance of our method. Evaluation criteria such as PSNR and the compression ratio demonstrate the efficiency of our compression method.
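As a concrete illustration of the final stage mentioned above, a minimal run-length coding sketch (function names are illustrative, not taken from the paper):

```python
def rle_encode(pixels):
    """Collapse a pixel sequence into (value, run_length) pairs."""
    runs = []
    for value in pixels:
        if runs and runs[-1][0] == value:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([value, 1])   # start a new run
    return [tuple(r) for r in runs]

def rle_decode(runs):
    """Expand (value, run_length) pairs back into the pixel sequence."""
    out = []
    for value, count in runs:
        out.extend([value] * count)
    return out
```

Run-length coding pays off on long constant runs, such as the near-zero residuals left after a predictive coding stage.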

Keywords: image compression, radon transform, linear predictive coding (LPC), run length coding (RLC), meteosat second generation (MSG)

Procedia PDF Downloads 388
583 A Qualitative Study to Analyze Clinical Coders’ Decision Making Process of Adverse Drug Event Admissions

Authors: Nisa Mohan

Abstract:

Clinical coding is a feasible method for estimating the national prevalence of adverse drug event (ADE) admissions. However, under-coding of ADE admissions is a limitation of this method. Whilst under-coding impairs accurate estimation of the actual burden of ADEs, coded data remain far more feasible for estimating ADE admissions than other methods. It is therefore necessary to know the reasons for the under-coding in order to improve the clinical coding of ADE admissions. Identifying those reasons rests on understanding the decision-making process involved in coding ADE admissions. Hence, the current study aimed to explore the decision-making process of clinical coders when coding cases of ADE admissions. Clinical coders at different levels, from trainee to intermediate and advanced, were purposively selected for the interviews. Thirteen clinical coders were recruited from two Auckland region District Health Board hospitals. Semi-structured, one-on-one, face-to-face interviews using open-ended questions were conducted with the selected clinical coders. Interviews were about 20 to 30 minutes long and were audio-recorded with the approval of the participants. The interview data were analysed using a general inductive approach. The interviews revealed that coders have targets to meet and sometimes hesitate to adhere to the coding standards, deviating from the standard coding processes when making a decision. Coders avoid contacting doctors to clarify small doubts, such as confirming an ADE or the name of a medication, because of the delay in getting a reply; they prefer to do some research themselves or seek help from seniors and colleagues instead.
Coders also tend to think of an ADE as a small thing. Lack of time to search for information to confirm an ADE admission, inadequate communication with clinicians, and coders’ belief that an ADE is a small thing may all contribute to the under-coding of ADE admissions. These findings suggest that further work is needed on interventions to improve the clinical coding of ADE admissions. Educating coders about the importance of ADEs, educating clinicians about the importance of clear and confirmed medical record entries, making pharmacists’ services available to improve the detection and clear documentation of ADE admissions, and including a mandatory field in the discharge summary for external causes of disease may all help improve the clinical coding of ADE admissions. The findings of the research will help policymakers make informed decisions about such improvements. This study urges coding policymakers, auditors, and trainers to engage with the unconscious cognitive biases and shortcuts of clinical coders. Although conducted in New Zealand, this research may also benefit other countries by providing insight into the clinical coding of ADE admissions and guidance about where to focus change and improvement initiatives.

Keywords: adverse drug events, clinical coders, decision making, hospital admissions

Procedia PDF Downloads 91
582 Unraveling the Threads of Madness: Henry Russell’s 'The Maniac' as an Advocate for Deinstitutionalization in the Nineteenth Century

Authors: T. J. Laws-Nicola

Abstract:

Henry Russell was best known as a composer of more than 300 songs. Many of his compositions were popular both for their sentimental texts, as in ‘The Old Armchair,’ and for those of a more political nature, such as ‘Woodman, Spare That Tree!’ Indeed, Russell wrote songs of advocacy associated with abolitionism (‘The Slave Ship’) and environmentalism (‘Woodman, Spare That Tree!’). ‘The Maniac’ is his only composition addressing the issue of institutionalization. Its text is borrowed and adapted from the monodrama The Captive by M. G. ‘Monk’ Lewis. Through an analysis of form, harmony, melody, text, and thematic development, and of the interactions between text and music, we can approach a clearer understanding of ‘The Maniac.’ Select periodicals, such as The London Times, provide contemporary critical reviews of ‘The Maniac.’ Additional nineteenth-century songs whose texts focus on madness and/or institutionalization help build a stylistic and cultural context for the piece. Through comparative analysis of ‘The Maniac’ against a body of songs on similar topics, we can approach a clear understanding of the song as a vehicle for deinstitutionalization.

Keywords: 19th century song, institutionalization, M. G. Lewis, Henry Russell

Procedia PDF Downloads 503
581 Performance Analysis and Comparison of Various 1-D and 2-D Prime Codes for OCDMA Systems

Authors: Gurjit Kaur, Shashank Johri, Arpit Mehrotra

Abstract:

In this paper, we analyze and compare the performance of various coding schemes. Basic 1D prime sequence codes are unique in only one dimension, i.e., time slots, whereas 2D coding techniques are unique not only in their time slots but also in their wavelengths. In this research, we evaluate and compare, on a single platform, the performance of 1D and 2D coding techniques constructed using prime sequence coding patterns for an OCDMA system. Results show that the 1D Extended Prime Code (EPC) can support more active users than the other codes, but at the expense of a larger code length, which increases the complexity of the code. The Modified Prime Code (MPC) supports fewer active users at λc = 2 but has a smaller code length than the 1D prime code. The analysis shows that 2D prime codes support fewer active users than 1D codes but have a larger code family and are the most secure of the codes compared. The performance of all these codes is analyzed on the basis of the number of active users supported at a Bit Error Rate (BER) of 10^-9.
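A minimal sketch of the classic 1D prime sequence code construction for a prime p (my own illustration; the exact code families evaluated in the paper may differ): the prime sequence s_i(j) = (i·j) mod p places one pulse in each of p groups of p chips, giving code words of length p² whose in-phase cross-correlation between distinct words is exactly 1.

```python
def prime_code(p, i):
    """i-th prime code word for prime p: a 0/1 chip sequence of
    length p*p with exactly one pulse in each group of p chips."""
    word = [0] * (p * p)
    for j in range(p):
        word[j * p + (i * j) % p] = 1
    return word

def cross_correlation(a, b):
    """In-phase correlation: chip positions where both words pulse."""
    return sum(x & y for x, y in zip(a, b))
```

Larger primes give longer code words (lower throughput per chip) but more simultaneous users, which is the 1D trade-off the abstract describes.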

Keywords: CDMA, OCDMA, BER, OOC, PC, EPC, MPC, 2-D PC/PC, λc, λa

Procedia PDF Downloads 479
580 Secure Network Coding-Based Named Data Network Mutual Anonymity Transfer Protocol

Authors: Tao Feng, Fei Xing, Ye Lu, Jun Li Fang

Abstract:

NDN is a kind of future Internet architecture. Because the NDN design introduces four privacy challenges, many research institutions have begun to study the privacy issues of Named Data Networking (NDN). In this paper, we investigate privacy protection with respect to the major NDN privacy issues and put forward a more effective anonymous transfer policy for NDN. First, based on mutual anonymity communication for MP2P networks, we propose an NDN mutual anonymity protocol. Second, we add an interest packet authentication mechanism to the protocol and encrypt the coding coefficients, which improves the security of the protocol. Finally, we prove the security and anonymity of the proposed anonymous transfer protocol.

Keywords: NDN, mutual anonymity, anonymous routing, network coding, authentication mechanism

Procedia PDF Downloads 417
579 Predictive Modelling Approach to Identify Spare Parts Inventory Obsolescence

Authors: Madhu Babu Cherukuri, Tamoghna Ghosh

Abstract:

Factory supply chain management spends billions of dollars every year to procure and manage equipment spare parts. Due to technology and process changes, some of these spares become obsolete, or dead, inventory. Factories accumulate dead inventory worth millions of dollars over time, owing to the lack of a scientific methodology to identify it and return it to the suppliers on a timely basis. The standard approach followed across industries is: if a part is not used for a pre-defined period of time, it is declared dead. This leads to the accumulation of dead parts over time, and these parts cannot be sold back to the suppliers, as by then it is too late under the contract agreement. Our main idea is that the time period for declaring a part dead cannot be a fixed pre-defined duration across all parts. Rather, it should depend on various properties of the part, such as its historical consumption pattern, its type, how many machines it is used in, and whether it is a preventive maintenance part. We have designed a predictive algorithm which predicts part obsolescence well in advance with reasonable accuracy and which can help save millions.
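The part properties listed above could feed a per-part risk score rather than a fixed cut-off. A toy sketch of that idea (the features, weights, and function name are my own illustration, not the authors’ fitted model):

```python
def obsolescence_risk(days_since_last_issue, annual_issue_rate,
                      machines_using_part, is_preventive_maintenance):
    """Toy risk score in [0, 1]: higher means more likely dead stock.

    Illustrative weights: long idleness dominates, low usage and few
    host machines add risk, preventive-maintenance parts get a bonus.
    """
    score = 0.0
    score += min(days_since_last_issue / 1825, 1.0) * 0.5   # cap at ~5 years
    score += (1.0 / (1.0 + annual_issue_rate)) * 0.25       # rarely issued
    score += (1.0 / (1.0 + machines_using_part)) * 0.15     # few host machines
    score += 0.0 if is_preventive_maintenance else 0.10     # PM parts recur
    return round(score, 3)
```

A real model would learn such weights from historical write-off data instead of fixing them by hand.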

Keywords: obsolete inventory, machine learning, big data, supply chain analytics, dead inventory

Procedia PDF Downloads 289
578 A Study on Using Network Coding for Packet Transmissions in Wireless Sensor Networks

Authors: Rei-Heng Cheng, Wen-Pinn Fang

Abstract:

A wireless sensor network (WSN) is composed of a large number of sensors and one or a few base stations, where the sensors are responsible for detecting specific event information, which is sent back to the base station(s). However, how to reduce energy consumption in order to extend the network lifetime is a problem that cannot be ignored in wireless sensor networks. Since the sensor network is used to monitor a region or specific events, reliably sending the information back to the base station is clearly important. Network coding is often used to enhance the reliability of network transmission. When a node needs to send out M data packets, it encodes these data with redundant data and sends out M + R packets in total. If the receiver can get any M packets out of these M + R packets, it can decode them and recover the original M data packets. Transmitting redundant packets, however, results in excess energy consumption. This paper explores the relationship between the quality of wireless transmission and the number of redundant packets, so that each sensor can overhear nearby transmissions, learn the wireless transmission quality around it, and dynamically determine the number of redundant packets used in network coding.
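The recover-any-M-of-M+R property described above is that of an erasure code; the general case needs Reed-Solomon or random linear network coding, but the simplest instance, R = 1 with a single XOR parity packet, can be sketched as follows (function names are illustrative):

```python
def xor_all(packets):
    """Bytewise XOR of a list of equal-length byte strings."""
    out = bytes(len(packets[0]))
    for p in packets:
        out = bytes(a ^ b for a, b in zip(out, p))
    return out

def encode(data_packets):
    """M data packets -> M+1 coded packets (single XOR parity, R = 1)."""
    return data_packets + [xor_all(data_packets)]

def recover_missing(received_m_packets):
    """Any M of the M+1 packets suffice: XORing the survivors
    reproduces the one missing packet (data or parity alike)."""
    return xor_all(received_m_packets)
```

Raising R beyond 1 buys tolerance to more losses at the cost of more transmissions, which is exactly the energy/reliability trade-off the paper studies.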

Keywords: energy consumption, network coding, transmission reliability, wireless sensor networks

Procedia PDF Downloads 362
577 Reliability of Clinical Coding in Accurately Estimating the Actual Prevalence of Adverse Drug Event Admissions

Authors: Nisa Mohan

Abstract:

Adverse drug event (ADE) related hospital admissions are common among older people. The first step in prevention is accurately estimating the prevalence of ADE admissions. Clinical coding is an efficient method to estimate the prevalence of ADE admissions. The objective of the study is to estimate the rate of under-coding of ADE admissions in older people in New Zealand and to explore how clinical coders decide whether or not to code an admission as an ADE. There has not been any research in New Zealand to explore these areas. This study is done using a mixed-methods approach. Two common and serious ADEs in older people, namely bleeding and hypoglycaemia were selected for the study. In study 1, eight hundred medical records of people aged 65 years and above who are admitted to hospital due to bleeding and hypoglycemia during the years 2015 – 2016 were selected for quantitative retrospective medical records review. This selection was made to estimate the proportion of ADE-related bleeding and hypoglycemia admissions that are not coded as ADEs. These files were reviewed and recorded as to whether the admission was caused by an ADE. The hospital discharge data were reviewed to check whether all the ADE admissions identified in the records review were coded as ADEs, and the proportion of under-coding of ADE admissions was estimated. In study 2, thirteen clinical coders were selected to conduct qualitative semi-structured interviews using a general inductive approach. Participants were selected purposively based on their experience in clinical coding. Interview questions were designed in a way to investigate the reasons for the under-coding of ADE admissions. The records review study showed that 35% (Cl 28% - 44%) of the ADE-related bleeding admissions and 22% of the ADE-related hypoglycemia admissions were not coded as ADEs. Although the quality of clinical coding is high across New Zealand, a substantial proportion of ADE admissions were under-coded. 
This shows that clinical coding might under-estimate the actual prevalence of ADE related hospital admissions in New Zealand. The interviews with the clinical coders added that lack of time for searching for information to confirm an ADE admission, inadequate communication with clinicians, along with coders’ belief that an ADE is a small thing might be the potential reasons for the under-coding of the ADE admissions. This study urges the coding policymakers, auditors, and trainers to engage with the unconscious cognitive biases and short-cuts of the clinical coders. These results highlight that further work is needed on interventions to improve the clinical coding of ADE admissions, such as providing education to coders about the importance of ADEs, education to clinicians about the importance of clear and confirmed medical records entries, availing pharmacist service to improve the detection and clear documentation of ADE admissions and including a mandatory field in the discharge summary about external causes of diseases.

Keywords: adverse drug events, bleeding, clinical coders, clinical coding, hypoglycemia

Procedia PDF Downloads 106
576 Analysis of Cooperative Hybrid ARQ with Adaptive Modulation and Coding on a Correlated Fading Channel Environment

Authors: Ibrahim Ozkan

Abstract:

In this study, a cross-layer design which combines adaptive modulation and coding (AMC) and hybrid automatic repeat request (HARQ) techniques for a cooperative wireless network is investigated analytically. Previous analyses of such systems in the literature are confined to the case where the fading channel is independent at each retransmission, which can be unrealistic unless the channel varies very fast. On the other hand, temporal channel correlation can have a significant impact on the performance of HARQ systems. In this study, utilizing a Markov channel model which accounts for the temporal correlation, the performance of non-cooperative and cooperative networks is investigated in terms of packet loss rate and throughput metrics for the Chase combining HARQ strategy.
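A temporally correlated packet channel of the kind this analysis assumes can be simulated with a two-state (Gilbert-Elliott style) Markov model; the transition and loss probabilities below are illustrative, not taken from the paper:

```python
import random

def gilbert_elliott(n, p_gb, p_bg, loss_good=0.01, loss_bad=0.5, seed=0):
    """Simulate n packet transmissions over a two-state Markov channel.

    p_gb: P(good -> bad) per step; p_bg: P(bad -> good) per step.
    Returns a list of booleans, True meaning the packet was lost.
    """
    rng = random.Random(seed)       # fixed seed for reproducibility
    state_bad = False
    losses = []
    for _ in range(n):
        if state_bad:
            if rng.random() < p_bg:
                state_bad = False
        else:
            if rng.random() < p_gb:
                state_bad = True
        loss_p = loss_bad if state_bad else loss_good
        losses.append(rng.random() < loss_p)
    return losses
```

Because losses cluster in the bad state, consecutive HARQ retransmissions see correlated conditions, unlike the independent-fading assumption the paper argues against.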

Keywords: cooperative network, adaptive modulation and coding, hybrid ARQ, correlated fading

Procedia PDF Downloads 110
575 Motion Estimator Architecture with Optimized Number of Processing Elements for High Efficiency Video Coding

Authors: Seongsoo Lee

Abstract:

Motion estimation is the heaviest computation in HEVC (High Efficiency Video Coding). Many fast algorithms, such as TZS (test zone search), have been proposed to reduce this computation. Still, the huge computation of motion estimation is a critical issue in the implementation of an HEVC video codec. In this paper, a motion estimator architecture with an optimized number of PEs (processing elements) is presented, exploiting early termination. It also reduces hardware size by exploiting parallel processing. The presented motion estimator architecture has 8 PEs and can efficiently perform TZS with very high PE utilization.

Keywords: motion estimation, test zone search, high efficiency video coding, processing element, optimization

Procedia PDF Downloads 333
574 Strategy of Inventory Analysis with Economic Order Quantity and Quick Response: Case on Filter Inventory for Heavy Equipment in Indonesia

Authors: Lim Sanny, Felix Christian

Abstract:

The use of heavy equipment in Indonesia is continually increasing, and cost reduction in the procurement of spare parts is the aim of the company. The spare parts considered in this research are filters. As a first step, the priority filters are chosen for further study using ABC analysis. To find future demand for these filters, this research forecasts demand using the QM for Windows software. The best inventory control method for each kind of filter is then found by comparing the total cost of the Economic Order Quantity and Quick Response inventory methods. For the three kinds of filters, Cartridge, Engine oil – pn : 600-211-123; Element, Transmission – pn : 424-16-11140; and Element, Hydraulic – pn : 07063-01054, the best forecasting method is linear regression. The best inventory control method for Cartridge, Engine oil – pn : 600-211-123 and Element, Transmission – pn : 424-16-11140 is Quick Response inventory, while the best method for Element, Hydraulic – pn : 07063-01054 is Economic Order Quantity.
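For reference, the Economic Order Quantity compared above is the classic square-root formula; the demand and cost figures below are made up for illustration:

```python
import math

def economic_order_quantity(annual_demand, ordering_cost, holding_cost_per_unit):
    """EOQ: the order size that minimises the sum of annual
    ordering cost and annual holding cost."""
    return math.sqrt(2 * annual_demand * ordering_cost / holding_cost_per_unit)
```

For example, an annual demand of 1200 filters, an ordering cost of 50 per order, and a holding cost of 6 per unit-year give an EOQ of √20000, about 141 units per order.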

Keywords: strategy, inventory, ABC analysis, forecasting, economic order quantity, quick response inventory

Procedia PDF Downloads 340
573 Gene Prediction in DNA Sequences Using an Ensemble Algorithm Based on Goertzel Algorithm and Anti-Notch Filter

Authors: Hamidreza Saberkari, Mousa Shamsi, Hossein Ahmadi, Saeed Vaali, MohammadHossein Sedaaghi

Abstract:

In recent years, using signal processing tools for accurate identification of protein coding regions has become a challenge in bioinformatics. Most genomic signal processing methods are based on the period-3 characteristic of the nucleotides in coding DNA strands; consequently, spectral analysis is applied to numerical representations of DNA to find the location of the periodic components. In this paper, a novel ensemble algorithm for gene selection in DNA sequences is presented, based on the combination of the Goertzel algorithm and an anti-notch filter (ANF). The proposed algorithm has several advantages over conventional methods. Firstly, it identifies protein coding regions more accurately, because the Goertzel algorithm is tuned to the desired frequency. Secondly, faster detection is achieved. The proposed algorithm is applied to several genes, including genes available in the BG570 and HMR195 databases, and the results are compared to those of other methods using nucleotide-level evaluation criteria. Implementation results show the excellent performance of the proposed algorithm in identifying protein coding regions, particularly in the identification of small-scale gene areas.
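The period-3 energy that such methods look for can be computed with a plain Goertzel recursion tuned to f = 1/3. A minimal sketch on a 0/1 nucleotide indicator sequence (my own illustration, without the paper’s anti-notch stage):

```python
import math

def goertzel_power(x, freq):
    """Power of one DFT bin of sequence x via the Goertzel recursion."""
    coeff = 2 * math.cos(2 * math.pi * freq)
    s_prev, s_prev2 = 0.0, 0.0
    for sample in x:
        s = sample + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2

def period3_power(indicator):
    """Period-3 energy of a DNA indicator sequence (1 where a given
    nucleotide occurs, else 0), evaluated at f = 1/3."""
    return goertzel_power(indicator, 1.0 / 3.0)
```

A strongly periodic (coding-like) indicator sequence yields a large value at f = 1/3, while an aperiodic one does not, which is the basis for flagging candidate exons.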

Keywords: protein coding regions, period-3, anti-notch filter, Goertzel algorithm

Procedia PDF Downloads 364
572 An Improvement of ComiR Algorithm for MicroRNA Target Prediction by Exploiting Coding Region Sequences of mRNAs

Authors: Giorgio Bertolazzi, Panayiotis Benos, Michele Tumminello, Claudia Coronnello

Abstract:

MicroRNAs are small non-coding RNAs that post-transcriptionally regulate the expression levels of messenger RNAs. MicroRNA regulatory activity depends on the recognition of binding sites located on mRNA molecules. ComiR (Combinatorial miRNA targeting) is a user-friendly web tool built to predict the targets of a set of microRNAs, starting from their expression profile. ComiR incorporates miRNA expression in a thermodynamic binding model and associates each gene with the probability of being a target of a set of miRNAs. The ComiR algorithms were trained with information about binding sites in the 3'UTR region, using a reliable dataset containing the targets of endogenously expressed microRNAs in D. melanogaster S2 cells. This dataset was obtained by comparing the results of two different experimental approaches, i.e., inhibition and immunoprecipitation of the AGO1 protein, a component of the microRNA-induced silencing complex. In this work, we tested whether including coding region binding sites in the ComiR algorithm improves the tool's performance in predicting microRNA targets. We focused the analysis on D. melanogaster and updated the underlying ComiR database with the currently available releases of mRNA and microRNA sequences. We find that the ComiR algorithm trained with coding region information is more efficient in predicting microRNA targets than the algorithm trained with 3'UTR information. On the other hand, we show that the 3'UTR-based predictions can be seen as complementary to the coding-region-based predictions, which suggests that both, from the 3'UTR and from coding regions, should be considered in a comprehensive analysis.
Furthermore, we observed that lists of targets obtained from one experimental approach alone, that is, inhibition or immunoprecipitation of AGO1, are not reliable enough to test the performance of our microRNA target prediction algorithm. Further analysis will be conducted to investigate the effectiveness of the tool with data from other species, provided that validated datasets, obtained by comparing RISC protein inhibition and immunoprecipitation experiments, become available for the same samples. Finally, we propose to upgrade the existing ComiR web tool by making the coding-region-trained model available alongside the 3'UTR-based one.

Keywords: AGO1, coding region, Drosophila melanogaster, microRNA target prediction

Procedia PDF Downloads 412
571 New Efficient Method for Coding Color Images

Authors: Walaa M. Abd-Elhafiez, Wajeb Gharibi

Abstract:

In this paper, a novel color image compression technique for efficient storage and delivery of data is proposed. The proposed technique starts with an RGB to YCbCr color transformation. Secondly, the Canny edge detection method is used to classify the blocks into edge and non-edge blocks. Each color component (Y, Cb, and Cr) is then compressed step by step by a discrete cosine transform (DCT), quantization, and coding with adaptive arithmetic coding. Our technique is evaluated on compression ratio, bits per pixel, and peak signal-to-noise ratio, and produces better results than JPEG and more recently published schemes (such as CBDCT-CABS and MHC). The experimental results show that the proposed technique is efficient and feasible in terms of compression ratio, bits per pixel, and peak signal-to-noise ratio.
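The first stage, the RGB to YCbCr transform, is the standard full-range BT.601 conversion used in JPEG; a per-pixel sketch:

```python
def rgb_to_ycbcr(r, g, b):
    """Full-range BT.601 RGB -> YCbCr, inputs in 0..255 (JPEG style)."""
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 128
    return y, cb, cr
```

Separating luma (Y) from chroma (Cb, Cr) is what lets the later DCT/quantization stages compress the chroma components more aggressively with little visible loss.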

Keywords: image compression, color image, q-coder, quantization, edge-detection

Procedia PDF Downloads 306
570 An Exploratory Study of the Meaning of Life of Delivery Agents of Kolkata

Authors: Soumitri Bag Majumder, Anindita Chaudhuri

Abstract:

This exploratory study delves into the perception of job dignity among delivery agents in Kolkata, focusing on both food and grocery delivery sectors. The rapid expansion of online delivery platforms in India has led to a significant rise in the delivery service industry. Despite its growth, there is a dearth of research addressing the multifaceted challenges faced by delivery agents. This study aims to bridge this gap by shedding light on their experiences. The study’s objectives include exploring the lived experiences of delivery agents, their work-life balance, and their perception of job dignity. Using a qualitative research approach, the study will conduct semi-structured in-depth interviews with a purposive sample of 10 participants from each sector, consisting of individuals with lower socio-economic backgrounds aged between 18 and 35 years. The Three-Layer Coding framework proposed by Charmaz will guide the data analysis process, encompassing open coding, axial coding, and selective coding. Through this method, the study seeks to uncover emergent themes and patterns that illuminate the participants’ perspectives on job dignity, recognition, and the challenges they encounter. By uncovering their perceptions of job dignity and the challenges they face, the research aims to contribute to the well-being of these workers and inform relevant stakeholders for a more equitable work environment.

Keywords: delivery agents, equitable work environment, perception of job dignity, work-life balance

Procedia PDF Downloads 35
569 Voice Signal Processing and Coding in MATLAB Generating a Plasma Signal in a Tesla Coil for a Security System

Authors: Juan Jimenez, Erika Yambay, Dayana Pilco, Brayan Parra

Abstract:

This paper presents an investigation of voice signal processing and coding using MATLAB, with the objective of generating a plasma signal on a Tesla coil within a security system. The approach focuses on using advanced voice signal processing techniques to encode and modulate the audio signal, which is then amplified and applied to a Tesla coil. The result is the creation of a striking visual effect of voice-controlled plasma with specific applications in security systems. The article explores the technical aspects of voice signal processing, the generation of the plasma signal, and its relationship to security. The implications and creative potential of this technology are discussed, highlighting its relevance at the forefront of research in signal processing and visual effect generation in the field of security systems.

Keywords: voice signal processing, voice signal coding, MATLAB, plasma signal, Tesla coil, security system, visual effects, audiovisual interaction

Procedia PDF Downloads 52
568 Analysis of Joint Source Channel LDPC Coding for Correlated Sources Transmission over Noisy Channels

Authors: Marwa Ben Abdessalem, Amin Zribi, Ammar Bouallègue

Abstract:

In this paper, a joint source-channel (JSC) coding scheme based on LDPC codes is investigated. We consider two concatenated LDPC codes: one compresses a correlated source, and the second protects it against channel degradation. The original information can be reconstructed at the receiver by a joint decoder, in which the source decoder and the channel decoder run in parallel, transferring extrinsic information. We investigate the performance of the JSC LDPC code in terms of Bit Error Rate (BER) for transmission over an Additive White Gaussian Noise (AWGN) channel and for different source and channel rate parameters. We emphasize how JSC LDPC presents a performance tradeoff depending on the channel state and on the source correlation. We show that JSC LDPC is an efficient solution for relatively low Signal-to-Noise Ratio (SNR) channels, especially with highly correlated sources. Finally, a source-channel rate optimization has to be applied to guarantee the best JSC LDPC system performance for a given channel.

Keywords: AWGN channel, belief propagation, joint source channel coding, LDPC codes

Procedia PDF Downloads 329
567 Unequal Error Protection of VQ Image Transmission System

Authors: Khelifi Mustapha, A. Moulay lakhdar, I. Elawady

Abstract:

We study unequal error protection for VQ image transmission. We use Reed-Solomon (RS) codes for channel coding because they offer better channel error correction performance over a binary output channel. Such a channel (binary input and output) should be considered at the application layer, because that layer inherits all the features of the layers below it, in which it is usually not feasible to make changes.
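The idea of unequal error protection can be illustrated with a toy substitute: the sketch below (Python/NumPy) protects the four most significant bits of 8-bit VQ indices with a rate-1/5 repetition code while leaving the least significant bits uncoded, then compares residual bit error rates over a binary symmetric channel. The repetition code merely stands in for the Reed-Solomon codes used in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def repeat_encode(bits, k):
    return np.repeat(bits, k)                      # k copies of every bit

def repeat_decode(bits, k):
    # majority vote over each group of k received copies
    return (bits.reshape(-1, k).sum(axis=1) > k // 2).astype(int)

def bsc(bits, p):
    return bits ^ (rng.random(bits.size) < p)      # binary symmetric channel

p = 0.1                                            # crossover probability
msb = rng.integers(0, 2, 4000)                     # important bits (index MSBs)
lsb = rng.integers(0, 2, 4000)                     # less important bits (LSBs)

msb_hat = repeat_decode(bsc(repeat_encode(msb, 5), p), 5)  # protected
lsb_hat = bsc(lsb, p)                                      # unprotected

msb_ber = np.mean(msb_hat != msb)
lsb_ber = np.mean(lsb_hat != lsb)
```

The protected MSBs come through with a much lower residual error rate than the unprotected LSBs, which is the whole point of unequal error protection: spend redundancy where an error hurts the reconstructed image most.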

Keywords: vector quantization, channel error correction, Reed-Solomon channel coding, application

Procedia PDF Downloads 332
566 Analysis of Non-Coding Genome in Streptococcus pneumoniae for Molecular Epidemiology Typing

Authors: Martynova Alina, Lyubov Buzoleva

Abstract:

Streptococcus pneumoniae is a causative agent of pneumonia and meningitis throughout the world. Having high genetic diversity, this microorganism can cause different clinical forms of pneumococcal infection and is difficult to diagnose microbiologically by routine methods. Epidemiological surveillance also requires more developed methods of molecular typing, because the current method of serotyping does not properly distinguish invasive from non-invasive isolates. The non-coding genome of bacteria is an interesting source of highly discriminatory markers for subtyping a bacterium as variable as Streptococcus pneumoniae. Technically, we proposed a scheme for discriminating S. pneumoniae strains by amplification of a non-coding region (SP_1932) followed by restriction with two enzymes, AluI and MnlI. Aim: This research aimed to compare different typing methods and their application for molecular epidemiology purposes. Methods: We analyzed a population of 100 strains of S. pneumoniae isolated from different patients using several molecular epidemiology methods, namely pulsed-field gel electrophoresis (PFGE), restriction fragment length polymorphism analysis (RFLP), and multilocus sequence typing (MLST), and all of them were compared with the classic typing method, serotyping. Discriminatory power was estimated with the Simpson index (SI). Results: The most discriminative typing method was RFLP (SI = 0.97, with 42 genotypes distinguished). PFGE was slightly less discriminative (SI = 0.95, with 35 genotypes identified). MLST remains the best reference method (SI = 1.0). The classic method of serotyping showed rather weak discriminatory power (SI = 0.93, 24 genotypes). In addition, the sensitivity of RFLP was 100%, and its specificity was 97.09%.
Conclusion: The most appropriate method for routine epidemiological surveillance is RFLP on the non-coding region of Streptococcus pneumoniae, followed by PFGE, though in some cases these results should be confirmed by MLST.
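The discriminatory power figures quoted above follow from a standard formula. As a sketch, the Simpson (Hunter-Gaston) index of discrimination, the probability that two randomly chosen isolates fall into different types, can be computed as:

```python
from collections import Counter

def simpson_discriminatory_index(type_assignments):
    """Hunter-Gaston discriminatory index: the probability that two
    randomly drawn isolates belong to different types.
    D = 1 - sum(n_j * (n_j - 1)) / (N * (N - 1))."""
    counts = Counter(type_assignments)
    n = sum(counts.values())
    return 1 - sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))
```

For example, a method assigning every isolate its own genotype scores 1.0, while one lumping all isolates into a single type scores 0.0.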

Keywords: molecular epidemiology typing, non-coding genome, Streptococcus pneumoniae, MLST

Procedia PDF Downloads 358
565 Subband Coding and Glottal Closure Instant (GCI) Using SEDREAMS Algorithm

Authors: Harisudha Kuresan, Dhanalakshmi Samiappan, T. Rama Rao

Abstract:

In modern telecommunication applications, locating Glottal Closure Instants (GCIs) is important; GCIs are estimated directly from the speech waveform. Here, we study GCI detection using the Speech Event Detection using the Residual Excitation And a Mean-based Signal (SEDREAMS) algorithm. Speech coding uses parameter estimation based on audio signal processing techniques to model the speech signal, combined with generic data compression algorithms to represent the resulting model parameters in a compact bit stream. This paper proposes a sub-band coder (SBC), a type of transform coding, and evaluates its performance for GCI detection using SEDREAMS. In an SBC, the speech signal is divided into two or more frequency bands, and each sub-band signal is coded individually. After processing, the sub-bands are recombined to form the output signal, whose bandwidth covers the whole frequency spectrum. The signal is decomposed into low- and high-frequency components, and decimation and interpolation are performed in the frequency domain. The proposed structure significantly reduces error, and precise locations of Glottal Closure Instants (GCIs) are found using the SEDREAMS algorithm.
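The two-band split-decimate-recombine pipeline described above can be sketched with the simplest perfect-reconstruction filter bank, the Haar pair (an illustrative stand-in, not the paper's actual filters): the low band is a decimated average, the high band a decimated difference, and synthesis inverts both exactly.

```python
import numpy as np

def analyze(x):
    """Two-band Haar filter bank: average (low band) and difference
    (high band), each decimated by a factor of 2."""
    x = np.asarray(x, dtype=float)
    low = (x[0::2] + x[1::2]) / 2
    high = (x[0::2] - x[1::2]) / 2
    return low, high

def synthesize(low, high):
    """Exact inverse: x[2i] = low + high, x[2i+1] = low - high."""
    x = np.empty(2 * low.size)
    x[0::2] = low + high
    x[1::2] = low - high
    return x
```

In an actual sub-band coder, each decimated band would be quantized and coded at its own bit rate before synthesis; the Haar pair simply makes the analysis/synthesis round trip easy to verify.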

Keywords: SEDREAMS, GCI, SBC, GOI

Procedia PDF Downloads 326
564 Functional Variants Detection by RNAseq

Authors: Raffaele A. Calogero

Abstract:

RNAseq represents an attractive methodology for the detection of functional genomic variants. RNAseq results obtained from polyA+ RNA selection protocol (POLYA) and from exonic regions capturing protocol (ACCESS) indicate that ACCESS detects 10% more coding SNV/INDELs with respect to POLYA. ACCESS requires less reads for coding SNV detection with respect to POLYA. However, if the analysis aims at identifying SNV/INDELs also in the 5’ and 3’ UTRs, POLYA is definitively the preferred method. No particular advantage comes from ACCESS or POLYA in the detection of fusion transcripts.

Keywords: fusion transcripts, INDEL, RNA-seq, WES, SNV

Procedia PDF Downloads 262
563 Secure Network Coding against Content Pollution Attacks in Named Data Network

Authors: Tao Feng, Xiaomei Ma, Xian Guo, Jing Wang

Abstract:

Named Data Networking (NDN) is one of the future Internet architectures, in which all nodes (i.e., hosts, routers) are allowed to have a local cache used to satisfy incoming requests for content. However, this reliance on caching allows an adversary to perform attacks that are very effective and relatively easy to implement, such as the content pollution attack. In this paper, we use a method of secure network coding based on a homomorphic signature system to solve this problem. Firstly, we use a dynamic public key technique so that our scheme authenticates each generation without updating the initial secret key. Secondly, employing the homomorphism of the hash function, intermediate nodes and the destination node verify the signature of each received message. In addition, when the network topology of the NDN is simple and fixed, the code coefficients in our scheme are generated by a pseudorandom number generator at each node, so distribution of the coefficients is also avoided. In short, our scheme not only efficiently prevents intra/inter-generation pollution attacks (GPAs) but also resists the content poisoning attack in NDN.
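The verification step the scheme relies on, checking a coded packet against the hashes of the source packets, works because the hash is homomorphic. A minimal sketch with a classic multiplicative homomorphic hash over a prime-order subgroup (toy parameters, not the paper's actual signature system):

```python
import random

random.seed(7)
q, p = 1019, 2039                 # toy primes with p = 2q + 1
m = 4                             # symbols per packet

# public generators of the order-q subgroup of Z_p* (squares mod p)
G = [pow(random.randrange(2, p - 1), 2, p) for _ in range(m)]

def hhash(vec):
    """Multiplicative homomorphic hash: prod_i g_i^{v_i} mod p."""
    h = 1
    for g, v in zip(G, vec):
        h = h * pow(g, v % q, p) % p
    return h

# source packets u, v and a random linear combination w = a*u + b*v (mod q),
# as produced by network coding at an intermediate node
u = [random.randrange(q) for _ in range(m)]
v = [random.randrange(q) for _ in range(m)]
a, b = random.randrange(1, q), random.randrange(1, q)
w = [(a * x + b * y) % q for x, y in zip(u, v)]

# a node verifies w using only the published source hashes:
# hhash(a*u + b*v) == hhash(u)^a * hhash(v)^b (mod p)
expected = pow(hhash(u), a, p) * pow(hhash(v), b, p) % p
valid = (hhash(w) == expected)            # True for an honest packet
tampered = [(w[0] + 1) % q] + w[1:]
detected = (hhash(tampered) != expected)  # True: pollution is detected
```

A real deployment would use cryptographically sized primes and bind the hash into a signature; the toy sizes here only demonstrate the homomorphism that makes in-network verification possible.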

Keywords: named data networking, content pollution attack, network coding signature, internet architecture

Procedia PDF Downloads 302
562 Analysis of the Impact of Foreign Direct Investment on the Integration of the Automotive Industry of Iran into Global Production Networks

Authors: Bahareh Mostofian

Abstract:

Foreign Direct Investment (FDI) has long been recognized as a crucial driver of economic growth and development in less-developed countries and of their integration into Global Production Networks (GPNs). FDI brings not only capital from the core countries but also technology, innovation, and know-how that can upgrade the capabilities of host automotive industries. On the other hand, FDI can also harm host countries if it leads to significant import dependency. The Iranian automotive sector benefited greatly from FDI, with Western carmakers dominating the market. Over time, various forms of know-how transfer, including joint ventures (JVs), trade licenses, and technical assistance, helped Iran upgrade its automotive industry. However, after the severe geopolitical obstacles imposed by both the EU and the U.S., the industry became over-reliant on car and spare-part imports, and the lack of emphasis on knowledge transfer further hampered the growth and development of the Iranian automotive sector. To address these challenges, this research adopts a descriptive-analytical methodology to illustrate the gradual changes that occurred in relationships with foreign suppliers through FDI. The findings show that after the two phases of imposed sanctions, the detrimental linkages created by over-reliance on car and spare-part imports, without any industrial upgrading, negatively affected the growth and development of the national and assembled products of the Iranian automotive sector.

Keywords: less-developed country, FDI, GPNs, automotive industry, Iran

Procedia PDF Downloads 43
561 Domains of Socialization Interview: Development and Psychometric Properties

Authors: Dilek Saritas Atalar, Cansu Alsancak Akbulut, İrem Metin Orta, Feyza Yön, Zeynep Yenen, Joan Grusec

Abstract:

Objective: The aim of this study was to develop a semi-structured Domains of Socialization Interview and its coding manual and to test their psychometric properties. The Domains of Socialization Interview was designed to assess maternal awareness of effective parenting in five socialization domains (protection, mutual reciprocity, control, guided learning, and group participation) within the framework of the domains-of-socialization approach. Method: Two studies were conducted to develop and validate the interview and its coding manual. The pilot study, which sampled 13 mothers of preschool-aged children, was conducted to develop the assessment tools and to test their function and clarity. Participants in the main study were 82 Turkish mothers (mean age = 34.25 years, SD = 3.53) with children aged 35 to 76 months (mean age = 50.75 months, SD = 11.24). Mothers filled in a questionnaire package including the Coping with Children’s Negative Emotions Questionnaire, the Social Competence and Behavior Evaluation-30, the Child Rearing Questionnaire, and the Two-Dimensional Social Desirability Questionnaire. Afterward, interviews were conducted online by a single interviewer and rated independently by two graduate students based on the coding manual. Results: The relationships of the awareness-of-effective-parenting scores to the other measures demonstrate the convergent, discriminant, and predictive validity of the coding manual. Intra-class correlation coefficient estimates ranged between 0.82 and 0.90, showing high interrater reliability of the coding manual. Conclusion: Taken as a whole, the results of these studies demonstrate the validity and reliability of a new and useful interview for measuring maternal awareness of effective parenting within the framework of the domains-of-socialization approach.
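The interrater reliability reported above is an intra-class correlation. As an illustrative sketch, assuming the common two-way random-effects, absolute-agreement, single-rater form ICC(2,1) (which may differ from the exact variant the authors used), it can be computed directly from the subjects-by-raters rating matrix:

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    `ratings` is an (n_subjects, k_raters) array (Shrout & Fleiss form)."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)        # per-subject means
    col_means = ratings.mean(axis=0)        # per-rater means
    ss_total = ((ratings - grand) ** 2).sum()
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ms_r = ss_rows / (n - 1)                # between-subjects mean square
    ms_c = ss_cols / (k - 1)                # between-raters mean square
    ms_e = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))  # residual
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)
```

Two raters who agree perfectly yield an ICC of 1.0; a constant offset between raters (a severity difference) lowers the absolute-agreement ICC even when the rank ordering is identical.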

Keywords: domains of socialization, parenting, interview, assessment

Procedia PDF Downloads 153
560 Relating Symptoms with Protein Production Abnormality in Patients with Down Syndrome

Authors: Ruolan Zhou

Abstract:

Trisomy of human chromosome 21 is the primary cause of Down Syndrome (DS), a genetic disease that significantly burdens families and countries. To address this problem, this research explores the relationship between the genetic abnormality and the disease's symptoms, adopting several techniques, including data analysis and enrichment analysis. It also draws on open-source databases, such as NCBI, DAVID, SOURCE, STRING, and UCSC, to complement its results. This research analyzed the genes on human chromosome 21 using simple coding and identified the protein-coding genes, their functions, and their locations. Using enrichment analysis, this paper found an abundance of keratin-related protein-coding genes on human chromosome 21. Drawing on past research, this study attempts to relate trisomy of human chromosome 21 to keratin production abnormalities, which might underlie diseases common in patients with Down Syndrome. Finally, by addressing the strengths and limitations of this research, the discussion provides specific directions for future research.
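Enrichment analyses of the kind used above typically reduce to a one-sided hypergeometric test. As a minimal sketch (not the exact statistic that tools such as DAVID compute), the p-value for observing at least k annotated genes in a selection of n genes, drawn from a genome of N genes of which K carry the annotation, is:

```python
from math import comb

def enrichment_pvalue(N, K, n, k):
    """One-sided hypergeometric test: probability of drawing k or more
    annotated genes when sampling n genes without replacement from a
    universe of N genes, of which K are annotated."""
    total = comb(N, n)
    return sum(
        comb(K, i) * comb(N - K, n - i)
        for i in range(k, min(K, n) + 1)
    ) / total
```

For instance, finding 10 keratin-annotated genes among 20 selected genes, when only 50 of 1000 genes carry that annotation genome-wide, yields a very small p-value, i.e., strong enrichment.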

Keywords: Down Syndrome, protein production, genome, enrichment analysis

Procedia PDF Downloads 95
559 Performance Analysis of MIMO-OFDM Using Convolution Codes with QAM Modulation

Authors: I Gede Puja Astawa, Yoedy Moegiharto, Ahmad Zainudin, Imam Dui Agus Salim, Nur Annisa Anggraeni

Abstract:

The performance of an Orthogonal Frequency Division Multiplexing (OFDM) system can be improved by adding channel coding (an error-correcting code) to detect and correct errors that occur during data transmission; one option is a convolutional code. This paper presents the performance of OFDM with the Space-Time Block Code (STBC) diversity technique, using QAM modulation and code rate 1/2. The evaluation is done by analyzing Bit Error Rate (BER) versus energy per bit to noise power spectral density ratio (Eb/No). The scheme uses 256 sub-carriers transmitted over a Rayleigh multipath channel. Achieving a BER of 10⁻³ requires 30 dB SNR in the SISO-OFDM scheme and 10 dB in the 2x2 MIMO-OFDM scheme. The 4x4 MIMO-OFDM scheme requires 5 dB, while adding convolutional coding to the 4x4 MIMO-OFDM scheme improves performance down to 0 dB for the same BER. This demonstrates a power saving of 3 dB relative to the uncoded 4x4 MIMO-OFDM system, a 7 dB saving relative to the uncoded 2x2 MIMO-OFDM system, and significant power savings relative to the SISO-OFDM system.
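The OFDM transmit/receive chain underlying these results can be sketched in a few lines. The snippet below (Python/NumPy, an illustrative uncoded SISO baseline over an AWGN channel rather than the paper's Rayleigh MIMO setup) maps bits to QPSK (4-QAM), applies a unitary IFFT with a cyclic prefix, and recovers the bits after the FFT.

```python
import numpy as np

rng = np.random.default_rng(3)
N, CP = 256, 16                  # sub-carriers and cyclic-prefix length

def qpsk_mod(bits):
    """Map bit pairs to unit-energy QPSK symbols (bit 0 -> +1, bit 1 -> -1)."""
    return ((1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])) / np.sqrt(2)

def ofdm_tx(symbols):
    time = np.fft.ifft(symbols) * np.sqrt(N)      # unitary IFFT
    return np.concatenate([time[-CP:], time])     # prepend cyclic prefix

def ofdm_rx(samples):
    return np.fft.fft(samples[CP:]) / np.sqrt(N)  # strip CP, unitary FFT

bits = rng.integers(0, 2, 2 * N)
tx = ofdm_tx(qpsk_mod(bits))
# mild AWGN (illustrative noise level, far below the decision threshold)
noise = (rng.standard_normal(tx.size) + 1j * rng.standard_normal(tx.size)) * 0.02
rx = ofdm_rx(tx + noise)

hard = np.empty(2 * N, dtype=int)                 # hard QPSK demapping
hard[0::2] = (rx.real < 0).astype(int)
hard[1::2] = (rx.imag < 0).astype(int)
```

Wrapping this loop in a sweep over noise levels, and inserting a convolutional encoder/Viterbi decoder around the bit stream, would reproduce the kind of BER-versus-Eb/No evaluation the paper performs.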

Keywords: convolution code, OFDM, MIMO, QAM, BER

Procedia PDF Downloads 364
558 International Classification of Primary Care as a Reference for Coding the Demand for Care in Primary Health Care

Authors: Souhir Chelly, Chahida Harizi, Aicha Hechaichi, Sihem Aissaoui, Leila Ben Ayed, Maha Bergaoui, Mohamed Kouni Chahed

Abstract:

Introduction: The International Classification of Primary Care (ICPC) is part of the family of morbidity classification systems. It has 17 chapters, and each entry is coded by an alphanumeric code: the letter corresponds to the chapter and the number to a rubric within that chapter. The objective of this study is to show the utility of this classification in coding the reasons for demand for care in primary health care (PHC), along with its advantages and limits. Methods: This is a cross-sectional descriptive study conducted in 4 PHC centers in the Ariana district. Data on the demand for care during 2 days in the same week were collected. The information was coded according to the ICPC. The data were entered and analyzed with EPI Info 7 software. Results: A total of 523 demands for care were investigated. The patients who came for consultation were predominantly female (62.72%). Most of the consultants were young, with an average age of 35 ± 26 years. Among the ICPC rubrics observed, 'infections' was the most common reason, with 49.9%, followed by 'other diagnoses' with 40.2%, 'symptoms and complaints' with 5.5%, 'trauma' with 2.1%, 'procedures' with 2.1%, and 'neoplasms' with 0.3%. The main advantage of the ICPC is that it is a standardized tool. It is well suited to classifying the reasons for demand for care in PHC according to their specificity and can be used in a computerized PHC medical record. Its current limitations relate to the difficulty of classifying some reasons for demand for care. Conclusion: The ICPC was developed to provide healthcare with a coding reference that takes into account the specificity of primary care. Like the ICD (CIM), now in its 10th revision, the ICPC would gain from revision to revision in efficiency, so that it can be generalized and used by PHC teams.
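The alphanumeric structure described above (a chapter letter plus a numeric rubric) is straightforward to handle programmatically. A minimal sketch, using an illustrative subset of the 17 ICPC chapter letters:

```python
# Illustrative subset of the 17 ICPC chapters (letter -> body system/area)
ICPC_CHAPTERS = {
    "A": "General and unspecified",
    "D": "Digestive",
    "K": "Circulatory",
    "N": "Neurological",
    "P": "Psychological",
    "R": "Respiratory",
    "S": "Skin",
}

def parse_icpc(code):
    """Split an alphanumeric ICPC code into its chapter name and numeric
    rubric, e.g. 'R74' -> ('Respiratory', 74)."""
    chapter, number = code[0].upper(), int(code[1:])
    return ICPC_CHAPTERS.get(chapter, "unknown chapter"), number
```

A coding workflow like the one in this study could use such a lookup to tabulate demands for care per chapter before computing the rubric percentages reported above.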

Keywords: international classification of primary care, medical file, primary health care, Tunisia

Procedia PDF Downloads 234
557 Evaluation of the Role of Circulating Long Non-Coding RNA H19 as a Promising Biomarker in Plasma of Patients with Gastric Cancer

Authors: Doaa Hashad, Amany Elbanna, Abeer Ibrahim, Gihan Khedr

Abstract:

Background: H19 is a long non-coding RNA (lncRNA) related to the progression of many diseases, including cancers. This work was carried out to study the level of the long non-coding RNA H19 in the plasma of patients with gastric cancer (GC) and to assess its significance in their clinical management. Methods: A total of sixty-two participants were enrolled in the present study. The first group included thirty-two GC patients, while the second group comprised thirty age- and sex-matched healthy volunteers serving as a control group. Plasma samples were used to assess H19 gene expression using a real-time quantitative PCR technique. Results: H19 expression was up-regulated in GC patients, with a positive correlation to TNM cancer stage. Conclusions: Up-regulation of H19 is closely associated with gastric cancer and correlates well with tumor staging. Convenient, efficient quantification of H19 in plasma using real-time PCR supports its role as a potential noninvasive prognostic biomarker in gastric cancer that predicts patient outcome and, most importantly, as a novel target in gastric cancer treatment, with better performance achieved by using CEA and H19 together.
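Relative expression from real-time qPCR data of this kind is commonly summarized with the 2^-ΔΔCt (Livak) method. A minimal sketch (with illustrative Ct values, not data from this study), normalizing the target lncRNA to a reference gene in patient and control samples:

```python
def fold_change_ddct(ct_target_case, ct_ref_case, ct_target_ctrl, ct_ref_ctrl):
    """Relative expression by the 2^-ΔΔCt (Livak) method.
    Each ΔCt normalizes the target's cycle threshold to a reference gene;
    ΔΔCt compares case against control."""
    dct_case = ct_target_case - ct_ref_case
    dct_ctrl = ct_target_ctrl - ct_ref_ctrl
    return 2 ** -(dct_case - dct_ctrl)
```

For example, a target detected 4 cycles after the reference in a patient sample but 6 cycles after it in a control corresponds to a 4-fold up-regulation, the kind of plasma H19 elevation the study reports in GC patients.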

Keywords: biomarker, gastric, cancer, LncRNA

Procedia PDF Downloads 275