Search results for: analog signal processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5164

2944 Oleic Acid Enhances Hippocampal Synaptic Efficacy

Authors: Rema Vazhappilly, Tapas Das

Abstract:

Oleic acid is a cis-unsaturated fatty acid and is known to be a partially essential fatty acid due to its limited endogenous synthesis during pregnancy and lactation. Previous studies have demonstrated the role of oleic acid in neuronal differentiation and brain phospholipid synthesis. This evidence indicates a major role for oleic acid in learning and memory. Interestingly, oleic acid has been shown to enhance hippocampal long-term potentiation (LTP), the physiological correlate of long-term synaptic plasticity. However, the effect of oleic acid on short-term synaptic plasticity has not been investigated. Short-term potentiation (STP) is the physiological correlate of short-term synaptic plasticity, which is the key underlying molecular mechanism of short-term memory and neuronal information processing. STP in the hippocampal CA1 region is known to require the activation of N-methyl-D-aspartate receptors (NMDARs). NMDAR-dependent hippocampal STP as a potential mechanism for short-term memory has been a subject of intense interest for the past few years. Therefore, in the present study, the effect of oleic acid on NMDAR-dependent hippocampal STP was determined in mouse hippocampal slices (in vitro) using a multi-electrode array system. STP was induced by weak tetanic stimulation (one train of 100 Hz stimulation for 0.1 s) of the Schaffer collaterals of the CA1 region of the hippocampus in slices treated with different concentrations of oleic acid in the presence or absence of the NMDAR antagonist D-AP5 (30 µM). Oleic acid at 20 µM (mean increase in fEPSP amplitude = ~135% vs. control = 100%; P<0.001) and 30 µM (mean increase in fEPSP amplitude = ~280% vs. control = 100%; P<0.001) significantly enhanced the STP following weak tetanic stimulation. A lower oleic acid concentration of 10 µM did not modify the hippocampal STP induced by weak tetanic stimulation. The hippocampal STP induced by weak tetanic stimulation was completely blocked by the NMDA receptor antagonist D-AP5 (30 µM) in both oleic acid-treated and control hippocampal slices. This led to the conclusion that the hippocampal STP elicited by weak tetanic stimulation and enhanced by oleic acid was NMDAR dependent. Together, these findings suggest that oleic acid may enhance short-term memory and neuronal information processing through the modulation of NMDAR-dependent hippocampal short-term synaptic plasticity. In conclusion, this study suggests a possible role for oleic acid in preventing short-term memory loss and impaired neuronal function throughout development.

Keywords: oleic acid, short-term potentiation, memory, field excitatory post synaptic potentials, NMDA receptor

Procedia PDF Downloads 335
2943 Agenesis of the Corpus Callosum: The Role of Neuropsychological Assessment with Implications to Psychosocial Rehabilitation

Authors: Ron Dick, P. S. D. V. Prasadarao, Glenn Coltman

Abstract:

Agenesis of the corpus callosum (ACC) is a failure to develop the corpus callosum, the large bundle of fibers of the brain that connects the two cerebral hemispheres. It can occur as a partial or complete absence of the corpus callosum. In the general population, its estimated prevalence rate is 1 in 4000, and a wide range of genetic, infectious, vascular, and toxic causes have been attributed to this heterogeneous condition. The diagnosis of ACC is often achieved by neuroimaging procedures. Though persons with ACC can perform normally on intelligence tests, they generally present with a range of neuropsychological and social deficits. The deficit profile is characterized by poor coordination of motor movements, slow reaction time and processing speed, and poor memory. Socially, they present with deficits in communication, language processing, theory of mind, and interpersonal relationships. The present paper illustrates the role of neuropsychological assessment with implications for psychosocial management in a case of agenesis of the corpus callosum. Method: A 27-year-old left-handed Caucasian male with a history of ACC was self-referred for a neuropsychological assessment to assist him in his employment options. Parents noted significant difficulties with coordination and balance at an early age of 2-3 years, and he was diagnosed with dyspraxia at the age of 14 years. History also indicated visual impairment, hypotonia, poor muscle coordination, and delayed development of motor milestones. An MRI scan indicated agenesis of the corpus callosum, with abnormal ventricular morphology, widely spaced parallel lateral ventricles, and mild dilatation of the posterior horns; it also showed colpocephaly, a disproportionate enlargement of the occipital horns of the lateral ventricles, which might be contributing to his motor difficulties and visual defects. The MRI scan ruled out other structural abnormalities or neonatal brain injury. At the time of assessment, the subject presented with such problems as poor coordination, slowed processing speed, poor organizational skills and time management, and difficulty with social cues and facial expressions. A comprehensive neuropsychological assessment was planned and conducted to identify the current neuropsychological profile and to facilitate the formulation of a psychosocial and occupational rehabilitation programme. Results: General intellectual functioning was within the average range, and his performance on memory-related tasks was adequate. Significant visuospatial and visuoconstructional deficits were evident across tests; constructional difficulties were seen in tasks such as copying a complex figure, building a tower, and manipulating blocks. Poor visual scanning ability and visual motor speed were evident. Socially, the subject reported heightened social anxiety, difficulty in responding to cues in the social environment, and difficulty in developing intimate relationships. Conclusion: Persons with ACC are known to present with specific cognitive deficits and problems in social situations. Findings from the current neuropsychological assessment indicated significant visuospatial difficulties, poor visual scanning, and problems in social interactions. His general intellectual functioning was within the average range. Based on the findings from the comprehensive neuropsychological assessment, a structured psychosocial rehabilitation programme was developed and recommended.

Keywords: agenesis, callosum, corpus, neuropsychology, psychosocial, rehabilitation

Procedia PDF Downloads 276
2942 Approaches to Inducing Obsessional Stress in Obsessive-Compulsive Disorder (OCD): An Empirical Study with Patients Undergoing Transcranial Magnetic Stimulation (TMS) Therapy

Authors: Lucia Liu, Matthew Koziol

Abstract:

Obsessive-compulsive disorder (OCD), a long-lasting anxiety disorder involving recurrent, intrusive thoughts, affects over 2 million adults in the United States. Transcranial magnetic stimulation (TMS) stands out as a noninvasive, cutting-edge therapy that has been shown to reduce symptoms in patients with treatment-resistant OCD. The Food and Drug Administration (FDA)-approved protocol pairs TMS sessions with individualized symptom provocation, aiming to improve the susceptibility of brain circuits to stimulation. However, limited standardization or guidance exists on how to conduct symptom provocation and which methods are most effective. This study aims to compare the effect of internal versus external techniques for inducing obsessional stress in a clinical setting during TMS therapy. Two symptom provocation methods, (i) asking patients thought-provoking questions about their obsessions (internal) and (ii) requesting patients to perform obsession-related tasks (external), were employed in a crossover design with repeated measurements. Thirty-six treatments of NeuroStar TMS were administered to each of two patients over 8 weeks in an outpatient clinic. Patient One received 18 sessions of internal provocation followed by 18 sessions of external provocation, while Patient Two received 18 sessions of external provocation followed by 18 sessions of internal provocation. The primary outcome was the level of self-reported obsessional stress on a visual analog scale from 1 to 10. The secondary outcome was self-reported OCD severity, collected biweekly on a four-level Likert scale (1 to 4) of bad, fair, good, and excellent. Outcomes were compared and tested between provocation arms through repeated measures ANOVA, accounting for intra-patient correlations. Ages were 42 for Patient One (male, White) and 57 for Patient Two (male, White). Both patients had similar moderate symptoms at baseline, as determined through the Yale-Brown Obsessive Compulsive Scale (YBOCS). When comparing obsessional stress induced across the two arms of internal and external provocation methods, the mean (SD) was 6.03 (1.18) for internal and 4.01 (1.28) for external strategies (P=0.0019); ranges were 3 to 8 for internal and 2 to 8 for external strategies. Internal provocation yielded 5 (31.25%) bad, 6 (33.33%) fair, 3 (18.75%) good, and 2 (12.5%) excellent responses for OCD status, while external provocation yielded 5 (31.25%) bad, 9 (56.25%) fair, 1 (6.25%) good, and 1 (6.25%) excellent responses (P=0.58). Internal symptom provocation tactics had a significantly stronger impact on inducing obsessional stress and led to numerically better OCD status (not statistically significant). This could be attributed to the fact that answering questions may prompt patients to reflect more on their lived experiences and struggles with OCD. In the future, clinical trials with larger sample sizes are warranted to validate this finding. Results support the increased integration of internal methods into structured provocation protocols, potentially reducing the time required for provocation and achieving greater treatment response to TMS.
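To make the analysis concrete, the sketch below sets up the kind of repeated-measures comparison described above using statsmodels; the per-session ratings are synthetic stand-ins (not the study data), and the means and spread are only loosely modeled on the reported values.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Hypothetical per-session ratings on the 1-10 visual analog scale; the real study
# used two patients with 18 sessions per provocation arm in a crossover design.
rng = np.random.default_rng(1)
rows = []
for patient in ["P1", "P2"]:
    for arm, mu in [("internal", 6.0), ("external", 4.0)]:
        for _ in range(18):
            rows.append({"patient": patient, "arm": arm,
                         "stress": float(np.clip(rng.normal(mu, 1.2), 1, 10))})
df = pd.DataFrame(rows)

# Repeated-measures ANOVA with provocation arm as the within-subject factor;
# repeated sessions are averaged within each patient x arm cell.
res = AnovaRM(df, depvar="stress", subject="patient",
              within=["arm"], aggregate_func="mean").fit()
print(res)
```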

Keywords: obsessive-compulsive disorder, transcranial magnetic stimulation, mental health, symptom provocation

Procedia PDF Downloads 57
2941 Assessment of Green Fluorescent Protein Signal for Effective Monitoring of Recombinant Fermentation Processes

Authors: I. Sani, A. Abdulhamid, F. Bello, Isah M. Fakai

Abstract:

This research focused on the application of green fluorescent protein (GFP) as a new technique for direct monitoring of fermentation processes involving cultured bacteria. To use GFP as a sensor for pH and oxygen, the percentage ratio of red to green fluorescence (%R/G) was evaluated. To assess the magnitude of the %R/G ratio in relation to low or high pH and oxygen concentration, the bacterial strains were cultivated under aerobic and anaerobic conditions. SCC1 strains of E. coli were grown in a 5 L laboratory fermenter, and during the fermentation, the pH and temperature were controlled at 7.0 and 37 °C, respectively. Dissolved oxygen tension (DOT) was controlled between 15% and 100% by changing the agitation speed between 20 and 500 rpm. The effect of reducing the DOT level from 100% to 15% was observed after 4.5 h of fermentation. Growth arrest occurred, as indicated by the decrease in OD650 at this time (4.5-5 h). The relative (green) fluorescence intensity decreased from about 460 to 420 RFU. However, the %R/G ratio increased significantly, from about 0.1% to about 0.25%, when the DOT level was decreased to 15%. When the DOT was returned to 100%, a slight increase in the relative fluorescence and a decrease in the %R/G ratio were observed. Therefore, GFP can effectively detect and indicate changes in pH and oxygen level during fermentation processes.

Keywords: Escherichia coli SCC1, fermentation process, green fluorescent protein, red fluorescence

Procedia PDF Downloads 505
2940 Regulatory and Economic Challenges of AI Integration in Cyber Insurance

Authors: Shreyas Kumar, Mili Shangari

Abstract:

Integrating artificial intelligence (AI) in the cyber insurance sector represents a significant advancement, offering the potential to revolutionize risk assessment, fraud detection, and claims processing. However, this integration introduces a range of regulatory and economic challenges that must be addressed to ensure responsible and effective deployment of AI technologies. This paper examines the multifaceted regulatory landscape governing AI in cyber insurance and explores the economic implications of compliance, innovation, and market dynamics. AI's capabilities in processing vast amounts of data and identifying patterns make it an invaluable tool for insurers in managing cyber risks. Yet, the application of AI in this domain is subject to stringent regulatory scrutiny aimed at safeguarding data privacy, ensuring algorithmic transparency, and preventing biases. Regulatory bodies, such as the European Union with its General Data Protection Regulation (GDPR), mandate strict compliance requirements that can significantly impact the deployment of AI systems. These regulations necessitate robust data protection measures, ethical AI practices, and clear accountability frameworks, all of which entail substantial compliance costs for insurers. The economic implications of these regulatory requirements are profound. Insurers must invest heavily in upgrading their IT infrastructure, implementing robust data governance frameworks, and training personnel to handle AI systems ethically and effectively. These investments, while essential for regulatory compliance, can strain financial resources, particularly for smaller insurers, potentially leading to market consolidation. Furthermore, the cost of regulatory compliance can translate into higher premiums for policyholders, affecting the overall affordability and accessibility of cyber insurance. Despite these challenges, the potential economic benefits of AI integration in cyber insurance are significant. AI-enhanced risk assessment models can provide more accurate pricing, reduce the incidence of fraudulent claims, and expedite claims processing, leading to overall cost savings and increased efficiency. These efficiencies can improve the competitiveness of insurers and drive innovation in product offerings. However, balancing these benefits with regulatory compliance is crucial to avoid legal penalties and reputational damage. The paper also explores the potential risks associated with AI integration, such as algorithmic biases that could lead to unfair discrimination in policy underwriting and claims adjudication. Regulatory frameworks need to evolve to address these issues, promoting fairness and transparency in AI applications. Policymakers play a critical role in creating a balanced regulatory environment that fosters innovation while protecting consumer rights and ensuring market stability. In conclusion, the integration of AI in cyber insurance presents both regulatory and economic challenges that require a coordinated approach involving regulators, insurers, and other stakeholders. By navigating these challenges effectively, the industry can harness the transformative potential of AI, driving advancements in risk management and enhancing the resilience of the cyber insurance market. This paper provides insights and recommendations for policymakers and industry leaders to achieve a balanced and sustainable integration of AI technologies in cyber insurance.

Keywords: artificial intelligence (AI), cyber insurance, regulatory compliance, economic impact, risk assessment, fraud detection, cyber liability insurance, risk management, ransomware

Procedia PDF Downloads 33
2939 Incidence of Fungal Infections and Mycotoxicosis in Pork Meat and Pork By-Products in Egyptian Markets

Authors: Ashraf Samir Hakim, Randa Mohamed Alarousy

Abstract:

The consumption of food contaminated with molds (microscopic filamentous fungi) and their toxic metabolites results in the development of food-borne mycotoxicosis. The spores of molds are ubiquitously spread in the environment and can be detected everywhere. Ochratoxin A is a potentially carcinogenic fungal toxin found in a variety of food commodities; it is not only the most abundant and hence the most commonly detected of the ochratoxins but also the most toxic one. Very little research concerning foods of porcine origin in Egypt has been published, despite the presence of a considerable swine population and consumer base. In this study, the quality of various ready-to-eat local and imported pork meat and meat byproducts sold in Egyptian markets, as well as edible organs such as liver and kidney, was assessed for the presence of various molds and their toxins as raw material. Mycological analysis was conducted on 110 samples, which included pig livers (n=10) and kidneys (n=10) from the Basateen slaughterhouse, and local (n=70) and imported (n=20) processed pork meat byproducts. The isolates were identified using traditional mycological and biochemical tests, while ochratoxin A levels were quantitatively analyzed using high-performance liquid chromatography (HPLC). Results of conventional mycological tests for the presence of fungal growth (yeasts or molds) were negative, while ochratoxin A concentrations in local pork and pork byproducts were greatly above the permissible limits or "tolerable weekly intake" (TWI) established by EFSA in 2006, whereas the imported samples showed only a very slight increase. Since ochratoxin A is stable and generally resistant to heat and processing, control of ochratoxin A contamination lies in the control of the growth of the toxin-producing fungi. Effective prevention of ochratoxin A contamination therefore depends on good farming and agricultural practices. Good Agricultural Practices (GAP), including methods to reduce fungal infection and growth during harvest, storage, transport, and processing, provide the primary line of defense against contamination with ochratoxin A. To the best of our knowledge, this is the first report of mycological assessment, and particularly of mycotoxins, in pork byproducts in Egypt.

Keywords: Egyptian markets, mycotoxicosis, ochratoxin A, pork meat, pork by-products

Procedia PDF Downloads 466
2938 HPA Pre-Distorter Based on Neural Networks for 5G Satellite Communications

Authors: Abdelhamid Louliej, Younes Jabrane

Abstract:

Satellites are becoming indispensable assets to fifth-generation (5G) new radio architecture, complementing wireless and terrestrial communication links. The combination of satellites and 5G architecture allows consumers to access all next-generation services anytime, anywhere, including scenarios, like traveling to remote areas (without coverage). Nevertheless, this solution faces several challenges, such as a significant propagation delay, Doppler frequency shift, and high Peak-to-Average Power Ratio (PAPR), causing signal distortion due to the non-linear saturation of the High-Power Amplifier (HPA). To compensate for HPA non-linearity in 5G satellite transmission, an efficient pre-distorter scheme using Neural Networks (NN) is proposed. To assess the proposed NN pre-distorter, two types of HPA were investigated: Travelling Wave Tube Amplifier (TWTA) and Solid-State Power Amplifier (SSPA). The results show that the NN pre-distorter design presents EVM improvement by 95.26%. NMSE and ACPR were reduced by -43,66 dB and 24.56 dBm, respectively. Moreover, the system suffers no degradation of the Bit Error Rate (BER) for TWTA and SSPA amplifiers.
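As a rough illustration of the idea (not the authors' network or training setup), the sketch below trains a small neural network as a pre-distorter for a memoryless Saleh TWTA model using the indirect learning architecture; the signal statistics, network size, and Saleh coefficients are assumptions. An HPA with memory effects would require a network with input taps or a memory-polynomial structure.

```python
import numpy as np
import torch
import torch.nn as nn

rng = np.random.default_rng(0)

def saleh_twta(x):
    """Memoryless Saleh TWTA model (classic textbook coefficients)."""
    aa, ba, ap, bp = 2.1587, 1.1517, 4.0033, 9.1040
    r, ph = np.abs(x), np.angle(x)
    return (aa * r / (1 + ba * r**2)) * np.exp(1j * (ph + ap * r**2 / (1 + bp * r**2)))

def to_iq(z):
    return torch.tensor(np.stack([z.real, z.imag], axis=1), dtype=torch.float32)

# Synthetic OFDM-like baseband samples (complex Gaussian), backed off from saturation
x = 0.25 * (rng.standard_normal(20000) + 1j * rng.standard_normal(20000))
y = saleh_twta(x)

# Indirect learning: train the network to map the HPA output back to its input,
# then place the trained network *before* the HPA as the pre-distorter.
net = nn.Sequential(nn.Linear(2, 32), nn.Tanh(), nn.Linear(32, 32), nn.Tanh(), nn.Linear(32, 2))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(300):
    opt.zero_grad()
    loss = nn.functional.mse_loss(net(to_iq(y)), to_iq(x))
    loss.backward()
    opt.step()

with torch.no_grad():
    z_iq = net(to_iq(x)).numpy()
z = z_iq[:, 0] + 1j * z_iq[:, 1]

def nmse_db(ref, out):
    g = np.vdot(ref, out) / np.vdot(ref, ref)       # best-fit complex gain
    return 10 * np.log10(np.mean(np.abs(out - g * ref) ** 2) / np.mean(np.abs(g * ref) ** 2))

print("NMSE without pre-distorter: %.1f dB" % nmse_db(x, saleh_twta(x)))
print("NMSE with pre-distorter:    %.1f dB" % nmse_db(x, saleh_twta(z)))
```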

Keywords: satellites, 5G, neural networks, HPA, TWTA, SSPA, EVM, NMSE, ACPR

Procedia PDF Downloads 91
2937 The Role of Diallyl Trisulfide as a Suppressor in Activated Platelet-Induced Human Breast Cancer MDA-MB-435s Cell Hematogenous Metastasis

Authors: Yuping Liu, Li Tao, Yin Lu

Abstract:

Accumulating evidence has shown that diallyl trisulfide (DATS) from garlic may reduce the risk of developing several types of cancer. In view of the dynamic crosstalk between tumor cells and platelets in hematogenous metastasis, we demonstrate the effectiveness of DATS against the metastatic behaviors of the MDA-MB-435s human breast cancer cell line co-incubated with activated platelets. Our data showed that DATS significantly blocked PAF-induced platelet function, with a consequent decrease in TXB2 production. DATS was found to dose-dependently suppress MDA-MB-435s cell migration and invasion in the presence of PAF-activated platelets in vitro. Furthermore, the expression, secretion, and enzymatic activity of matrix metalloproteinase (MMP)-2/9, as well as the luciferase activity of the upstream regulator NF-κB in MDA-MB-435s cells, were markedly diminished by DATS. In parallel, DATS blocked the upstream NF-κB activation signaling mediated by extracellular signal-regulated kinase (ERK), as assessed by measuring the levels of its phosphorylated form.

Keywords: DATS, ERK, metastasis, MMPs, NF-κB, platelet

Procedia PDF Downloads 386
2936 A Prospective Study on the Efficacy of Mesenchymal Stem Cells in Intervertebral Disc Regeneration

Authors: Prabhu Thangaraju, Manoj Deepak, A. Sivakumar

Abstract:

Removal of the intervertebral disc along with spinal fusion has many disadvantages, such as causing stress fractures. If it were possible to regenerate the disc, the complications of surgery could be avoided and better results achieved. Our study involves the use of mesenchymal stem cells in regenerating discs. It included 10 patients who presented with degenerative disc disease in our hospital between 2008 and 2011. After adequate pre-operative checks, prepared mesenchymal stem cells were injected into the disc spaces. These patients had undergone conservative therapy for a minimum of six weeks before they were accepted into the study. They were followed up regularly for a minimum of 2 years with serial radiographs and MRI. Eight out of the 10 patients had complete reduction in pain. The T2-weighted MRI images in 9 out of the 10 patients showed a brighter signal compared to the previous images, which indicated an improvement in hydration levels. From this case series of 10 patients treated with mesenchymal cell therapy in our hospital, we conclude that the use of mesenchymal cells in the treatment of intervertebral disc degeneration is a safe and effective option.

Keywords: mesenchymal stem cells, intervertebral disc, the spine, disc degeneration

Procedia PDF Downloads 371
2935 PID Sliding Mode Control with Sliding Surface Dynamics-Based Continuous Control Action for Robotic Systems

Authors: Wael M. Elawady, Mohamed F. Asar, Amany M. Sarhan

Abstract:

This paper adopts a continuous sliding mode control scheme for trajectory tracking control of robot manipulators with structured and unstructured uncertain dynamics and external disturbances. In this algorithm, the equivalent control of the conventional sliding mode control is replaced by a PID control action. Moreover, the discontinuous switching control signal is replaced by a continuous proportional-integral (PI) control term, such that the implementation of the proposed control algorithm does not require prior knowledge of the bounds of the unknown uncertainties and external disturbances and completely eliminates the chattering phenomenon of the conventional sliding mode control approach. The closed-loop system with the adopted control algorithm has been proved to be globally stable using Lyapunov stability theory. Numerical simulations using the dynamical model of robot manipulators with modeling uncertainties demonstrate the superiority and effectiveness of the proposed approach in high-speed trajectory tracking problems.
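The toy simulation below sketches the flavor of such a controller on a single uncertain joint: the switching term sign(s) is replaced by a continuous PI action on the sliding variable, and a PID term stands in for the equivalent control. All gains, the plant parameters, and the disturbance are illustrative assumptions, not the paper's setup.

```python
import numpy as np

# Single uncertain joint: J*q'' + b*q' + mgl*sin(q) = u + d(t)
J_true, b_true, mgl_true = 1.2, 0.6, 4.0      # actual plant
J_nom,  b_nom,  mgl_nom  = 1.0, 0.5, 3.5      # controller's nominal model

dt = 1e-3
t = np.arange(0.0, 5.0, dt)
w = 2 * np.pi * 0.5                           # 0.5 Hz reference trajectory
qd, qd_dot, qd_ddot = 0.5 * np.sin(w * t), 0.5 * w * np.cos(w * t), -0.5 * w**2 * np.sin(w * t)

Kp, Ki, Kd = 40.0, 20.0, 8.0                  # PID action replacing the equivalent control
lam = 10.0                                    # sliding-surface slope
kp_s, ki_s = 15.0, 30.0                       # continuous PI reaching term on s (no sign(s))

q, q_dot, e_int, s_int = 0.2, 0.0, 0.0, 0.0
for k in range(len(t)):
    e, e_dot = qd[k] - q, qd_dot[k] - q_dot
    e_int += e * dt
    s = e_dot + lam * e                       # sliding variable
    s_int += s * dt
    u = (J_nom * qd_ddot[k] + b_nom * q_dot + mgl_nom * np.sin(q)   # nominal feedforward
         + Kp * e + Ki * e_int + Kd * e_dot                         # PID term
         + kp_s * s + ki_s * s_int)                                 # continuous PI reaching law
    d = 2.0 * np.sin(5 * t[k])                # unknown external disturbance
    q_ddot = (u + d - b_true * q_dot - mgl_true * np.sin(q)) / J_true
    q_dot += q_ddot * dt
    q += q_dot * dt

print("final tracking error:", qd[-1] - q)
```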

Keywords: PID, robot, sliding mode control, uncertainties

Procedia PDF Downloads 508
2934 Central African Republic Government Recruitment Agency Based on Identity Management and Public Key Encryption

Authors: Koyangbo Guere Monguia Michel Alex Emmanuel

Abstract:

In e-government, and especially in recruitment, much research has been conducted to build trustworthy and reliable online application systems capable of processing users' or job applicants' files. In this research (Government Recruitment Agency), cloud computing, identity management, and public key encryption are used to manage domains, provide an access control and authorization mechanism, and secure the data exchange between entities for a reliable file-processing procedure.
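As a minimal sketch of the public-key piece (using the Python cryptography package; the key size, padding choice, and the idea of encrypting with the agency's public key are assumptions, not the paper's exact protocol):

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Each entity holds an RSA key pair; an applicant's file (or a session key for it)
# is encrypted with the agency's public key so only the agency can read it in transit.
agency_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
agency_pub = agency_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

ciphertext = agency_pub.encrypt(b"applicant file digest: diploma.pdf ...", oaep)
plaintext = agency_key.decrypt(ciphertext, oaep)
print(plaintext)
```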

Keywords: cloud computing network, identity management systems, public key encryption, access control and authorization

Procedia PDF Downloads 358
2933 Design and Implementation of an Image Based System to Enhance the Security of ATM

Authors: Seyed Nima Tayarani Bathaie

Abstract:

In this paper, an image-receiving system was designed and implemented through optimization of object detection algorithms using Haar features. The optimized algorithm performed face detection and eye detection separately; cascading the two then yielded a clear image of the user. Utilizing this feature improved security and helped prevent fraud, since services are provided only on condition that a clear image of the user's face has already been captured, which excludes inappropriate persons. To expedite processing and eliminate unnecessary computation, the input image was compressed, a motion detection function was included in the program, and the detection window size was constrained.
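A minimal sketch of the cascaded face-then-eye check with OpenCV's pretrained Haar cascades is shown below; the file name, thresholds, and the "two visible eyes" rule are illustrative assumptions rather than the authors' exact implementation.

```python
import cv2

# OpenCV's pretrained Haar cascades for face and eye detection
face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

frame = cv2.imread("atm_camera_frame.jpg")          # hypothetical captured frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5, minSize=(80, 80))
clear_face = False
for (x, y, w, h) in faces:
    roi = gray[y:y + h, x:x + w]                    # cascade the detectors: eyes inside the face ROI
    eyes = eye_cascade.detectMultiScale(roi, scaleFactor=1.1, minNeighbors=5)
    if len(eyes) >= 2:                              # both eyes visible -> face not occluded
        clear_face = True

print("Service allowed" if clear_face else "Face not clearly visible, service denied")
```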

Keywords: face detection algorithm, Haar features, security of ATM

Procedia PDF Downloads 419
2932 Readout Development of an LGAD-Based Hybrid Detector for Microdosimetry (HDM)

Authors: Pierobon Enrico, Missiaggia Marta, Castelluzzo Michele, Tommasino Francesco, Ricci Leonardo, Scifoni Emanuele, Vincezo Monaco, Boscardin Maurizio, La Tessa Chiara

Abstract:

Clinical outcomes collected over the past three decades have suggested that ion therapy has the potential to be a treatment modality superior to conventional radiation for several types of cancer, including recurrences, as well as for other diseases. Although the results have been encouraging, numerous treatment uncertainties remain a major obstacle to the full exploitation of particle radiotherapy. To overcome therapy uncertainties and optimize treatment outcome, the best possible description of radiation quality, linking the physical dose to biological effects, is of paramount importance. Microdosimetry was developed as a tool to improve the description of radiation quality. By recording the energy deposition at the micrometric scale (the typical size of a cell nucleus), this approach takes into account the non-deterministic nature of atomic and nuclear processes and creates a direct link between the dose deposited by radiation and the biological effect induced. Microdosimeters measure the spectrum of lineal energy y, defined as the energy deposition in the detector divided by the most probable track length travelled by the radiation. The latter is provided by the so-called "Mean Chord Length" (MCL) approximation and is related to the detector geometry. To improve the characterization of the radiation field quality, we define a new quantity replacing the MCL with the actual particle track length inside the microdosimeter. In order to measure this new quantity, we propose a two-stage detector consisting of a commercial Tissue Equivalent Proportional Counter (TEPC) and 4 layers of Low Gain Avalanche Detector (LGAD) strips. The TEPC records the energy deposition in a region equivalent to 2 um of tissue, while the LGADs are very suitable for particle tracking because their thickness can be thinned down to tens of micrometers and they respond quickly to ionizing radiation. The concept of HDM has been investigated and validated with Monte Carlo simulations. Currently, a dedicated readout is under development. This two-stage detector requires two different systems whose complementary information must be joined for each event: the energy deposition in the TEPC and the respective track length recorded by the LGAD tracker. This challenge is being addressed by implementing SoC (System on Chip) technology, relying on Field Programmable Gate Arrays (FPGAs) based on the Zynq architecture. The TEPC readout consists of three different signal amplification legs and is carried out with 3 ADCs mounted on an FPGA board. The LGAD strip signals are processed by dedicated chips, and finally the activated-strip information is stored, again relying on FPGA-based solutions. In this work, we will provide a detailed description of the HDM geometry and the SoC solutions that we are implementing for the readout.
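To illustrate the quantity being replaced, the snippet below contrasts lineal energy computed with the MCL approximation (for a spherical 2 um site, Cauchy's formula 4V/S gives 2d/3) against the same energy depositions divided by a measured track length; all numerical values are hypothetical.

```python
import numpy as np

# Lineal energy y = epsilon / l_bar, with l_bar the mean chord length (MCL).
# For a convex site, Cauchy's formula gives MCL = 4V/S; for a sphere of
# diameter d this reduces to 2d/3 (here d = 2 um of tissue-equivalent gas).
d_um = 2.0
mcl_um = 2.0 * d_um / 3.0

eps_keV = np.array([1.2, 3.4, 0.8, 5.1])       # hypothetical energy depositions (keV)
track_um = np.array([1.9, 1.1, 0.6, 1.7])      # hypothetical track lengths from the LGAD tracker

y_mcl = eps_keV / mcl_um        # conventional microdosimetric lineal energy (keV/um)
y_track = eps_keV / track_um    # HDM-style quantity using the measured track length
print(y_mcl, y_track)
```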

Keywords: particle tracking, ion therapy, low gain avalanche diode, tissue equivalent proportional counter, microdosimetry

Procedia PDF Downloads 175
2931 Performance Analysis of M-Ary Pulse Position Modulation in Multihop Multiple Input Multiple Output-Free Space Optical System over Uncorrelated Gamma-Gamma Atmospheric Turbulence Channels

Authors: Hechmi Saidi, Noureddine Hamdi

Abstract:

The performance of a Decode-and-Forward (DF) multihop Free Space Optical (FSO) scheme deploying a Multiple Input Multiple Output (MIMO) configuration under the Gamma-Gamma (GG) statistical distribution and adopting M-ary Pulse Position Modulation (MPPM) is investigated. Exact and approximate values of the Symbol Error Rate (SER) are derived. A closed-form formula for the Probability Density Function (PDF) of the designed system is expressed. Thanks to the use of the DF multihop MIMO FSO configuration and MPPM signaling, atmospheric turbulence is combatted and the transmitted signal quality is improved.
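A small Monte Carlo sketch of the channel model is given below: it draws unit-mean Gamma-Gamma irradiance samples and estimates the SER of M-ary PPM over a single hop with a single aperture (no multihop or MIMO combining); the turbulence parameters and the noise-scaling convention are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def gamma_gamma(alpha, beta, n):
    """Gamma-Gamma irradiance samples with unit mean (product of two Gamma r.v.)."""
    return rng.gamma(alpha, 1.0 / alpha, n) * rng.gamma(beta, 1.0 / beta, n)

# Monte Carlo SER of M-ary PPM with IM/DD over one GG hop
M, n_sym, snr_db = 4, 200_000, 14
sigma = np.sqrt(M / (2 * 10 ** (snr_db / 10)))      # noise scaling, an illustrative convention
I = gamma_gamma(alpha=4.0, beta=1.9, n=n_sym)       # moderate turbulence parameters (assumed)

tx_slot = rng.integers(0, M, n_sym)
slots = sigma * rng.standard_normal((n_sym, M))     # AWGN in every time slot
slots[np.arange(n_sym), tx_slot] += I               # optical pulse scaled by the fading irradiance
ser = np.mean(np.argmax(slots, axis=1) != tx_slot)  # maximum-slot detection
print(f"simulated SER ~ {ser:.3e}")
```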

Keywords: free space optical, multiple input multiple output, M-ary pulse position modulation, multihop, decode and forward, symbol error rate, gamma-gamma channel

Procedia PDF Downloads 199
2930 Grey Prediction of Atmospheric Pollutants in Shanghai Based on GM(1,1) Model Group

Authors: Diqin Qi, Jiaming Li, Siman Li

Abstract:

Based on the use of the three-point smoothing method for selectively processing the original data series, this paper establishes a group of grey GM(1,1) models to predict the concentration ranges of four major air pollutants in Shanghai from 2023 to 2024. The results indicate that PM₁₀, SO₂, and NO₂ maintain the national Grade I standards, while the concentration of PM₂.₅ has decreased but still remains within the national Grade II standards. Based on the forecast results, recommendations are provided for the Shanghai municipal government's efforts in air pollution prevention and control.
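A compact sketch of the two ingredients named above, three-point smoothing (one common form of it) followed by a classic GM(1,1) fit-and-forecast, is given below; the pollutant series is hypothetical, not the paper's data.

```python
import numpy as np

def smooth3(x):
    """Three-point smoothing commonly applied before grey modelling (one common form)."""
    y = x.astype(float).copy()
    y[1:-1] = (x[:-2] + 2 * x[1:-1] + x[2:]) / 4.0
    y[0] = (3 * x[0] + x[1]) / 4.0
    y[-1] = (x[-2] + 3 * x[-1]) / 4.0
    return y

def gm11_forecast(x0, horizon):
    """Classic GM(1,1): fit on x0 and return fitted values plus `horizon` forecasts."""
    x1 = np.cumsum(x0)                                   # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])                        # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]     # development coefficient, grey input
    k = np.arange(len(x0) + horizon)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    return np.concatenate([[x0[0]], np.diff(x1_hat)])    # inverse accumulation

# Hypothetical annual mean PM2.5 concentrations (ug/m3), not the paper's data
pm25 = np.array([42.0, 39.0, 36.0, 32.0, 30.0, 27.0])
print(gm11_forecast(smooth3(pm25), horizon=2)[-2:])      # rough two-year-ahead forecast
```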

Keywords: atmospheric pollutant prediction, grey GM(1,1) model group, three-point smoothing method

Procedia PDF Downloads 35
2929 Measurement and Analysis of Building Penetration Loss for Mobile Networks in Tripoli Area

Authors: Tammam A. Benmusa, Mohamed A. Shlibek, Rawad M. Swesi

Abstract:

The investigation of the Building Penetration Loss (BPL) of radio signals is becoming more and more important. It plays an important role in calculating indoor coverage for wireless communication networks. In this paper, the theory behind BPL and its mechanisms are reviewed. The operating frequency, coverage area type, climate condition, time of measurement, and other factors affecting the values of BPL are discussed. The practical part of this work consisted of conducting 4000 measurements of BPL in different areas of the Libyan capital, Tripoli, to obtain an empirical model for this loss. The measurements were taken for two different types of wireless communication networks: the mobile telephone network (Almadar company), which operates at 900 MHz, and the WiMAX network (LTT company), which operates at 2500 MHz. The results for each network were summarized and presented in several graphs. The graphs show how the BPL is affected by the time of measurement, morphology (type of area), and climatic environment.
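In its simplest form, BPL at a location is the difference between an outdoor reference level and the indoor received level; a tiny sketch of that bookkeeping (with hypothetical dBm readings) is:

```python
import numpy as np

# Paired outdoor reference and indoor received levels at the same location (hypothetical, dBm)
outdoor_dbm = np.array([-62.0, -61.5, -63.2, -60.8])
indoor_dbm = np.array([-78.5, -80.1, -84.3, -76.9])

bpl_db = outdoor_dbm - indoor_dbm          # building penetration loss per location
print(f"mean BPL = {bpl_db.mean():.1f} dB, std = {bpl_db.std(ddof=1):.1f} dB")
```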

Keywords: building penetration loss, wireless network, mobile network, link budget, indoor network performance

Procedia PDF Downloads 384
2928 Effect of High-Pressure and Thermal Treatments on Quality Markers of Strawberry Nectars

Authors: Karen Louise Lacey, Dario Javier Pavon Vargas, Massimiliano Rinaldi, Luca Cattani, Sara Rainieri

Abstract:

The effects of high-pressure processing (HPP) and thermal treatment (TT) on quality markers of strawberry nectar (12 °Brix, pH 3.3) were studied before and after treatment. Both TT and HPP ensured a 3-log inactivation of aerobic bacteria. No significant difference was detected in terms of pH and °Brix. TT samples were less red (a* less positive) than all HPP-treated samples, while all samples were less red than the control. Apparent viscosity was significantly increased in all the HPP treatments: at a shear rate of 10 1/s, the control was 79.04±7.94 mPa·s and the 600 MPa-20 min treatment was 327.10±1.64 mPa·s. This work suggests that HPP treatments may better maintain the quality markers of strawberry nectar.

Keywords: HPP, strawberry nectar, colour, viscosity

Procedia PDF Downloads 130
2927 Consumer Load Profile Determination with Entropy-Based K-Means Algorithm

Authors: Ioannis P. Panapakidis, Marios N. Moschakis

Abstract:

With the continuous increase of smart meter installations across the globe, the need for processing the load data is evident. Clustering-based load profiling builds on unsupervised machine learning tools to formulate the typical load curves, or load profiles. The most commonly used algorithm in the load profiling literature is K-means. While the algorithm has been successfully tested in a variety of applications, its drawback is a strong dependence on the initialization phase. This paper proposes a novel modified form of K-means that addresses the aforementioned problem. Simulation results indicate the superiority of the proposed algorithm compared to the standard K-means.
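For context, the sketch below runs the plain K-means load-profiling baseline on synthetic daily load curves with scikit-learn; the entropy-based initialization that constitutes the paper's contribution is not reproduced here, and the data shapes are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import normalize

# Hypothetical smart-meter data: 150 consumers x 24 hourly readings, built from
# three archetypal daily shapes (morning, midday, evening peaks) plus noise.
rng = np.random.default_rng(0)
hours = np.arange(24)
shapes = np.vstack([np.exp(-0.5 * ((hours - peak) / 3.0) ** 2) for peak in (8, 13, 19)])
X = np.repeat(shapes, 50, axis=0) + 0.05 * rng.standard_normal((150, 24))
X = normalize(X)                               # cluster on curve shape, not consumption magnitude

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
profiles = km.cluster_centers_                 # the typical load curves (load profiles)
print(profiles.shape, np.bincount(km.labels_))
```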

Keywords: clustering, load profiling, load modeling, machine learning, energy efficiency and quality

Procedia PDF Downloads 164
2926 Reasons for Food Losses and Waste in Basic Production of Meat Sector in Poland

Authors: Sylwia Laba, Robert Laba, Krystian Szczepanski, Mikolaj Niedek, Anna Kaminska-Dworznicka

Abstract:

Meat and meat products are considered the food products with the most unfavorable effect on the environment, which requires rational management of these products and of the waste originating throughout the whole chain of manufacture, processing, transport, and trade of meat. From the economic and environmental viewpoints, it is important to limit food losses and waste in the whole meat sector. The basic production link includes obtaining the raw meat, i.e., animal breeding, management, and transport of the animals to the slaughterhouse. Food is any substance or product intended to be consumed by humans. For the needs of the present study, it was determined when the raw material is considered food: the moment when the animals are prepared for loading with the aim of being transported to a slaughterhouse and used for food purposes. The aim of the study was to determine the reasons for loss generation in the basic production of the meat sector in Poland during the years 2017-2018. The studies on food losses and waste in basic production of the meat sector were carried out in two areas: red meat (i.e., pork and beef) and poultry meat. The basic production studies were conducted in the period March-May 2019 across the whole country on a representative sample of 278 farms, including 102 in pork production, 55 in beef production, and 121 in poultry meat production. The surveys were carried out using questionnaires by the PAPI (Paper & Pen Personal Interview) method; the pollsters conducted direct questionnaire interviews. The results indicate that no losses were recorded during the preparation, loading, and transport of the animals to the slaughterhouse in 33% of the visited farms. On the farms where losses were indicated, crushing and suffocation occurring during the production of pigs, beef cattle, and poultry were the main reasons for these losses, constituting ca. 40% of the reported causes. The stress generated by loading and transport accounted for 16-17% of the reported causes, depending on the season of the year. In poultry production, an additional 10.7% of losses in 2017 and 11.8% in 2018 were caused by inappropriate conditions of loading and transportation. Diseases were one of the reasons for the losses in pork and beef production (7% of the losses). The losses and waste generated during livestock production and in meat processing and trade cannot be managed or recovered; they have to be disposed of. It is therefore important to prevent and minimize the losses throughout the whole production chain. It is possible to introduce appropriate measures, connected mainly with the appropriate conditions and methods of animal loading and transport.

Keywords: food losses, food waste, livestock production, meat sector

Procedia PDF Downloads 144
2925 Assessing Supply Chain Performance through Data Mining Techniques: A Case of Automotive Industry

Authors: Emin Gundogar, Burak Erkayman, Nusret Sazak

Abstract:

Providing effective performance management throughout the whole supply chain is a critical issue and hard to implement. Proper evaluation of integrated data can yield accurate information. Analysing supply chain data through OLAP (On-Line Analytical Processing) technologies can provide a multi-angle, consolidated view of operations. In this study, association rules and classification techniques are applied to measure the supply chain performance metrics of an automotive manufacturer in Turkey. The main criteria and important rules are determined, and a comparison of the results of the algorithms is presented.
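The snippet below sketches the association-rule part of such an analysis with mlxtend's Apriori implementation on a made-up one-hot table of delivery events; the column names, thresholds, and data are assumptions, not the manufacturer's metrics.

```python
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

# Hypothetical one-hot table: each row is a delivery, columns are binary performance events
data = pd.DataFrame({
    "late_supplier": [1, 1, 0, 1, 0, 1, 0, 1],
    "line_stoppage": [1, 1, 0, 1, 0, 0, 0, 1],
    "quality_claim": [0, 1, 0, 1, 0, 0, 1, 1],
    "express_freight": [1, 1, 0, 1, 0, 1, 0, 0],
}).astype(bool)

frequent = apriori(data, min_support=0.3, use_colnames=True)   # frequent itemsets
rules = association_rules(frequent, metric="confidence", min_threshold=0.7)
print(rules[["antecedents", "consequents", "support", "confidence", "lift"]])
```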

Keywords: supply chain performance, performance measurement, data mining, automotive

Procedia PDF Downloads 513
2924 Management Information System to Help Managers for Providing Decision Making in an Organization

Authors: Ajayi Oluwasola Felix

Abstract:

Management information systems (MIS) provide information for the managerial activities in an organization. The main purpose of this research is to show that MIS provides the accurate and timely information necessary to facilitate the decision-making process and to enable the organization's planning, control, and operational functions to be carried out effectively. A management information system is basically concerned with processing data into information, which is then communicated to the various departments in an organization for appropriate decision-making. MIS is a subset of the overall planning and control activities covering the application of humans, technologies, and procedures in the organization. The information system is the mechanism that ensures information is available to the managers in the form they want it and when they need it.

Keywords: Management Information Systems (MIS), information technology, decision-making, MIS in Organizations

Procedia PDF Downloads 557
2923 3 Phase Induction Motor Control Using Single Phase Input and GSM

Authors: Pooja S. Billade, Sanjay S. Chopade

Abstract:

This paper focuses on the design of three-phase induction motor control using a single-phase input and GSM. The controller used in this work implements wireless speed control using a GSM technique that proves to be very efficient and reliable in applications. The most common principle is the constant V/Hz principle, which requires that the magnitude and frequency of the voltage applied to the stator of a motor maintain a constant ratio. By doing this, the magnitude of the magnetic field in the stator is kept at an approximately constant level throughout the operating range, and thus maximum constant torque-producing capability is maintained. The energy that a switching power converter delivers to the motor is controlled by Pulse Width Modulated (PWM) signals applied to the gates of the power transistors in an H-bridge configuration. PWM signals are pulse trains with fixed frequency and magnitude and variable pulse width. When a PWM signal is applied to the gate of a power transistor, it causes the turn-on and turn-off intervals of the transistor to change from one PWM period to the next.
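The arithmetic behind the constant V/Hz law and the sinusoidal PWM duty cycle can be sketched as below; the motor ratings, boost voltage, and switching frequency are illustrative assumptions.

```python
import numpy as np

V_RATED, F_RATED = 400.0, 50.0          # hypothetical motor ratings (V, Hz)
V_BOOST = 15.0                          # low-speed voltage boost (assumed)
F_SW = 5000.0                           # PWM switching frequency (assumed, Hz)

def v_per_hz(f_cmd):
    """Constant V/Hz law: stator voltage magnitude tracks the commanded frequency."""
    return min(V_BOOST + (V_RATED - V_BOOST) * f_cmd / F_RATED, V_RATED)

def spwm_duty(f_cmd, t):
    """Sinusoidal PWM duty cycle for one H-bridge leg, updated once per switching period."""
    m = v_per_hz(f_cmd) / V_RATED       # modulation index (0..1)
    return 0.5 * (1.0 + m * np.sin(2 * np.pi * f_cmd * t))

t = np.arange(0, 0.04, 1 / F_SW)        # one 25 Hz fundamental period sampled at F_SW
print(v_per_hz(25.0), spwm_duty(25.0, t)[:5])
```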

Keywords: PIC, GSM (Global System for Mobile Communications), LCD (liquid crystal display), IM (induction motor)

Procedia PDF Downloads 448
2922 Classifications of Sleep Apnea (Obstructive, Central, Mixed) and Hypopnea Events Using Wavelet Packet Transform and Support Vector Machines (SVM)

Authors: Benghenia Hadj Abd El Kader

Abstract:

Sleep apnea events, whether obstructive, central, mixed, or hypopnea, are characterized by frequent breathing cessations or reductions in upper airflow during sleep. An advanced method for analyzing the patterning of biomedical signals to recognize obstructive sleep apnea and hypopnea is presented. With the aim of extracting characteristic parameters for classifying the above-stated (obstructive, central, mixed) sleep apnea and hypopnea events, the proposed method is based first on the analysis of polysomnography signals such as the electrocardiogram (ECG) and the electromyogram (EMG), and then on the classification of the (obstructive, central, mixed) sleep apnea and hypopnea events. The analysis is carried out using the wavelet transform technique in order to extract characteristic parameters, whereas classification is carried out by applying the SVM (support vector machine) technique. The obtained results show good recognition rates using these characteristic parameters.
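A minimal sketch of the feature-extraction-plus-classification chain (wavelet packet energies fed to an SVM) is shown below with PyWavelets and scikit-learn; the segment length, wavelet, decomposition level, and the random stand-in signals and labels are assumptions.

```python
import numpy as np
import pywt
from sklearn.svm import SVC

def wp_energy_features(signal, wavelet="db4", level=4):
    """Relative energy of the terminal wavelet-packet nodes of a 1-D signal segment."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, mode="symmetric", maxlevel=level)
    energies = np.array([np.sum(node.data ** 2) for node in wp.get_level(level, order="freq")])
    return energies / energies.sum()

# Random stand-ins for 30 s ECG/EMG segments (one row each) with labels
# 0=normal, 1=obstructive, 2=central, 3=mixed, 4=hypopnea; real polysomnography
# segments would replace these arrays.
rng = np.random.default_rng(0)
segments = rng.standard_normal((100, 3000))
labels = rng.integers(0, 5, 100)

X = np.vstack([wp_energy_features(s) for s in segments])
clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X[:80], labels[:80])
print("held-out accuracy:", clf.score(X[80:], labels[80:]))
```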

Keywords: obstructive, central, mixed, sleep apnea, hypopnea, ECG, EMG, wavelet transform, SVM classifier

Procedia PDF Downloads 371
2921 Enhancing Scalability in Ethereum Network Analysis: Methods and Techniques

Authors: Stefan K. Behfar

Abstract:

The rapid growth of the Ethereum network has brought forth the urgent need for scalable analysis methods to handle the increasing volume of blockchain data. In this research, we propose efficient methodologies for making Ethereum network analysis scalable. Our approach leverages a combination of graph-based data representation, probabilistic sampling, and parallel processing techniques to achieve unprecedented scalability while preserving critical network insights. Data Representation: We develop a graph-based data representation that captures the underlying structure of the Ethereum network. Each block transaction is represented as a node in the graph, while the edges signify temporal relationships. This representation ensures efficient querying and traversal of the blockchain data. Probabilistic Sampling: To cope with the vastness of the Ethereum blockchain, we introduce a probabilistic sampling technique. This method strategically selects a representative subset of transactions and blocks, allowing for concise yet statistically significant analysis. The sampling approach maintains the integrity of the network properties while significantly reducing the computational burden. Graph Convolutional Networks (GCNs): We incorporate GCNs to process the graph-based data representation efficiently. The GCN architecture enables the extraction of complex spatial and temporal patterns from the sampled data. This combination of graph representation and GCNs facilitates parallel processing and scalable analysis. Distributed Computing: To further enhance scalability, we adopt distributed computing frameworks such as Apache Hadoop and Apache Spark. By distributing computation across multiple nodes, we achieve a significant reduction in processing time and enhanced memory utilization. Our methodology harnesses the power of parallelism, making it well-suited for large-scale Ethereum network analysis. Evaluation and Results: We extensively evaluate our methodology on real-world Ethereum datasets covering diverse time periods and transaction volumes. The results demonstrate its superior scalability, outperforming traditional analysis methods. Our approach successfully handles the ever-growing Ethereum data, empowering researchers and developers with actionable insights from the blockchain. Case Studies: We apply our methodology to real-world Ethereum use cases, including detecting transaction patterns, analyzing smart contract interactions, and predicting network congestion. The results showcase the accuracy and efficiency of our approach, emphasizing its practical applicability in real-world scenarios. Security and Robustness: To ensure the reliability of our methodology, we conduct thorough security and robustness evaluations. Our approach demonstrates high resilience against adversarial attacks and perturbations, reaffirming its suitability for security-critical blockchain applications. Conclusion: By integrating graph-based data representation, GCNs, probabilistic sampling, and distributed computing, we achieve network scalability without compromising analytical precision. This approach addresses the pressing challenges posed by the expanding Ethereum network, opening new avenues for research and enabling real-time insights into decentralized ecosystems. Our work contributes to the development of scalable blockchain analytics, laying the foundation for sustainable growth and advancement in the domain of blockchain research and application.
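A toy version of the first two ingredients, the graph-based representation (transactions as nodes with temporal edges) and uniform probabilistic sampling, might look like the following with networkx; the decoded transactions and the simple consecutive-block edge rule are assumptions for illustration.

```python
import random
import networkx as nx

# Hypothetical decoded transactions: (sender, receiver, value in wei, block number)
transactions = [
    ("0xA1", "0xB2", 10**18, 17_000_001),
    ("0xB2", "0xC3", 5 * 10**17, 17_000_002),
    ("0xA1", "0xC3", 2 * 10**18, 17_000_002),
    ("0xC3", "0xD4", 10**17, 17_000_003),
]

# Each transaction becomes a node; edges encode temporal ordering between
# consecutive transactions (a simplification of the scheme described above).
G = nx.DiGraph()
for i, (sender, receiver, value, block) in enumerate(transactions):
    G.add_node(i, sender=sender, receiver=receiver, value=value, block=block)
for i in range(len(transactions) - 1):
    G.add_edge(i, i + 1)

# Uniform probabilistic sampling of nodes before running heavier analytics (e.g., a GCN)
random.seed(0)
sample = random.sample(list(G.nodes(data=True)), k=max(1, G.number_of_nodes() // 2))
print(G.number_of_nodes(), G.number_of_edges(), sample)
```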

Keywords: Ethereum, scalable network, GCN, probabilistic sampling, distributed computing

Procedia PDF Downloads 76
2920 Robust Medical Image Watermarking Using Frequency Domain and Least Significant Bits Algorithms

Authors: Volkan Kaya, Ersin Elbasi

Abstract:

Watermarking and steganography have gained importance recently because of copyright protection and authentication. In watermarking, we embed a stamp, logo, noise, or image into multimedia elements such as images, video, audio, animation, and text. Several works have been done in watermarking for different purposes. In this research work, we used watermarking techniques to embed patient information into medical magnetic resonance (MR) images. Two classes of methods were used: frequency domain (Discrete Wavelet Transform, DWT; Discrete Cosine Transform, DCT; and Discrete Fourier Transform, DFT) and spatial domain (Least Significant Bits, LSB). Experimental results show that embedding in the frequency domain resists one type of attack, while embedding in the spatial domain resists another group of attacks. Peak Signal-to-Noise Ratio (PSNR) and Similarity Ratio (SR) are the two measures used for testing, and both give very promising results for information hiding in medical MR images.
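As a minimal sketch of the spatial-domain branch, the snippet below embeds a short patient-information payload into the least significant bits of a grayscale image and reports the PSNR; the image and payload are synthetic stand-ins.

```python
import numpy as np

def embed_lsb(cover, bits):
    """Embed a bit array into the least significant bits of a grayscale image."""
    flat = cover.flatten().copy()
    flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | bits
    return flat.reshape(cover.shape)

def psnr(original, modified):
    mse = np.mean((original.astype(float) - modified.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

# Hypothetical 8-bit MR slice and patient-information payload
rng = np.random.default_rng(0)
mr_slice = rng.integers(0, 256, (256, 256), dtype=np.uint8)
payload = np.frombuffer(b"Patient: J. Doe, ID 12345", dtype=np.uint8)
bits = np.unpackbits(payload)

watermarked = embed_lsb(mr_slice, bits)
recovered = np.packbits(watermarked.flatten()[:len(bits)] & 1).tobytes()
print(psnr(mr_slice, watermarked), recovered)
```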

Keywords: watermarking, medical image, frequency domain, least significant bits, security

Procedia PDF Downloads 288
2919 Structured Cross System Planning and Control in Modular Production Systems by Using Agent-Based Control Loops

Authors: Simon Komesker, Achim Wagner, Martin Ruskowski

Abstract:

In times of volatile markets with fluctuating demand and the uncertainty of global supply chains, flexible production systems are the key to an efficient implementation of a desired production program. In this publication, the authors present a holistic information concept taking into account various influencing factors for operating towards the global optimum. Therefore, a strategy for the implementation of multi-level planning for a flexible, reconfigurable production system with an alternative production concept in the automotive industry is developed. The main contribution of this work is a system structure mixing central and decentralized planning and control, evaluated in a simulation framework. The information system structure in current production systems in the automotive industry is rigidly hierarchically organized in monolithic systems. The production program is created in a rule-based manner with the premise of achieving a uniform cycle time. This program then provides the information basis for execution in subsystems at the station and process execution level. In today's era of mixed-(car-)model factories, complex conditions and conflicts arise in achieving logistics, quality, and production goals. There is no provision for feedback loops that return results from the process execution level (resources) and the process-supporting (quality and logistics) systems for reconsideration in the planning systems. To enable a robust production flow, the complexity of production system control is artificially reduced by the line structure, which results, for example, in material-intensive processes (buffers and safety stocks; the two-container principle, also for different variants). The limited degrees of freedom of line production have produced the principle of progress-figure control, which results in one-time sequencing, sequential order release, and relatively inflexible capacity control. As a result, modularly structured production systems, such as modular production according to known approaches with more degrees of freedom, are currently difficult to represent in terms of information technology. The remedy is an information concept that supports cross-system and cross-level information processing for centralized and decentralized decision-making. Through an architecture of hierarchically organized but decoupled subsystems, the paradigm of hybrid control is used, and a holonic manufacturing system is offered, which enables flexible information provisioning and processing support. In this way, the influences from quality, logistics, and production processes can be linked holistically with the advantages of mixed centralized and decentralized planning and control. Modular production systems also require modularly networked information systems with semi-autonomous optimization for a robust production flow. Dynamic prioritization of different key figures between subsystems should lead the production system to an overall optimum. The tasks and goals of the quality, logistics, process, resource, and product areas in a cyber-physical production system are designed as an interconnected multi-agent system. The result is an alternative system structure that executes centralized process planning and decentralized processing. An agent-based manufacturing control is used to enable different flexibility and reconfigurability states and manufacturing strategies in order to find optimal partial solutions of subsystems that lead to a near-global optimum for hybrid planning. This allows robust, near-to-plan execution with integrated quality control and intralogistics.

Keywords: holonic manufacturing system, modular production system, planning, and control, system structure

Procedia PDF Downloads 169
2918 About the Case Portfolio Management Algorithms and Their Applications

Authors: M. Chumburidze, N. Salia, T. Namchevadze

Abstract:

This work deals with case processing problems in business. The task of strategic management of credit requirements in a case portfolio is discussed. An information model of credit requirements in a binary tree diagram is considered. Algorithms to solve the problem of prioritizing clusters of cases in business have been investigated. An implementation of priority queues to support case management operations is presented. The corresponding pseudocodes for the programming application have been constructed. The tools applied in this development are based on binary tree ordering algorithms, optimization theory, and business management methods.
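A minimal Python sketch of the priority-queue support mentioned above (case identifiers and priority scores are hypothetical):

```python
import heapq

# Cases are popped in order of a priority score (lower value = served first),
# as in the cluster-prioritization step described in the text.
queue = []
heapq.heappush(queue, (2, "case-1043: restructuring request"))
heapq.heappush(queue, (1, "case-0987: overdue credit requirement"))
heapq.heappush(queue, (3, "case-1102: routine review"))

while queue:
    priority, case = heapq.heappop(queue)
    print(priority, case)
```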

Keywords: credit network, case portfolio, binary tree, priority queue, stack

Procedia PDF Downloads 150
2917 An Analysis of Privacy and Security for Internet of Things Applications

Authors: Dhananjay Singh, M. Abdullah-Al-Wadud

Abstract:

The Internet of Things (IoT) is a concept of a large-scale ecosystem of wireless actuators. The actuators are defined as the things in the IoT, i.e., those which contribute or produce data for the ecosystem. However, ubiquitous data collection, data security, privacy preservation, large-volume data processing, and intelligent analytics are some of the key challenges in IoT technologies. To address the security requirements, challenges, and threats in the IoT, we discuss a message authentication mechanism for IoT applications. Finally, we discuss a data encryption mechanism for message authentication before messages propagate into IoT networks.
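One standard way to realize such message authentication is a shared-key HMAC; the sketch below uses Python's hmac module, with key distribution left out of scope and the payload format assumed.

```python
import hmac
import hashlib
import os

# Shared-key message authentication for a device reading before it enters the network.
# Key distribution itself is out of scope here; the key below is a stand-in.
key = os.urandom(32)

def protect(payload: bytes) -> bytes:
    tag = hmac.new(key, payload, hashlib.sha256).digest()
    return payload + tag                      # append the 32-byte authentication tag

def verify(message: bytes) -> bool:
    payload, tag = message[:-32], message[-32:]
    return hmac.compare_digest(tag, hmac.new(key, payload, hashlib.sha256).digest())

msg = protect(b'{"device":"t-17","temp":21.4}')
print(verify(msg), verify(msg[:-1] + b"\x00"))   # True for intact, False for tampered
```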

Keywords: Internet of Things (IoT), message authentication, privacy, security

Procedia PDF Downloads 382
2916 Development of Technologies for Biotransformation of Aquatic Biological Resources for the Production of Functional, Specialized, Therapeutic, Preventive, and Microbiological Products

Authors: Kira Rysakova, Vitaly Novikov

Abstract:

An improved method of obtaining enzymatic collagen hydrolysate from the tissues of marine hydrobionts is proposed, which allows the hydrolysate to be obtained without pre-isolation of pure collagen. The method can be used to isolate enzymatic collagen hydrolysate from the waste of industrial processing of red king crab and from non-traditional objects such as marine holothurians. Comparative analysis of the collagen hydrolysates has shown the possibility of their use in a number of nutrient media, but this requires additional optimization of their composition and biological testing on wide panels of test strains of microorganisms.

Keywords: collagen hydrolysate, marine hydrobionts, red king crab, marine holothurias, enzymes, exclusive HPLC

Procedia PDF Downloads 169
2915 The Platform for Digitization of Georgian Documents

Authors: Erekle Magradze, Davit Soselia, Levan Shughliashvili, Irakli Koberidze, Shota Tsiskaridze, Victor Kakhniashvili, Tamar Chaghiashvili

Abstract:

Since the beginning of active publishing in Georgia, voluminous printed material has accumulated, and its digitization is an important task. Digitized materials will be available to the audience, and it will be possible to search their text and conduct various factual research. Digitizing documents means scanning them, extracting text from the scanned images, and processing the text with a corresponding language model to detect inaccuracies and grammatical errors. Implementing these stages requires a unified, scalable, and automated platform, where the digital service developed for each stage performs the task assigned to it; at the same time, it is possible to develop these services dynamically so that there is no interruption in the work of the platform.
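The text-extraction stage can be sketched with pytesseract as below; this assumes the Tesseract engine and its Georgian language data ("kat") are installed, and the file name is a placeholder. The downstream grammar check with a BERT-style Georgian language model is only indicated in a comment.

```python
import pytesseract
from PIL import Image

def extract_text(path: str) -> str:
    # OCR a scanned page image using Tesseract's Georgian language pack
    return pytesseract.image_to_string(Image.open(path), lang="kat")

# Downstream, the extracted text would be passed to a Georgian language model
# (e.g., a BERT-style checker) to flag OCR inaccuracies and grammatical errors.
if __name__ == "__main__":
    print(extract_text("scanned_page_001.png"))
```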

Keywords: NLP, OCR, BERT, Kubernetes, transformers

Procedia PDF Downloads 144