Search results for: noise mapping
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2209

649 Implementation of a Quality Management Approach in the Laboratory of Quality Control and the Repression of Fraud (CACQE) of the Wilaya of Bechar

Authors: Khadidja Mebarki, Naceur Boussouar, Nabila Ihaddadene, M. Akermi

Abstract:

Food products are particularly sensitive because they concern the health of the consumer; whether from a health or a commercial point of view, this kind of product must be subjected to rigorous controls in order to prevent any fraud. Quality and safety are essential for food security, public health and economic development. Strengthening food control is essential to increase food security, which is considered achieved when all individuals can at any time access the safe and nutritious food they need to lead healthy and active lives. The objective of this project is to initiate a quality approach in the laboratories of quality control and the repression of fraud. It is directed towards the application of good laboratory practices, traceability, management of quality documents (quality manual, procedures and specifications) and quality audits, and prepares the ground for a possible accreditation of the Bechar laboratory to the ISO 17025 standard. The project took place in four main stages: 1- preparation of an audit grid; 2- realization of a quality audit according to the 5M method, completed by a section on quality documentation; 3- drafting of an audit report and proposal of recommendations; 4- implementation of corrective actions on the ground. This last step consisted of formalizing the cleaning and disinfection plan, working on good hygiene practices, establishing a map of the laboratory's processes with flow charts for each, classifying quality documents and formalizing the document management process. During the study period within the laboratory, nearly all facets of the work were observed, as we participated in the analyses performed within it.

Keywords: quality, management, ISO 17025 accreditation, GLP

Procedia PDF Downloads 493
648 Using an SMT Solver to Minimize Latency and Optimize the Number of Cores in NoC-DSP Architectures

Authors: Imen Amari, Kaouther Gasmi, Asma Rebaya, Salem Hasnaoui

Abstract:

The problem of scheduling and mapping data-flow applications onto multi-core architectures is notoriously difficult. This difficulty is related to the rapid evolution of telecommunication and multimedia systems, accompanied by a rapid increase in user requirements in terms of latency, execution time, power consumption, energy, etc. Obtaining an optimal schedule on multi-core DSP (Digital Signal Processor) platforms is a challenging task. In this context, we present a novel technique and algorithm for finding a valid schedule that optimizes the key performance metrics, particularly latency. Our contribution is based on Satisfiability Modulo Theories (SMT) solving technologies, which are strongly driven by industrial applications and needs. This paper describes a scheduling module integrated into our proposed workflow, which we advance as a successful approach for programming applications based on NoC-DSP platforms. The workflow automatically transforms a Simulink model into a synchronous dataflow (SDF) model. The automatic transformation, followed by SMT-solver scheduling, aims to minimize the final latency and other software/hardware metrics through an optimal schedule, and to find the optimal number of cores to be used. Our proposed workflow takes as its entry point a Simulink file (.mdl or .slx) derived from embedded Matlab functions, using an approach based on the synchronous and hierarchical behavior of both Simulink and SDF. Running the scheduler within this workflow, with our proposed SMT-solver algorithm refinements, produces the best possible schedule in terms of latency and number of cores.
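The objective the SMT solver encodes — find an assignment of data-flow actors to cores that minimizes latency first and the number of cores used second — can be illustrated with a small self-contained sketch. Here a brute-force search over hypothetical, independent task durations stands in for the SMT backend; a real implementation would express the same objective symbolically.

```python
from itertools import product

def best_schedule(durations, max_cores):
    """Exhaustively search task-to-core assignments, minimizing
    latency (makespan) first and core count second -- the same
    lexicographic objective an SMT optimizer would encode."""
    best = None  # (latency, cores_used, assignment)
    n = len(durations)
    for assign in product(range(max_cores), repeat=n):
        loads = [0] * max_cores
        for task in range(n):
            loads[assign[task]] += durations[task]
        latency = max(loads)       # makespan of this assignment
        cores_used = len(set(assign))
        if best is None or (latency, cores_used) < best[:2]:
            best = (latency, cores_used, assign)
    return best

# Four independent actors from a hypothetical SDF graph:
lat, cores, assign = best_schedule([4, 3, 2, 2], max_cores=3)
```

The exhaustive search is exponential in the number of actors, which is precisely why an SMT solver, with its pruning and theory reasoning, is attractive for realistic graphs.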

Keywords: multi-cores DSP, scheduling, SMT solver, workflow

Procedia PDF Downloads 268
647 Distribution of Gamma-Radiation Levels in Core Sediment Samples in Gulf of İzmir, Eastern Aegean Sea, Turkey

Authors: D. Kurt, İ. F. Barut, Z. Ü. Yümün, E. Kam

Abstract:

Since the industrial revolution, industrial plants and settlements have spread widely along sea coasts, and this concentration brings environmental pollution to the sea. This study focuses on the Gulf of İzmir, a fascinating natural gulf of the Eastern Aegean Sea located in the west of Turkey. Investigating recent marine sediment is extremely important for detecting pollution, and pollution of the marine environment by natural radionuclides is a significant environmental concern. Ground drilling cores (the depth of each sediment core varies) were collected from four locations in the Gulf of İzmir: Karşıyaka, İnciraltı, Çeşmealtı and Bayraklı. The sediment cores, weighing around 1 kg, were placed in preserving bags and dried at room temperature for a week to remove moisture. They were then passed through a sieve with 1 mm holes, and the powdered samples were transferred to 100 ml polyethylene Marinelli beakers. Each prepared sediment sample was left for 40 days to reach radioactive equilibrium between uranium and thorium. Gamma spectrometry measurements were performed using an HPGe (High-Purity Germanium) semiconductor detector. Semiconductor detectors have very good energy resolution and can easily distinguish peaks that lie close to each other; for this reason, gamma spectroscopy is commonly used to determine the activities of U-238, Th-232, Ra-226, Cs-137 and K-40 in Bq kg⁻¹. In this study, the average activity concentrations were found to be 2.2 ± 1.5, 0.98 ± 0.02, 8 ± 0.96, 0.93 ± 0.14, and 76.05 ± 0.93 Bq kg⁻¹, respectively. The outcomes of this study can be used as a benchmark for future research, and the data obtained would be useful for radiological mapping of the surveyed areas.

Keywords: gamma, Gulf of İzmir (Eastern Aegean Sea-Turkey), natural radionuclides, pollution

Procedia PDF Downloads 244
646 Cars in a Neighborhood: A Case of Sustainable Living in Sector 22 Chandigarh

Authors: Maninder Singh

Abstract:

Chandigarh is under the strain of exponential growth in car density across its neighborhoods. The consumerist nature of today's society is to blame for this menace: everyone wants to own and drive a car, car manufacturers are busy selling two or more cars per household, and the Regional Transport Offices issue as many licenses for new vehicles as they can in order to generate revenue in the form of road tax. Car traffic in the neighborhoods of Chandigarh has reached a tipping point. A more empirical and sustainable model of cars per household is needed, based on specific parameters of livable neighborhoods. Sector 22 in Chandigarh is one of the first residential sectors established in the city. There is scope to think, reflect, and work out a method for determining how many cars the city can accommodate before the argument is lost to traffic problems, parking problems, and road rage. This is where the true challenge for a planner or designer of the city lies; currently, Chandigarh offers no clear answers to this problem. The way forward is to look at the spatial mapping, planning, and design of car parking units to address the problem, rather than suggesting extreme measures such as banning cars (short-term) or promoting plans for citywide transport (very long-term). This is a chance to resolve the problem with a pragmatic approach from a citizen's perspective instead of an orthodox development planner's methodology. Since citizens are at the center of how the problem is addressed, acceptable solutions are more likely to emerge from the car and traffic problem as defined by the citizens, and the idea and its implementation would be interesting in comparison with established academic methodologies. This novel, innovative process should lead to a more acceptable and sustainable approach to the number of car parks in the neighborhoods of Chandigarh.

Keywords: cars, Chandigarh, neighborhood, sustainable living, walkability

Procedia PDF Downloads 126
645 Mapping the Technological Interventions to the National Action Plan for Marine Litter Management 2018-2025: Addressing the Marine Plastic Litter at the Marine Tourism Destinations in Indonesia

Authors: Kaisar Akhir, Azhar Slamet

Abstract:

This study aims to provide recommendations for sustainably addressing marine plastic litter at marine tourism destinations in Indonesia through technological interventions, in the framework of the National Action Plan for Marine Litter Management 2018-2025. In Indonesia, marine tourism is a rapidly growing economic sector; however, marine tourism destinations face a global challenge in the form of marine plastic litter. Marine plastic litter threatens these destinations through its potential impacts on marine environmental sustainability, the health of tourists and local communities, and tourism business income. In 2018, the Indonesian government passed and promulgated the National Action Plan for Marine Litter Management 2018-2025. This national action plan consists of three key aspects of intervention (societal effort, technological application, and institutional coordination) and five strategies for addressing marine litter in Indonesia, in particular to address 70% of marine plastic litter by 2025. The strategies are: 1) a national movement to raise stakeholder awareness, 2) land-based litter management, 3) litter management at sea and on the coasts, 4) funding mechanisms, institutional strengthening, monitoring, and law enforcement, and 5) research and development. In this study, technological interventions around the world and in Indonesia are reviewed and analyzed for their relevance to the national action plan on the basis of five criteria. As a result, twelve kinds of technological interventions are recommended for implementation to address marine plastic litter at marine tourism destinations in Indonesia.

Keywords: marine litter management, marine plastic litter, national action plan, ocean sustainability, ocean tourism destination, technological interventions

Procedia PDF Downloads 149
644 Integration of GIS with Remote Sensing and GPS for Disaster Mitigation

Authors: Sikander Nawaz Khan

Abstract:

Natural disasters such as floods, earthquakes, cyclones, and volcanic eruptions cause immense losses of property and lives every year. The current status of natural hazards and actual loss information can be determined, and predictions about probable future disasters can be made, using remote sensing and mapping technologies. The Global Positioning System (GPS) calculates the exact position of damage and can also communicate with wireless sensor nodes embedded in potentially dangerous places. GPS provides emergency responders with precise, accurate locations and related information such as the speed, track, direction, and distance of a target object. Remote sensing makes it possible to map damage without physical contact with the target area, and with the addition of more remote sensing satellites and other advances, early warning systems now operate very efficiently. Remote sensing is used at both local and global scales: High Resolution Satellite Imagery (HRSI), airborne remote sensing, and space-borne remote sensing all play a vital role in disaster management. Early on, Geographic Information Systems (GIS) were used to collect, arrange, and map spatial information, but GIS now has the capability to analyze spatial data; this analytical ability is the main reason for its adoption by emergency service providers such as police and ambulance services. The full potential of these so-called 3S technologies cannot be realized by any one of them alone; the integration of GPS and other remote sensing techniques with GIS has opened new horizons in the modeling of earth-science activities. Several remote sensing cases, including the Indian Ocean tsunami in 2004, the Mount Mangart landslides, and the Pakistan-India earthquake in 2005, are described in this paper.

Keywords: disaster mitigation, GIS, GPS, remote sensing

Procedia PDF Downloads 450
643 Highly Accurate Target Motion Compensation Using Entropy Function Minimization

Authors: Amin Aghatabar Roodbary, Mohammad Hassan Bastani

Abstract:

One of the defects of stepped-frequency radar systems is their sensitivity to target motion. In such systems, target motion causes range-cell shift, false peaks, Signal-to-Noise Ratio (SNR) reduction and range-profile spreading, because the power spectrum of each range cell interferes with adjacent range cells; this distorts the High Resolution Range Profile (HRRP) and disrupts the target recognition process. Compensation for the effects of the Target Motion Parameters (TMPs) should therefore be employed. In this paper, a method is proposed for estimating the TMPs (velocity and acceleration) and consequently eliminating or suppressing their unwanted effects on the HRRP, based on entropy minimization. The method is carried out in two major steps. In the first step, a discrete search is performed over the whole acceleration-velocity lattice, within a specific interval, to find an approximate minimum point of the entropy function. In the second step, a 1-D search over velocity is carried out in the neighborhood of this minimum, along several constant-acceleration lines, in order to enhance the accuracy of the minimum point found in the first step. The simulation results provided demonstrate the effectiveness of the proposed method.
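The two-step search can be sketched generically as a coarse lattice sweep followed by a dense 1-D velocity refinement. The quadratic entropy surface below is a hypothetical stand-in for the HRRP entropy function, and the grid spacings are illustrative:

```python
import numpy as np

def two_step_search(entropy, v_grid, a_grid, v_fine):
    """Step 1: coarse search over the full acceleration-velocity
    lattice for an approximate entropy minimum.  Step 2: dense 1-D
    velocity sweep along constant-acceleration lines near it."""
    # Step 1: coarse lattice search
    coarse = [(entropy(v, a), v, a) for v in v_grid for a in a_grid]
    _, v0, a0 = min(coarse)
    # Step 2: refine over velocity for accelerations within one
    # coarse step of the approximate minimum
    step = a_grid[1] - a_grid[0]
    near = [a for a in a_grid if abs(a - a0) <= step]
    fine = [(entropy(v, a), v, a) for a in near for v in v_fine]
    _, v_hat, a_hat = min(fine)
    return v_hat, a_hat

# Stand-in entropy surface with its minimum at v=3.2, a=0.5 (hypothetical):
E = lambda v, a: (v - 3.2) ** 2 + (a - 0.5) ** 2
v_hat, a_hat = two_step_search(E, np.arange(0, 10, 1.0),
                               np.arange(-2, 2, 0.5),
                               np.arange(0, 10, 0.05))
```

The coarse step bounds how far the true minimum can lie from the step-1 estimate, which is what makes the cheap 1-D refinement in step 2 sufficient.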

Keywords: automatic target recognition (ATR), high resolution range profile (HRRP), motion compensation, stepped frequency waveform technique (SFW), target motion parameters (TMPs)

Procedia PDF Downloads 137
642 Optimized Brain Computer Interface System for Unspoken Speech Recognition: Role of Wernicke Area

Authors: Nassib Abdallah, Pierre Chauvet, Abd El Salam Hajjar, Bassam Daya

Abstract:

In this paper, we propose an optimized brain-computer interface (BCI) system for unspoken speech recognition, based on the fact that the construction of unspoken words relies strongly on the Wernicke area, situated in the temporal lobe. Our BCI system has four modules: (i) the EEG acquisition module, based on a non-invasive headset with 14 electrodes; (ii) the preprocessing module, which removes noise and artifacts using the Common Average Reference method; (iii) the feature extraction module, using the Wavelet Packet Transform (WPT); and (iv) the classification module, based on a one-hidden-layer artificial neural network. The present study compares the recognition accuracy for 5 Arabic words when using all the headset electrodes or only the 4 electrodes situated near the Wernicke area, as well as the effect of selecting among the subbands produced by the WPT module. After applying the artificial neural network to the produced database, we obtain, on the test dataset, an accuracy of 83.4% with all the electrodes and all the subbands of the 8-level WPT decomposition. However, by using only the 4 electrodes near the Wernicke area and the 6 middle subbands of the WPT, we obtain a large reduction of the dataset size, to approximately 19% of the total dataset, with an accuracy rate of 67.5%. This reduction appears particularly important for the design of a low-cost, simple-to-use BCI trained for several words.
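The wavelet-packet decomposition behind the feature extraction module can be sketched with the simplest (Haar) filter pair; the signal, the level count, and the choice of "middle" subbands below are illustrative, not the authors' configuration:

```python
import numpy as np

def haar_wpt(signal, levels):
    """Full wavelet-packet decomposition with the Haar filter pair:
    at each level every subband is split into an approximation
    (low-pass) and a detail (high-pass) half, giving 2**levels
    subbands in total."""
    bands = [np.asarray(signal, dtype=float)]
    for _ in range(levels):
        nxt = []
        for b in bands:
            lo = (b[0::2] + b[1::2]) / np.sqrt(2)   # approximation
            hi = (b[0::2] - b[1::2]) / np.sqrt(2)   # detail
            nxt += [lo, hi]
        bands = nxt
    return bands

# A toy EEG epoch of 256 samples, 3-level WPT -> 8 subbands
x = np.sin(np.linspace(0, 8 * np.pi, 256))
bands = haar_wpt(x, 3)
# Keeping only a middle band of subbands shrinks the feature vector,
# analogous to the subband selection described above:
middle = bands[2:6]
```

Because the Haar pair is orthonormal, the total energy of the subbands equals that of the input, so discarding subbands discards a quantifiable share of signal energy.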

Keywords: brain-computer interface, speech recognition, artificial neural network, electroencephalography, EEG, wernicke area

Procedia PDF Downloads 255
641 Single Atom Manipulation with a Four-Scanner Scanning Tunneling Microscope Technique

Authors: Jianshu Yang, Delphine Sordes, Marek Kolmer, Christian Joachim

Abstract:

Nanoelectronic devices, for example calculating circuits integrating molecule-scale logic gates and atomic-scale circuits, have recently been constructed and investigated. A major challenge is the characterization of their functional properties, because of the problem of connecting from the atomic scale to the micrometer scale. New experimental instruments and new processes have therefore been proposed, and instrument development continues in order to combine precise measurement at the atomic scale with a micrometer-scale electrical integration controller. Our new machine, a low-temperature ultra-high-vacuum four-scanner scanning tunneling microscope, custom-built by Omicron GmbH, is expected to extend such characterization down to the atomic scale. Here, we present our first results on the performance of this new instrument. The sample we selected is the Au(111) surface, and the measurements were taken at 4.2 K. The atomic-resolution surface structure was observed with each of the four scanners, with a noise level better than 3 pm. With a tip-sample distance calibration by I-z spectra, the sample conductance was derived from local atomic-scale I-V spectra. Furthermore, the surface conductance was measured using two methods: (1) by landing two STM tips on the surface with the sample floating; and (2) with the sample floating and one of the landed tips switched to ground. In addition, single-atom manipulation was achieved with a modified tip design, with performance comparable to a conventional LT-STM.

Keywords: low temperature ultra-high vacuum four scanning tunneling microscope, nanoelectronics, point contact, single atom manipulation, tunneling resistance

Procedia PDF Downloads 263
640 Spatial and Geostatistical Analysis of Surficial Soils of the Contiguous United States

Authors: Rachel Hetherington, Chad Deering, Ann Maclean, Snehamoy Chatterjee

Abstract:

The U.S. Geological Survey conducted a soil survey, with subsequent mineralogical and geochemical analyses, of over 4800 samples taken across the contiguous United States between 2007 and 2013. At each location, samples were taken from the top 5 cm, the A-horizon, and the C-horizon. Many studies have examined the correlation between the mineralogical and geochemical content of soils and influencing factors such as parent lithology, climate, soil type, and age, but little seems to have been done to quantify and assess the correlations between elements in the soil on a national scale. GIS was used for the mapping and multivariate interpolation of over 40 major and trace elements for surficial soils (0-5 cm depth). Qualitative analysis of the spatial distribution across the U.S. shows distinct patterns among elements both within the same periodic groups and within different periodic groups, and therefore with different behavioral characteristics. The results show four main patterns of high-concentration areas: vertically along the west coast, a C-shape through the states around Utah and northern Arizona, a V-shape through the Midwest connecting to the Appalachians, and along the Appalachians. The Band Collection Statistics tool in GIS was used to analyze the geochemical raster datasets quantitatively and to calculate a correlation matrix. Patterns emerged that were not identified in the qualitative analysis, many of them among elements with very different characteristics. Preliminary results show 41 element pairings with a strong positive correlation (≥ 0.75). Both qualitative and quantitative analyses at this scale could increase knowledge of the relationships between element distribution and behavior in the surficial soils of the U.S.
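The band-statistics step — computing a correlation matrix across element layers and flagging pairs with r ≥ 0.75 — can be sketched on synthetic concentration columns; the element names, values, and correlation structure below are hypothetical, not the survey's data:

```python
import numpy as np

# Hypothetical element concentrations (one row per element, one
# column per sample site), standing in for the raster band values:
rng = np.random.default_rng(0)
fe = rng.normal(4.0, 1.0, 200)            # wt% Fe
al = 0.8 * fe + rng.normal(0, 0.3, 200)   # Al made to correlate with Fe
pb = rng.normal(20.0, 5.0, 200)           # ppm Pb, independent

data = np.vstack([fe, al, pb])
corr = np.corrcoef(data)                  # 3x3 correlation matrix

# Flag strongly positively correlated pairs (r >= 0.75), as in the study:
strong = [(i, j) for i in range(3) for j in range(i + 1, 3)
          if corr[i, j] >= 0.75]
```

With real rasters the rows would be the interpolated element surfaces flattened over a common grid, but the correlation arithmetic is identical.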

Keywords: correlation matrix, geochemical analyses, spatial distribution of elements, surficial soils

Procedia PDF Downloads 111
639 Kinoform Optimisation Using Gerchberg-Saxton Iterative Algorithm

Authors: M. Al-Shamery, R. Young, P. Birch, C. Chatwin

Abstract:

Computer Generated Holography (CGH) is employed to create digitally defined coherent wavefronts. A CGH can be created using different techniques, such as the detour-phase technique or direct phase modulation to create a kinoform. The detour-phase technique was one of the first techniques used to generate holograms digitally. Its disadvantage is that the reconstructed image often has poor quality, due to the limited dynamic range it is possible to record using a medium with reasonable spatial resolution. The kinoform (phase-only hologram) is an alternative technique: the phase of the original wavefront is recorded, but the amplitude is constrained to be constant. The original object does not need to exist physically, so the kinoform can be used to reconstruct an almost arbitrary wavefront. However, the image reconstructed by this technique contains high levels of noise and is not identical to the reference image. To improve the reconstruction quality of the kinoform, iterative techniques such as the Gerchberg-Saxton (GS) algorithm are employed. In this paper, the GS algorithm is described for the optimisation of a kinoform used for the reconstruction of a complex wavefront. Iterations of the GS algorithm are applied to determine the phase at a plane (with a known amplitude distribution, often taken as uniform) that satisfies given phase and amplitude constraints in a corresponding Fourier plane. The GS algorithm can be used in this way to enhance the reconstruction quality of the kinoform. Different images are employed as the reference object, and their kinoforms are synthesised using the GS algorithm. The quality of the reconstructed images is quantified to demonstrate the enhanced reconstruction quality achieved by this method.
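A minimal sketch of the GS iteration for a kinoform, assuming a uniform source amplitude and a toy target image (not the images used in the paper): each pass enforces the target amplitude in the Fourier plane and the phase-only constraint in the hologram plane.

```python
import numpy as np

def gerchberg_saxton(target_amp, iterations=50):
    """Iteratively retrieve a kinoform (phase-only hologram) whose
    Fourier transform reproduces the target amplitude pattern."""
    rng = np.random.default_rng(1)
    phase = rng.uniform(0, 2 * np.pi, target_amp.shape)
    field = np.exp(1j * phase)              # uniform-amplitude start
    for _ in range(iterations):
        far = np.fft.fft2(field)
        # Impose the target amplitude, keep the computed phase
        far = target_amp * np.exp(1j * np.angle(far))
        near = np.fft.ifft2(far)
        # Impose the uniform-amplitude (phase-only) constraint
        field = np.exp(1j * np.angle(near))
    return np.angle(field)                  # the kinoform phase map

# Toy target: a bright square on a dark background (hypothetical image)
target = np.zeros((64, 64))
target[24:40, 24:40] = 1.0
kinoform = gerchberg_saxton(target)
recon = np.abs(np.fft.fft2(np.exp(1j * kinoform)))
```

In practice one would quantify `recon` against the target (e.g. correlation or diffraction efficiency) after each iteration, which is exactly the reconstruction-quality metric the abstract refers to.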

Keywords: computer generated holography, digital holography, Gerchberg-Saxton algorithm, kinoform

Procedia PDF Downloads 508
638 Continuity of Place-Identity: Identifying Regional Components of Kerala Architecture through 1805-1950

Authors: Manoj K. Kumar, Deepthi Bathala

Abstract:

Man needs to know and feel himself part of a historical continuum, and it is this continuum that reinforces his identity. Architecture and the built environment contribute to this identity, as established by the various identity theories exploring the relationship between the two. Organic architecture was successful in maintaining a continuum of identity until the advent of globalization, when the world saw a drastic shift toward an architecture of 'placelessness'. The quest for a perfect synthesis of 'universalization' and 'regionalism' is ongoing; history, however, records a smooth transition from vernacular to colonial to modern, unlike the architecture of today. Traditional Kerala architecture evolved from the tropical climate, geography, local needs, materials, skills and foreign influences. As a result of geographical barriers, it stands in contrast to the architecture of the neighboring states, although it was influenced by the architecture of the Orient through trade relations. From 1805 to 1950, the European influence on the architecture of Kerala resulted in the emergence of a colonial style that managed to establish a continuum with the traditional architecture. This paper focuses on identifying the components of architecture that established the continuity of place-identity in the architecture of Kerala, and examines the transition from traditional Kerala architecture to colonial architecture during the colonial period. The research tools used are visual surveys based on the principles of urban design, cognitive mapping, and typology analysis, supported by a strong understanding of the morphological and built environment, together with the matrix method. Understanding these components of continuity can be useful in creating buildings that people can relate to in the present day. South Asia shares a history of colonialism, and understanding these components can pave the way for further research on how to establish a regional identity in the era of globalization.

Keywords: colonial, identity, place, regional

Procedia PDF Downloads 390
637 Recreating Old Gardens, a Dynamic and Sustainable Design Pattern for Urban Green Spaces, Case Study: Persian Garden

Authors: Mina Sarabi, Dariush Sattarzadeh, Mitra Asadollahi Oula

Abstract:

Historically, gardens reflected the identity and culture of each country. The Persian garden holds a high position in urban planning and architecture and is regarded as a kind of paradise in Iranian thought. Nowadays, however, gardens have been replaced with parks and generic urban open spaces, while the industrial development of cities and increasing air pollution make urban environments ever less livable, so that the need to improve ecological conditions is felt more than ever. The purposes of this study are to identify and reproduce the pattern of the Persian garden, to adapt it to the sustainability features of green spaces in contemporary cities, and to develop meaningful green spaces instead of designing aimless ones in the urban environment. The research method in this article is analytical and descriptive: information about the Persian garden pattern was collected from library documents and articles, and case studies were analyzed. The results reveal that the Persian garden was the main bond between man and nature, but that in the last century this relationship has been in trouble. The garden also has a significant impact in reducing the adverse effects of urban air pollution, noise, and other stresses. Recreating the pattern of Persian gardens in today's urban green spaces would not only preserve Iranian identity for future generations but, by applying the principles of sustainability, could also play an important role in the sustainable development and spatial quality of a city.

Keywords: green open spaces, nature, Persian garden, urban sustainability

Procedia PDF Downloads 225
636 A Method for Evaluating the Mechanical Stress on Mandibular Advancement Devices

Authors: Tsung-yin Lin, Yi-yu Lee, Ching-hua Hung

Abstract:

Snoring, the lay term for obstructive breathing during sleep, is one of the most prevalent of obnoxious human habits. Loud snoring usually disturbs others, and it affects the sleep quality of snorers' bed partners, who find it difficult to fall asleep because of the noise. The reduced sleep quality caused by snoring leads to several medical problems, such as excessive daytime sleepiness, high blood pressure, and increased risk of cardiovascular disease and cerebral vascular accident. There are many non-prescription devices offered for sale on the market, but very limited data are available to support a beneficial effect of these devices on snoring or their use in treating obstructive sleep apnea (OSA). Mandibular advancement devices (MADs), also termed mandibular repositioning devices (MRDs), are removable devices worn at night during sleep. Most devices require a dental impression, bite registration, and fabrication by a dental laboratory; they are fixed to the upper and lower teeth and adjusted to advance the mandible. The amount of protrusion is adjusted to meet the therapeutic requirements, comfort, and tolerance; many devices have a fixed degree of advancement, while some are adjustable to a limited degree. This study focuses on the stress analysis of MADs, which are promoted as a standard treatment for snoring by the American Academy of Sleep Medicine (AASM). This paper proposes a new MAD design and introduces finite element analysis (FEA) to carry out the stress simulation for this MAD.

Keywords: finite element analysis, mandibular advancement devices, mechanical stress, snoring

Procedia PDF Downloads 346
635 Wind Power Mapping and NPV of Embedded Generation Systems in Nigeria

Authors: Oluseyi O. Ajayi, Ohiose D. Ohijeagbon, Mercy Ogbonnaya, Ameh Attabo

Abstract:

This study assessed the potential and economic viability of stand-alone wind systems for embedded generation, taking into account their benefits to small off-grid rural communities, at 40 meteorological sites in Nigeria. A specific electric load profile was developed for communities consisting of 200 homes, a school and a community health centre. This load profile was incorporated within the distributed-generation analysis, producing energy in the MW range while optimally meeting the daily load demand of the rural communities. Twenty-four years (1987 to 2010) of wind speed data at a height of 10 m, sourced from the Nigerian Meteorological Department, Oshodi, were utilized for the study. The HOMER® software optimization tool was used for the feasibility study and design. Each site was assigned 3 MW wind turbines in sets of five, so that 15 MW was designed for each site. This design configuration was adopted in order to compare the distributed-generation systems across sites and determine their relative economic viability in terms of life-cycle cost, as well as the levelised cost of energy. A net present value in terms of life-cycle cost was estimated for 25 of the 40 meteorological sites; the remaining sites yielded a net present cost, meaning that installations at these locations are not economically viable under the present tariff regime for embedded generation in Nigeria.
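The viability test — a positive net present value over the project's discounted life-cycle cash flows — can be sketched as follows. The capital cost, yearly net revenue, discount rate, and lifetime below are illustrative assumptions, not the study's figures:

```python
def npv(rate, cashflows):
    """Discount a list of yearly cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical 15 MW site: capital cost up front, then yearly
# revenue from energy sales minus O&M, over a 25-year lifetime.
capital = -30_000_000      # USD, assumed turbine + installation cost
yearly_net = 2_500_000     # USD/yr, assumed revenue minus O&M
flows = [capital] + [yearly_net] * 25
value = npv(0.06, flows)   # 6% discount rate (assumption)
viable = value > 0         # NPV > 0 -> economically viable site
```

A site whose tariff-driven revenues cannot overcome the discounted capital and O&M costs ends up with a net present cost (negative NPV), which is the distinction the abstract draws between the 25 viable and 15 non-viable sites.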

Keywords: wind speed, wind power, distributed generation, cost per kilowatt-hour, clean energy, Nigeria

Procedia PDF Downloads 383
634 Genetic Characterization of a Composite Transposon Carrying armA and Aac(6)-Ib Genes in an Escherichia coli Isolate from Egypt

Authors: Omneya M. Helmy, Mona T. Kashef

Abstract:

Aminoglycosides are used in treating a wide range of infections caused by both Gram-negative and Gram-positive bacteria. The presence of 16S rRNA methyltransferases (16S-RMTases) is among the newly discovered resistance mechanisms that confer high resistance to clinically useful aminoglycosides. Cephalosporins are the most commonly used antimicrobials in Egypt; this study was therefore conducted to determine the isolation frequency of 16S-RMTases among third-generation cephalosporin-resistant clinical isolates in Egypt. One hundred and twenty-three cephalosporin-resistant Gram-negative clinical isolates were screened for aminoglycoside resistance by the Kirby-Bauer disk diffusion method and tested for possible production of 16S-RMTases. PCR testing and sequencing were used to confirm the presence of 16S-RMTases and the associated antimicrobial resistance determinants, as well as the genetic region surrounding the armA gene. Out of the 123 isolates, 66 (53.66%) were resistant to at least one aminoglycoside antibiotic. Only one Escherichia coli isolate (E9ECMO), which was resistant to all tested aminoglycosides, was confirmed to carry the armA gene, in association with the blaTEM-1, blaCTX-M-15, blaCTX-M-14 and aac(6)-Ib genes. The armA gene was found to be carried on a large A/C plasmid. Genetic mapping of the region surrounding armA revealed, for the first time, the association of armA with aac(6)-Ib on the same transposon. In conclusion, the isolation frequency of 16S-RMTases was low among the tested cephalosporin-resistant clinical samples; however, a novel composite transposon conferring high-level aminoglycoside resistance was detected.

Keywords: aminoglycosides, armA gene, β-lactamases, 16S rRNA methyltransferases

Procedia PDF Downloads 264
633 Pilot-Assisted Direct-Current Biased Optical Orthogonal Frequency Division Multiplexing Visible Light Communication System

Authors: Ayad A. Abdulkafi, Shahir F. Nawaf, Mohammed K. Hussein, Ibrahim K. Sileh, Fouad A. Abdulkafi

Abstract:

Visible light communication (VLC) is a new approach to optical wireless communication proposed to relieve the congested radio frequency (RF) spectrum. VLC systems are combined with orthogonal frequency division multiplexing (OFDM) to achieve high-rate transmission and high spectral efficiency. In this paper, we investigate pilot-assisted channel estimation for DC-biased optical OFDM (PACE-DCO-OFDM) systems to reduce the effects of distortion on the transmitted signal. Least-squares (LS) and linear minimum mean-squared error (LMMSE) estimators are implemented in MATLAB/Simulink to improve the bit-error rate (BER) of PACE-DCO-OFDM. Simulation results show that the proposed PACE-DCO-OFDM based on the LMMSE algorithm estimates the channel more accurately and achieves better BER performance than both the LS-based PACE-DCO-OFDM and the conventional system without pilot-assisted channel estimation. At the same signal-to-noise ratio (SNR) of 25 dB, the achieved BER is about 5×10⁻⁴ for LMMSE-PACE and 4.2×10⁻³ for LS-PACE, while it is about 2×10⁻¹ for the system without the PACE scheme.
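The LS baseline the abstract compares against can be sketched in a few lines: estimate the channel at the known pilot subcarriers by dividing received by transmitted pilots, then interpolate to the data subcarriers. All parameters below (64 subcarriers, a pilot on every 8th, the toy 3-tap channel) are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters: 64 subcarriers, a pilot on every 8th.
n_sc = 64
pilot_idx = np.arange(0, n_sc, 8)
pilot_sym = np.ones(pilot_idx.size)              # known (unit) pilot symbols

# Toy 3-tap frequency-selective channel; received pilots with additive noise.
h_true = np.fft.fft(np.array([1.0, 0.5, 0.2]), n_sc)
y_pilot = h_true[pilot_idx] * pilot_sym + 0.01 * rng.standard_normal(pilot_idx.size)

# Least-squares estimate at the pilot positions: H_LS = Y / X.
h_ls = y_pilot / pilot_sym

# Interpolate real and imaginary parts onto all subcarriers (circularly,
# since the DFT channel response is periodic in the subcarrier index).
k = np.arange(n_sc)
h_hat = (np.interp(k, pilot_idx, h_ls.real, period=n_sc)
         + 1j * np.interp(k, pilot_idx, h_ls.imag, period=n_sc))

mse = float(np.mean(np.abs(h_hat - h_true) ** 2))
print(h_hat.shape, mse < 0.05)
```

An LMMSE estimator would additionally weight the pilot observations by the channel correlation and noise statistics, which is what buys the BER gain reported above.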

Keywords: channel estimation, OFDM, pilot-assist, VLC

Procedia PDF Downloads 161
632 Combining Diffusion Maps and Diffusion Models for Enhanced Data Analysis

Authors: Meng Su

Abstract:

High-dimensional data analysis often presents challenges in capturing the complex, nonlinear relationships and manifold structures inherent to the data. This article presents a novel approach that leverages the strengths of two powerful techniques, Diffusion Maps and Diffusion Probabilistic Models (DPMs), to address these challenges. By integrating the dimensionality reduction capability of Diffusion Maps with the data modeling ability of DPMs, the proposed method aims to provide a comprehensive solution for analyzing and generating high-dimensional data. The Diffusion Map technique preserves the nonlinear relationships and manifold structure of the data by mapping it to a lower-dimensional space using the eigenvectors of the graph Laplacian matrix. Meanwhile, DPMs capture the dependencies within the data, enabling effective modeling and generation of new data points in the low-dimensional space. The generated data points can then be mapped back to the original high-dimensional space, ensuring consistency with the underlying manifold structure. Through a detailed example implementation, the article demonstrates the potential of the proposed hybrid approach to achieve more accurate and effective modeling and generation of complex, high-dimensional data. Furthermore, it discusses possible applications in various domains, such as image synthesis, time-series forecasting, and anomaly detection, and outlines future research directions for enhancing the scalability, performance, and integration with other machine learning techniques. By combining the strengths of Diffusion Maps and DPMs, this work paves the way for more advanced and robust data analysis methods.
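A minimal diffusion-map embedding as described (Gaussian kernel, row-normalization into a Markov matrix, leading non-trivial eigenvectors as coordinates) might look like the sketch below; the kernel bandwidth and the toy circle data are assumptions for illustration, not the article's example.

```python
import numpy as np

def diffusion_map(X, eps=1.0, dim=2, t=1):
    """Minimal diffusion-map embedding (a sketch, not the article's code)."""
    # Pairwise squared distances and Gaussian kernel.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / eps)
    # Row-normalize into a Markov transition matrix P.
    P = K / K.sum(axis=1, keepdims=True)
    # Eigendecomposition; the top eigenvector of P is trivial (constant).
    w, v = np.linalg.eig(P)
    order = np.argsort(-w.real)
    w, v = w.real[order], v.real[:, order]
    # Diffusion coordinates: lambda_k^t * psi_k, skipping the trivial pair.
    return (w[1:dim + 1] ** t) * v[:, 1:dim + 1]

# Toy data: noisy points on a circle (a 1-D manifold embedded in 2-D).
theta = np.linspace(0, 2 * np.pi, 40, endpoint=False)
X = (np.c_[np.cos(theta), np.sin(theta)]
     + 0.01 * np.random.default_rng(1).standard_normal((40, 2)))
Y = diffusion_map(X, eps=0.5, dim=2)
print(Y.shape)
```

In the hybrid scheme, a DPM would then be trained on coordinates like `Y`, and generated samples lifted back to the ambient space.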

Keywords: diffusion maps, diffusion probabilistic models (DPMs), manifold learning, high-dimensional data analysis

Procedia PDF Downloads 80
631 Development of a Few-View Computed Tomographic Reconstruction Algorithm Using Multi-Directional Total Variation

Authors: Chia Jui Hsieh, Jyh Cheng Chen, Chih Wei Kuo, Ruei Teng Wang, Woei Chyn Chu

Abstract:

Compressed sensing (CS) based computed tomographic (CT) reconstruction utilizes total variation (TV) to transform the CT image into a sparse domain and minimizes the L1-norm of the sparse image for reconstruction. Different from traditional CS-based reconstruction, which calculates only x-coordinate and y-coordinate TV to transform CT images into the sparse domain, we propose a multi-directional TV to transform the tomographic image into the sparse domain for low-dose reconstruction. Our method considers all possible directions of TV calculation around a pixel, so the sparse transform for CS-based reconstruction is more accurate. In 2D CT reconstruction, we use eight-directional TV to transform the CT image into the sparse domain; for 3D reconstruction, we use 26-directional TV. This multi-directional sparse transform makes the CS-based reconstruction algorithm more effective at reducing noise and improving image quality. To validate and evaluate its performance, we used both the Shepp-Logan phantom and a head phantom as reconstruction targets, with corresponding simulated sparse projection data (angular sampling intervals of 5 deg and 6 deg, respectively). The results show that the multi-directional TV method reconstructs images with fewer artifacts than the traditional CS-based reconstruction algorithm that calculates only x-coordinate and y-coordinate TV. We also chose RMSE, PSNR, and UQI as metrics for quantitative analysis; on every metric, the proposed multi-directional TV method performs better.
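The eight-directional TV described above can be sketched by summing absolute differences between each pixel and its eight neighbours; the boundary handling (edge padding) and the absence of direction weighting below are assumptions of this sketch.

```python
import numpy as np

def tv8(img):
    """Eight-directional total variation: sum of absolute differences
    between each pixel and its eight neighbours (edge-padded boundary)."""
    h, w = img.shape
    p = np.pad(img, 1, mode='edge')
    shifts = [(-1, -1), (-1, 0), (-1, 1),
              (0, -1),           (0, 1),
              (1, -1),  (1, 0),  (1, 1)]
    tv = 0.0
    for dy, dx in shifts:
        shifted = p[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
        tv += np.abs(img - shifted).sum()
    return tv

flat = np.zeros((8, 8))
edge = np.zeros((8, 8)); edge[:, 4:] = 1.0   # a single vertical edge
print(tv8(flat), tv8(edge) > 0)
```

In the reconstruction loop, a term like `tv8` (and its 26-neighbour 3D analogue) replaces the usual two-directional TV penalty being minimized.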

Keywords: compressed sensing (CS), low-dose CT reconstruction, total variation (TV), multi-directional gradient operator

Procedia PDF Downloads 240
630 Setting the Baseline for a Sentinel System for the Identification of Occupational Risk Factors in Africa

Authors: Menouni Aziza, Chbihi Kaoutar, Duca Radu Corneliu, Gilissen Liesbeth, Bounou Salim, Godderis Lode, El Jaafari Samir

Abstract:

In Africa, environmental and occupational health risks are largely underreported. The aim of this research is to develop and implement a sentinel surveillance system built on the training and guidance of occupational physicians (OCs), who will report new work-related diseases in African countries. A group of 30 OCs is recruited and trained in each of the partner countries (Morocco, Benin and Ethiopia). Each committed OC is asked to recruit 50 workers during consultations within a time frame of 6 months (1,500 workers per country). Workers are asked to fill out an online questionnaire about their health status and work conditions, including exposure to 20 chemicals. Urine and blood samples are then collected for human biomonitoring of common exposures. Preliminary results showed that 92% of the employees surveyed are exposed to physical constraints, 44% to chemical agents, and 24% to biological agents. The most common physical constraints are manual handling of loads, noise pollution and thermal pollution. The most frequent chemical risks are exposure to pesticides and fuels. This project will allow a better understanding of effective sentinel systems as a promising method for gathering high-quality data that can support policy-making on preventing emerging work-related diseases.

Keywords: sentinel system, occupational diseases, human biomonitoring, Africa

Procedia PDF Downloads 64
629 Mitigation of Interference in Satellite Communications Systems via a Cross-Layer Coding Technique

Authors: Mario A. Blanco, Nicholas Burkhardt

Abstract:

An important problem in satellite communication systems operating in the Ka and EHF frequency bands is the overall degradation in link performance of mobile terminals due to various impairments of the link/channel, such as fading, blockage of the link to the satellite (especially in urban environments), and intentional as well as other types of interference. In this paper, we focus primarily on the interference problem, and we develop an efficient and cost-effective solution based on the use of fountain codes. We first introduce a satellite communications (SATCOM) terminal uplink interference channel model of the kind classically used against communication systems that employ spread-spectrum waveforms. We then consider the use of fountain codes, with a focus on Raptor codes, as our main technique to mitigate the degradation in link/receiver performance due to the interference signal. Receiver performance is obtained in terms of average bit and message error rates as functions of the bit energy-to-noise density ratio, Eb/N0, and other parameters of interest, via a combination of analysis and computer simulation. We show that fountain codes are extremely effective in overcoming the effects of intentional interference on the performance of the receiver and the associated communication links. We then show that this technique can be extended to mitigate other types of SATCOM channel degradation, such as those caused by channel fading, shadowing, and hard blockage of the uplink signal.
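Fountain codes are rateless: the transmitter emits as many encoded packets as needed, each the XOR of a random subset of source blocks, and the receiver simply collects packets until a peeling decoder completes. The toy LT-style sketch below illustrates that principle only; the uniform degree distribution is a naive assumption (practical LT codes use a Soliton distribution), and Raptor codes add an outer precode that is omitted here.

```python
import random

def lt_encode(blocks, n_packets, seed=0):
    """Toy LT-style fountain encoder: each packet is the XOR of a random
    subset of source blocks (naive uniform degree distribution)."""
    rng = random.Random(seed)
    k = len(blocks)
    packets = []
    for _ in range(n_packets):
        idx = rng.sample(range(k), rng.randint(1, k))
        val = 0
        for i in idx:
            val ^= blocks[i]
        packets.append((frozenset(idx), val))
    return packets

def lt_decode(packets, k):
    """Peeling decoder: reduce each packet by already-known blocks; a packet
    with exactly one unknown block reveals that block's value."""
    out = [None] * k
    changed = True
    while changed and None in out:
        changed = False
        for idx, val in packets:
            unknown = [i for i in idx if out[i] is None]
            if len(unknown) == 1:
                for i in idx:
                    if out[i] is not None:
                        val ^= out[i]
                out[unknown[0]] = val
                changed = True
    return out

# Rateless property: keep collecting packets until decoding completes.
src = [0x12, 0x34, 0x56, 0x78]
packets, seed = [], 0
decoded = lt_decode(packets, len(src))
while None in decoded:
    packets += lt_encode(src, 4, seed=seed)
    seed += 1
    decoded = lt_decode(packets, len(src))
print(decoded == src)
```

Under interference, corrupted packets are simply discarded and replaced by later ones, which is exactly why the rateless property helps on a jammed uplink.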

Keywords: SATCOM, interference mitigation, fountain codes, turbo codes, cross-layer

Procedia PDF Downloads 341
628 Digital Twin Smart Hospital: A Guide for Implementation and Improvements

Authors: Enido Fabiano de Ramos, Ieda Kanashiro Makiya, Francisco I. Giocondo Cesar

Abstract:

This study investigates the application of Digital Twins (DT) in Smart Hospital Environments (SHE) through a bibliometric study and literature review, including a comparison with the principles of Industry 4.0. It analyzes the current state of digital twin implementation in clinical and non-clinical healthcare operations, identifying trends and challenges and comparing these practices with Industry 4.0 concepts and technologies, in order to present a basic framework with stages and maturity levels. The bibliometric methodology maps the existing scientific production on the theme, while the literature review synthesizes and critically analyzes the relevant studies, highlighting pertinent methodologies and results. Additionally, the comparison with Industry 4.0 provides insights into how the principles of automation, interconnectivity and digitalization can be applied to healthcare environments and operations, aiming at improvements in operational efficiency and quality of care. The results of this study contribute to a deeper understanding of the potential of Digital Twins in Smart Hospitals, and of the future potential of effectively integrating Industry 4.0 concepts in this specific environment, presented through the practical framework. The urgent need for the changes addressed in this article is undeniable, as is their contribution to human sustainability as framed by SDG 3 (Health and well-being): ensuring that all citizens have a healthy life and well-being, at all ages and in all situations. The validity of these relationships will be constantly discussed, and technology can always change the rules of the game.

Keywords: digital twin, smart hospital, healthcare operations, industry 4.0, SDG3, technology

Procedia PDF Downloads 36
627 Using Open Source Data and GIS Techniques to Overcome Data Deficiency and Accuracy Issues in the Construction and Validation of Transportation Network: Case of Kinshasa City

Authors: Christian Kapuku, Seung-Young Kho

Abstract:

An accurate representation of the transportation system serving a region is one of the important aspects of transportation modeling. Such representation often requires developing an abstract model of the system elements, which in turn requires a substantial amount of data, surveys and time. In some cases, however, such as in developing countries, data deficiencies and time and budget constraints do not allow an accurate representation, leaving room for assumptions that may negatively affect the quality of the analysis. With the emergence of open source Internet data, especially in mapping technologies, as well as advances in Geographic Information Systems (GIS), opportunities to tackle these issues have arisen. The objective of this paper is therefore to demonstrate such an application through a practical case: the development of the transportation network for the city of Kinshasa. GIS geo-referencing was used to construct a digitized map of Transportation Analysis Zones from available scanned images. Centroids were then dynamically placed at the centers of activity using an activity density map. Next, the road network and its characteristics were built from OpenStreetMap data and other official road inventory data by intersecting their layers and cleaning up unnecessary links such as residential streets. The accuracy of the final network was then checked against satellite images from Google and Bing. For validation, the final network was exported into Emme3 to check for potential network coding issues. Results show high agreement between the built network and the satellite images, which can mostly be attributed to the use of open source data.
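The link clean-up step (dropping residential streets and other classes not modelled in the network) reduces, in code, to filtering each link by its OSM-style highway tag. The records and the set of retained classes below are illustrative assumptions, not the paper's actual data.

```python
# Hypothetical link records: (from_node, to_node, OSM highway class).
links = [
    (1, 2, "primary"),
    (2, 3, "secondary"),
    (3, 4, "residential"),
    (2, 5, "residential"),
    (5, 6, "trunk"),
]

# Road classes retained in the modelled network (an assumed choice).
KEEP = {"motorway", "trunk", "primary", "secondary", "tertiary"}

# Drop residential streets and any other unmodelled classes.
network = [link for link in links if link[2] in KEEP]
print(len(network))
```

The same tag-based filter generalizes to any attribute carried by the OpenStreetMap layer, such as one-way flags or lane counts.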

Keywords: geographic information system (GIS), network construction, transportation database, open source data

Procedia PDF Downloads 153
626 Evaluation and Possibilities of Valorization of Ecotourism Potentials in the Mbam and Djerem National Park

Authors: Rinyu Shei Mercy

Abstract:

Protected areas are potential sites for the development of ecotourism because of their biodiversity, landscapes, waterfalls, lakes, caves, salt licks and the cultural heritage of local or indigenous people. These potentials have not yet been valorized, so this study investigates the evaluation and possibilities of valorization of ecotourism potentials in the Mbam and Djerem National Park. This was done through a combination of field observations, examination, data collection and evaluation using a SWOT analysis. The SWOT analysis determines strengths, weaknesses, opportunities and threats, and yields strategic suggestions for ecological planning. The study establishes an ecotourism inventory and mapping of the park's potentials, evaluates the degree to which they have been valorized, and assesses the possibilities for valorization. The study shows that the park holds many natural potentials, such as rivers, salt licks, waterfalls and rapids, lakes, caves and rocks. Regarding the degree of valorization of these ecotourism potentials, 50% of the population visit the salt lick of Pkayere, because it is a biodiversity hotspot rich in mineral salt that attracts many animals, while Lake Miyere receives the fewest visits (1%) because it is sacred. Moreover, these potentials can be valorized and put to use, given their attractiveness, through good roads and bridges, good infrastructural facilities, a good communication network, etc. The study therefore recommends that MINTOUR, WCS and tour operators interact sufficiently in order to develop the park's potential for ecotourism, eco-cultural tourism and scientific tourism.

Keywords: ecotourism, national park Mbam and Djerem, valorization of biodiversity, protected areas of Cameroon

Procedia PDF Downloads 117
625 Application of the Urban Forest Credit Standard as a Tool for Compensating CO2 Emissions in the Metalworking Industry: A Case Study in Brazil

Authors: Marie Madeleine Sarzi Inacio, Ligiane Carolina Leite Dauzacker, Rodrigo Henriques Lopes Da Silva

Abstract:

Climate change resulting from human activity has increased interest in more sustainable production practices that reduce and offset pollutant emissions. Brazil, with its vast areas capable of carbon absorption, holds a significant advantage in this context. However, to realize the country's sustainable potential, it is important to establish a robust carbon market with clear rules for the eligibility and validation of projects aimed at reducing and offsetting greenhouse gas (GHG) emissions. In this study, our objective is to conduct a feasibility analysis, through a case study, of implementing an urban forest credit standard in Brazil, using the Urban Forest Credits (UFC) model implemented in the United States as a reference. The city of Ribeirão Preto, Brazil, was selected to assess the availability of green areas. Using the CO2 emissions of the metalworking industry, it was possible to analyze the case-study information in light of that activity. The QGIS software, which can connect to various types of geospatial databases, was used to map potential urban forest areas. Although the chosen municipality has little vegetative coverage, the mapping identified at least eight areas within the delimited urban perimeter that fit the standard's definitions. The outlook is positive: implementing projects like Urban Forest Credits (UFC), adapted to the Brazilian reality, has great potential to benefit the country in the carbon market and to contribute to achieving its GHG emission reduction goals.

Keywords: carbon neutrality, metalworking industry, carbon credits, urban forestry credits

Procedia PDF Downloads 58
624 Classification of EEG Signals Based on Dynamic Connectivity Analysis

Authors: Zoran Šverko, Saša Vlahinić, Nino Stojković, Ivan Markovinović

Abstract:

In this article, the classification of target letters is performed using data from the EEG P300 Speller paradigm. Neural networks trained on the results of dynamic connectivity analysis between different brain regions are used for classification. The dynamic connectivity analysis is based on an adaptive window size and the imaginary part of the complex Pearson correlation coefficient. Brain dynamics are analysed using the relative intersection of confidence intervals for the imaginary component of the complex Pearson correlation coefficient (RICI-imCPCC) method. This method overcomes the shortcomings of currently used dynamic connectivity analysis methods: the low reliability and low temporal precision for short connectivity intervals encountered in constant sliding-window analysis with a wide window, and the high susceptibility to noise encountered in constant sliding-window analysis with a narrow window. It does so by dynamically adjusting the window size using the RICI rule, extracting information about brain connections for each time sample. Seventy percent of the extracted brain connectivity information is used for training and thirty percent for validation. Classification of the target word is also performed, based on the same analysis method. To the best of our knowledge, this research shows for the first time that dynamic connectivity can be used as a parameter for classifying EEG signals.
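The connectivity measure itself, the imaginary part of the complex Pearson correlation between analytic signals, can be sketched as follows. This is a static (whole-signal) version: the adaptive RICI windowing is omitted, and the test signals are illustrative assumptions.

```python
import numpy as np

def analytic(x):
    """Analytic signal via a one-sided FFT spectrum (even-length input)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:n // 2] = 2.0
    h[n // 2] = 1.0
    return np.fft.ifft(X * h)

def im_cpcc(x, y):
    """Imaginary part of the complex Pearson correlation coefficient
    between the analytic signals of x and y."""
    ax, ay = analytic(x), analytic(y)
    ax = ax - ax.mean()
    ay = ay - ay.mean()
    r = np.sum(ax * np.conj(ay)) / np.sqrt(
        np.sum(np.abs(ax) ** 2) * np.sum(np.abs(ay) ** 2))
    return float(r.imag)

t = np.linspace(0, 1, 256, endpoint=False)
s = np.sin(2 * np.pi * 8 * t)
q = np.sin(2 * np.pi * 8 * t - np.pi / 2)   # quarter-cycle phase lag
# Near zero for identical signals; magnitude near 1 in pure quadrature,
# which is why the imaginary part suppresses zero-lag (volume-conduction) coupling.
print(abs(im_cpcc(s, s)) < 1e-6, abs(im_cpcc(s, q)) > 0.99)
```

In the RICI-imCPCC scheme, this quantity would be recomputed over windows whose length the RICI rule adapts per time sample.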

Keywords: dynamic connectivity analysis, EEG, neural networks, Pearson correlation coefficients

Procedia PDF Downloads 188
623 Design and Analysis of Crankshaft Using Al-Al2O3 Composite Material

Authors: Palanisamy Samyraj, Sriram Yogesh, Kishore Kumar, Vaishak Cibi

Abstract:

This project concerns the design and analysis of a crankshaft made of an Al-Al2O3 composite material. It concentrates on two areas: designing and analyzing the composite material, and working on a practical model. Growing competition and growing concern for the environment have forced automobile manufacturers to meet conflicting demands such as increased power and performance, lower fuel consumption, lower pollutant emissions, and decreased noise and vibration. Metal matrix composites offer good properties for a number of automotive components. This work reports studies of Al-Al2O3 as a possible alternative material for a crankshaft. These materials have been considered for use in various engine components due to their high strength-to-weight ratio. They are valued for their light weight, high strength, high specific modulus, low coefficient of thermal expansion, and good wear resistance. In addition, the high specific stiffness, superior high-temperature mechanical properties and oxidation resistance of Al2O3 have led to advanced Al-Al2O3 composites. Crankshafts are used throughout the automobile industry: the crankshaft is connected to the connecting rod for the movement of the piston and is thereby subjected to high stresses that cause wear. Using a composite material in the crankshaft therefore offers good fuel efficiency, low manufacturing cost and reduced weight.

Keywords: metal matrix composites, Al-Al2O3, high specific modulus, strength to weight ratio

Procedia PDF Downloads 257
622 Spatial Object-Oriented Template Matching Algorithm Using Normalized Cross-Correlation Criterion for Tracking Aerial Image Scene

Authors: Jigg Pelayo, Ricardo Villar

Abstract:

Building on the development of aerial laser scanning in the Philippine geospatial industry, research on remote sensing and machine vision technology has become a trend. Object detection via template matching is one such application, characterized as fast and real-time. This paper provides an application of a robust pattern matching algorithm based on the normalized cross-correlation (NCC) criterion function within object-based image analysis (OBIA), utilizing high-resolution aerial imagery and low-density LiDAR data. The height information from laser scanning provides an effective partitioning order, improving the hierarchical class feature pattern and allowing unnecessary calculation to be skipped. Since detection is executed on an object-oriented platform, mathematical morphology and multi-level filter algorithms were established to avoid the influence of noise, small distortions and fluctuating image saturation that affect the feature recognition rate. Furthermore, the scheme is evaluated in different situations to assess its performance, and the computational complexity of the algorithms is inspected. Its effectiveness is demonstrated in areas of Misamis Oriental province, achieving an overall accuracy above 91%. The results also portray the potential and efficiency of the implemented algorithm under different lighting conditions.
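The NCC criterion at the heart of the matcher can be sketched as a brute-force slide of the template over the image; production implementations accelerate this with FFT or integral-image tricks, and the OBIA partitioning described above prunes most candidate positions, all omitted here.

```python
import numpy as np

def ncc(patch, template):
    """Normalized cross-correlation between one image patch and a template."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p ** 2).sum() * (t ** 2).sum())
    return 0.0 if denom == 0 else float((p * t).sum() / denom)

def match(image, template):
    """Slide the template over the image; return the best top-left position."""
    th, tw = template.shape
    best, pos = -2.0, (0, 0)
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            score = ncc(image[y:y + th, x:x + tw], template)
            if score > best:
                best, pos = score, (y, x)
    return pos, best

img = np.zeros((10, 10))
img[4:7, 5:8] = np.arange(9).reshape(3, 3)   # embed a distinctive 3x3 block
pos, score = match(img, np.arange(9).reshape(3, 3).astype(float))
print(pos, score > 0.99)
```

Because both patch and template are mean-subtracted and normalized, the score is invariant to brightness offset and contrast scaling, which underlies the robustness under different lighting conditions reported above.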

Keywords: algorithm, LiDAR, object recognition, OBIA

Procedia PDF Downloads 228
621 Voice Liveness Detection Using Kolmogorov Arnold Networks

Authors: Arth J. Shah, Madhu R. Kamble

Abstract:

Voice biometric liveness detection certifies that the voice data presented to an authentication process is genuine and not a recording or a synthetic voice. With the rise of deepfakes and other equally sophisticated spoofing generation techniques, it is becoming challenging to ensure that the person on the other end is a live speaker. A Voice Liveness Detection (VLD) system is a group of security measures that detect and prevent voice spoofing attacks. Motivated by the recent development of the Kolmogorov-Arnold Network (KAN), based on the Kolmogorov-Arnold representation theorem, we propose KAN for the VLD task. To date, multilayer perceptron (MLP) based classifiers have been used for such classification tasks; we aim to capture not only the compositional structure of the model but also to optimize the values of its univariate functions. This study presents a mathematical as well as experimental analysis of KAN for VLD tasks, thereby opening a new perspective for scientists working on speech and signal processing tasks. The study combines traditional signal processing with new deep learning models, a combination that proves better suited to VLD tasks. Experiments are performed on the POCO and ASVspoof 2017 V2 databases. We used constant-Q transform (CQT), Mel, and short-time Fourier transform (STFT) based front-end features, with CNN, BiLSTM, and KAN back-end classifiers. The best accuracy is 91.26% on the POCO database, using STFT features with the KAN classifier. On the ASVspoof 2017 V2 database, the lowest EER obtained is 26.42%, using CQT features and KAN as the classifier.
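The defining idea of a KAN, learnable univariate functions on the edges instead of scalar weights, can be illustrated with a toy forward pass. This is a sketch only: actual KANs parameterize the edge functions with splines and train them, and every size and basis choice below is an assumption, not the paper's architecture.

```python
import numpy as np

class KANLayer:
    """Toy Kolmogorov-Arnold layer: every input-output edge carries its own
    univariate function phi_ij (here a small polynomial; KANs proper use
    splines). Each output is a sum of edge functions, in contrast to an MLP,
    where edges carry scalar weights and the nonlinearity sits on the nodes."""

    def __init__(self, n_in, n_out, degree=3, rng=None):
        rng = rng if rng is not None else np.random.default_rng(0)
        # One coefficient vector per edge: coef[i, j, d] for x_i^d.
        self.coef = 0.1 * rng.standard_normal((n_in, n_out, degree + 1))

    def __call__(self, x):                                  # x: (batch, n_in)
        powers = x[..., None] ** np.arange(self.coef.shape[-1])
        # phi_ij(x_i) = sum_d coef[i, j, d] * x_i^d ; out_j = sum_i phi_ij(x_i)
        return np.einsum('bid,ijd->bj', powers, self.coef)

layer = KANLayer(4, 2)                                      # hypothetical sizes
y = layer(np.random.default_rng(1).standard_normal((8, 4)))
print(y.shape)
```

In the VLD pipeline, the input vector would be a frame of STFT/CQT/Mel features and the outputs would feed a live/spoof decision.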

Keywords: Kolmogorov Arnold networks, multilayer perceptron, pop noise, voice liveness detection

Procedia PDF Downloads 3
620 Hand Gesture Recognition for Sign Language: A New Higher Order Fuzzy HMM Approach

Authors: Saad M. Darwish, Magda M. Madbouly, Murad B. Khorsheed

Abstract:

Sign languages (SL) are the most accomplished forms of gestural communication. Their automatic analysis is therefore a real challenge, one closely tied to their lexical and syntactic organization. Hidden Markov models (HMMs) have been used prominently and successfully in speech recognition and, more recently, in handwriting recognition. Consequently, they seem ideal for visual recognition of complex, structured hand gestures such as those found in sign language. In this paper, several results on static hand gesture recognition using an algorithm based on type-2 fuzzy HMMs (T2FHMMs) are presented. The features used as observables in both the training and recognition phases are based on Singular Value Decomposition (SVD). SVD, an extension of eigendecomposition to non-square matrices, reduces multi-attribute hand gesture data to feature vectors and optimally exposes the geometric structure of a matrix. In our approach, we replace the basic HMM arithmetic operators with adequate type-2 fuzzy operators that permit us to relax the additive constraint of probability measures. T2FHMMs are therefore able to handle both the random and fuzzy uncertainties existing universally in sequential data. Experimental results show that T2FHMMs can effectively handle noise and dialect uncertainties in hand signals and deliver better classification performance than classical HMMs. The recognition rate of the proposed system is 100% for uniform hand images and 86.21% for cluttered hand images.
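The SVD feature extraction step can be sketched directly: the singular values of the (generally non-square) image matrix serve as a compact, rotation-structure-revealing feature vector for the HMM observables. Truncating to the k largest values is an assumed choice, not a parameter stated in the abstract.

```python
import numpy as np

def svd_features(img, k=8):
    """Feature vector of the k largest singular values of an image matrix
    (a sketch of the SVD-based observables; k is an assumption)."""
    s = np.linalg.svd(img, compute_uv=False)   # returned in descending order
    return s[:k]

rng = np.random.default_rng(0)
img = rng.random((32, 24))                     # stand-in for a hand image
f = svd_features(img)
print(f.shape, bool(np.all(np.diff(f) <= 0)))
```

Sequences of such vectors, one per frame or per image region, would then be the observation streams fed to the T2FHMM.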

Keywords: hand gesture recognition, hand detection, type-2 fuzzy logic, hidden Markov Model

Procedia PDF Downloads 441