Search results for: 99.95% IoT data transmission savings
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26849

25379 Data Hiding by Vector Quantization in Color Image

Authors: Yung Gi Wu

Abstract:

With the growth of computers and networks, digital data can be spread anywhere in the world quickly. Digital data can also be copied or tampered with easily, so security has become an important topic in the protection of digital data. A digital watermark is a method to protect the ownership of digital data, although embedding the watermark inevitably affects image quality. In this paper, Vector Quantization (VQ) is used to embed the watermark into the image to achieve data hiding. This kind of watermarking is invisible, meaning that users will not notice the embedded watermark even though the embedded image differs slightly from the original. Because VQ carries a heavy computational burden, we adopt a fast VQ encoding scheme based on partial distortion searching (PDS) and a mean approximation scheme to speed up the data hiding process. The watermarks hidden in the image can be gray-level, bi-level, or color images; text can also be embedded as a watermark. To test the robustness of the system, we use Photoshop to sharpen, crop, and alter the image and check whether the extracted watermark is still recognizable. Experimental results demonstrate that the proposed system can resist these three kinds of tampering in general cases.
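
The partial distortion search idea can be sketched as follows; this is a minimal illustration in Python (the codebook and vector values are hypothetical, not taken from the paper):

```python
# Partial Distortion Search (PDS): while accumulating the squared
# distortion between the input vector and a candidate codeword, abandon
# the candidate as soon as the partial sum exceeds the best distortion
# found so far, skipping the remaining dimensions.
def pds_nearest(vector, codebook):
    best_idx, best_dist = 0, float("inf")
    for i, codeword in enumerate(codebook):
        partial = 0.0
        for v, c in zip(vector, codeword):
            partial += (v - c) ** 2
            if partial >= best_dist:  # early abandon: this codeword cannot win
                break
        else:  # loop completed: this codeword beats the current best
            best_idx, best_dist = i, partial
    return best_idx
```

The early-abandon test is what saves computation relative to a full search; the returned index is identical to the one an exhaustive search would give.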

Keywords: data hiding, vector quantization, watermark, color image

Procedia PDF Downloads 364
25378 An Ultra-Low Output Impedance Power Amplifier for Tx Array in 7-Tesla Magnetic Resonance Imaging

Authors: Ashraf Abuelhaija, Klaus Solbach

Abstract:

In ultra-high-field MRI scanners (3 T and higher), parallel RF transmission techniques using multiple RF chains with multiple transmit elements are a promising approach to overcoming the high-field MRI challenges of RF magnetic field inhomogeneity and SAR. However, mutual coupling between the transmit array elements disturbs the desired independent control of the RF waveform for each element. This contribution demonstrates an 18 dB improvement in decoupling (isolation) performance due to the very low output impedance of our 1 kW power amplifier.

Keywords: EM coupling, inter-element isolation, magnetic resonance imaging (MRI), parallel transmit

Procedia PDF Downloads 495
25377 1-Butyl-2,3-Dimethylimidazolium Bis (Trifluoromethanesulfonyl) Imide and Titanium Oxide Based Voltammetric Sensor for the Quantification of Flunarizine Dihydrochloride in Solubilized Media

Authors: Rajeev Jain, Nimisha Jadon, Kshiti Singh

Abstract:

A glassy carbon electrode modified with titanium oxide nanoparticles and 1-butyl-2,3-dimethylimidazolium bis(trifluoromethanesulfonyl)imide (TiO2/IL/GCE) has been fabricated for electrochemical sensing of flunarizine dihydrochloride (FRH). The electrochemical properties and morphology of the prepared nanocomposite were studied by electrochemical impedance spectroscopy (EIS) and transmission electron microscopy (TEM). The response of the electrochemical sensor was proportional to FRH concentration in the range from 0.5 µg mL-1 to 16 µg mL-1, with a detection limit of 0.03 µg mL-1. The proposed method was also applied to the determination of FRH in a pharmaceutical formulation and in human serum, with good recoveries.

Keywords: flunarizine dihydrochloride, ionic liquid, nanoparticles, voltammetry, human serum

Procedia PDF Downloads 329
25376 Evaluation of Prehabilitation Prior to Surgery for an Orthopaedic Pathway

Authors: Stephen McCarthy, Joanne Gray, Esther Carr, Gerard Danjoux, Paul Baker, Rhiannon Hackett

Abstract:

Background: The Go Well Health (GWH) platform is a web-based programme that gives patients access to personalised care plans and resources aimed at prehabilitation prior to surgery. The online digital platform (ODP) delivers essential patient education and support to patients before they undergo total hip replacement (THR) or total knee replacement (TKR). This study evaluated the impact of the ODP on functional health outcomes, health-related quality of life, and hospital length of stay (LOS) following surgery. Methods: A retrospective cohort study compared patients who used the ODP for patient education and support (PES) before undergoing THR or TKR surgery with patients who did not access the ODP and received usual care. Routinely collected Patient Reported Outcome Measures (PROMs) data were obtained for 2,406 patients who underwent a knee replacement (n=1,160) or a hip replacement (n=1,246) between 2018 and 2019 in a single surgical centre in the United Kingdom. The Oxford Hip and Knee Scores and the European Quality of Life Five-Dimension tool (EQ-5D-5L) were obtained both pre- and post-surgery (at 6 months), along with hospital LOS. Linear regression was used to estimate the impact of GWH on both health outcomes, and negative binomial regression was used to estimate its impact on LOS. All analyses adjusted for age, sex, Charlson Comorbidity Score, and either pre-operative Oxford Hip/Knee scores or pre-operative EQ-5D scores. Fractional polynomials were used to represent potential non-linear relationships between the factors in the regression model. Findings: For patients who underwent a knee replacement, GWH had a statistically significant impact on Oxford Knee Scores and EQ-5D-5L utility post-surgery (p=0.039 and p=0.002, respectively); it did not have a statistically significant impact on hospital LOS.
For patients who underwent a hip replacement, GWH had a statistically significant impact on Oxford Hip Scores and EQ-5D-5L utility post-surgery (p=0.000 and p=0.009, respectively), and was also associated with a statistically significant reduction in hospital LOS (p=0.000). Conclusion: Health outcomes were better for patients who used the GWH platform before THR or TKR than for those who received usual care prior to surgery. Patients who underwent a hip replacement and used GWH also had a reduced hospital LOS. These findings are important for health policy and decision makers, as they suggest that prehabilitation via an ODP can maximise health outcomes for patients following surgery while potentially yielding efficiency savings through reductions in LOS.

Keywords: digital prehabilitation, online digital platform, orthopaedics, surgery

Procedia PDF Downloads 190
25375 Anomaly Detection in a Data Center with a Reconstruction Method Using a Multi-Autoencoders Model

Authors: Victor Breux, Jérôme Boutet, Alain Goret, Viviane Cattin

Abstract:

Early detection of anomalies in data centers is important for reducing downtime and the cost of periodic maintenance. However, there is little research on this topic, and even less on the fusion of sensor data for the detection of abnormal events. The goal of this paper is to propose a method for anomaly detection in data centers that combines sensor data (temperature, humidity, power) and deep learning models. The model described in the paper uses one autoencoder per sensor to reconstruct the inputs. The autoencoders contain Long Short-Term Memory (LSTM) layers and are trained on normal samples of the relevant sensors, selected by correlation analysis. The difference signal between each input and its reconstruction is then used to classify the samples using feature extraction and a random forest classifier. Data measured by the sensors of a data center between January 2019 and May 2020 are used to train the model, while data between June 2020 and May 2021 are used to assess it. Performance is assessed a posteriori through the F1-score by comparing detected anomalies with the data center’s history. The proposed model achieves an F1-score of 83.60%, outperforming the state-of-the-art reconstruction method (24.16%), which uses a single autoencoder over multivariate sequences and flags an anomaly with a threshold on the reconstruction error.
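
The reconstruction-error step at the core of such a pipeline can be sketched in a few lines; as a stand-in for the trained LSTM autoencoders (which would need a deep-learning framework), this illustrative version reconstructs each sensor window by its mean, and the threshold value is hypothetical:

```python
# Reconstruction-based anomaly test: reconstruct a window of sensor
# readings, compute the mean squared error between input and
# reconstruction, and flag the window when the error exceeds a
# threshold calibrated on normal data.
def reconstruct(window):
    # Stand-in for a trained autoencoder: predict the window mean.
    mean = sum(window) / len(window)
    return [mean] * len(window)

def reconstruction_error(window):
    recon = reconstruct(window)
    return sum((x - r) ** 2 for x, r in zip(window, recon)) / len(window)

def is_anomaly(window, threshold):
    return reconstruction_error(window) > threshold
```

In the paper, the difference signal feeds a random forest classifier rather than a fixed threshold; the thresholded error shown here corresponds to the baseline method the authors compare against.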

Keywords: anomaly detection, autoencoder, data centers, deep learning

Procedia PDF Downloads 194
25374 Effect of the Applied Bias on Miniband Structures in Dimer Fibonacci InAs/Ga1-xInxAs Superlattices

Authors: Z. Aziz, S. Terkhi, Y. Sefir, R. Djelti, S. Bentata

Abstract:

The effect of a uniform electric field across multibarrier systems (InAs/InxGa1-xAs) is exhaustively explored with a computational model using the exact Airy function formalism and the transfer-matrix technique. In the case of the biased dimer Fibonacci height barrier superlattice (DFHBSL) structure, a strong reduction in transmission is observed, and the width of the miniband structure decreases linearly with increasing applied bias. This is due to the confinement of the states in the miniband structure, which becomes increasingly important (Wannier-Stark effect).
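
As a minimal illustration of the transfer-matrix technique (a single unbiased rectangular barrier in natural units hbar = 2m = 1, not the biased Airy-function case treated in the paper), the transmission coefficient follows from the 2x2 transfer-matrix element M11:

```python
import cmath

# Transmission through one rectangular barrier of height V0 and given
# width, via the transfer-matrix element M11: T = 1 / |M11|^2.
# Units: hbar = 2m = 1, so k = sqrt(E) and q = sqrt(E - V0)
# (q is imaginary for E < V0, i.e. tunnelling).
def transmission(energy, v0, width):
    k = cmath.sqrt(energy)
    q = cmath.sqrt(energy - v0)
    m11 = (cmath.cos(q * width)
           - 1j * (k * k + q * q) / (2 * k * q) * cmath.sin(q * width))
    return 1.0 / abs(m11) ** 2
```

For a multibarrier system, the matrices of the individual barriers and wells are multiplied together; under bias, the plane-wave solutions are replaced by Airy functions, as in the paper.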

Keywords: dimer Fibonacci height barrier superlattices, singular extended state, exact Airy function, transfer matrix formalism

Procedia PDF Downloads 306
25373 Feasibility Study and Experiment of On-Site Nuclear Material Identification in Fukushima Daiichi Fuel Debris by Compact Neutron Source

Authors: Yudhitya Kusumawati, Yuki Mitsuya, Tomooki Shiba, Mitsuru Uesaka

Abstract:

After the Fukushima Daiichi nuclear power reactor incident, a large amount of unaccounted-for nuclear fuel debris remains in the reactor core area, which is subject to safeguards and criticality safety. Before precise analysis is performed, preliminary on-site screening and mapping of nuclear debris activity are needed to provide reliable data for planning the extraction of the debris. Through a collaboration project with the Japan Atomic Energy Agency, an on-site nuclear debris screening system using dual-energy X-ray inspection and neutron energy resonance analysis has been established. Using a compact, mobile pulsed neutron source built from a 3.95 MeV X-band electron linac, coupled with tungsten as an electron-to-photon converter and beryllium as a photon-to-neutron converter, short-distance neutron time-of-flight (TOF) measurements can be performed. Experimental results show that this system can measure the neutron energy spectrum up to the 100 eV range with a TOF path of only 2.5 meters, owing to the X-band accelerator’s short pulse. On-site neutron TOF measurement can thus identify the isotopic content of the nuclear debris through Neutron Resonance Transmission Analysis (NRTA). Preliminary NRTA experiments have been performed with a tungsten sample as dummy debris material; its isotope tungsten-186 has an energy absorption value close to that of uranium-238 (15 eV). The results show that this system can detect energy absorption in the resonance neutron region within 1-100 eV. An experiment with a combined sample of indium, tantalum, and silver showed that the system can also detect multiple elements in a material at once, making it feasible to identify debris containing mixed materials. This compact neutron TOF measurement system is a good complement to the dual-energy X-ray computed tomography (CT) method, which can identify atomic number quantitatively but with 1 mm spatial resolution and large error bars.
The combination of these two measurement methods will make it possible to perform on-site nuclear debris screening in the Fukushima Daiichi reactor core area, providing the data for mapping nuclear debris activity.
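
The kinematics behind the time-of-flight measurement can be sketched as follows (non-relativistic, E = ½m(L/t)², with illustrative numbers):

```python
# Neutron kinetic energy (in eV) from flight path L and flight time t.
NEUTRON_MASS_KG = 1.675e-27   # neutron rest mass
JOULES_PER_EV = 1.602e-19

def neutron_energy_ev(flight_path_m, flight_time_s):
    v = flight_path_m / flight_time_s          # velocity from TOF
    return 0.5 * NEUTRON_MASS_KG * v * v / JOULES_PER_EV
```

Over the 2.5 m path quoted above, a roughly 15 eV resonance neutron arrives after about 47 microseconds, which is why the accelerator's short pulse is essential for resolving the spectrum.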

Keywords: neutron source, neutron resonance, nuclear debris, time of flight

Procedia PDF Downloads 238
25372 Impact of Primary Care Telemedicine Consultations on Health Care Resource Utilisation: A Systematic Review

Authors: Anastasia Constantinou, Stephen Morris

Abstract:

Background: The adoption of synchronous and asynchronous telemedicine modalities for primary care consultations has increased exponentially since the COVID-19 pandemic. However, there is limited understanding of how virtual consultations influence healthcare resource utilisation and other quality measures, including safety, timeliness, efficiency, patient and provider satisfaction, cost-effectiveness, and environmental impact. Aim: To quantify the rates of follow-up visits, emergency department (ED) visits, hospitalisations, requests for investigations, and prescriptions, and to comment on the effect on different quality measures associated with the telemedicine modalities used for primary care services and primary care referrals to secondary care. Design and setting: Systematic review in primary care. Methods: A systematic search was carried out across three databases (Medline, PubMed and Scopus) between August and November 2023, using terms related to telemedicine, general practice, electronic referrals, follow-up, use and efficiency, supported by citation searching. This was followed by screening according to pre-defined criteria, data extraction, and critical appraisal. Narrative synthesis and meta-analysis of quantitative data were used to summarise the findings. Results: The search identified 2,230 studies, of which 50 are included in this review. Asynchronous modalities were prevalent in both primary care services (68%) and referrals from primary care to secondary care (83%); most study participants were female (63.3%), with a mean age of 48.2. The average follow-up rate for virtual consultations in primary care was 28.4% (eVisits: 36.8%, secure messages: 18.7%, videoconference: 23.5%), with no significant difference between them or with face-to-face consultations.
There was an average annual reduction in primary care visits of 0.09/patient, an increase in telephone visits of 0.20/patient, an increase in ED encounters of 0.011/patient, an increase in hospitalisations of 0.02/patient, and an increase in out-of-hours visits of 0.019/patient. Laboratory testing was requested for an average of 10.9% of telemedicine patients, imaging or procedures for 5.6%, and prescriptions for 58.7%. For referrals to secondary care, on average 36.7% of virtual referrals required a follow-up visit, with the average follow-up rate for electronic referrals being higher than for videoconferencing (39.2% vs 23%, p=0.167). Technical failures were reported for an average of 1.4% of virtual consultations in primary care. Using carbon footprint estimates, we calculate that the use of telemedicine in primary care services can provide a net decrease in carbon footprint of 0.592 kgCO2/patient/year. When follow-up rates are taken into account, we estimate that virtual consultations reduce the carbon footprint of primary care services by a factor of 2.3, and of secondary care referrals by a factor of 2.2. No major concerns regarding quality of care or patient satisfaction were identified. Five of the seven studies that addressed cost-effectiveness reported increased savings. Conclusions: Telemedicine provides quality, cost-effective, and environmentally sustainable care for patients in primary care, with inconclusive evidence regarding rates of subsequent healthcare utilisation. The evidence is limited by heterogeneous, small-scale studies and a lack of prospective comparative studies. Further research to identify the most appropriate telemedicine modality for different patient populations, clinical presentations, and types of service provision (e.g. follow-up rather than initial diagnosis), together with further education for patients and providers on how to make best use of these services, is expected to improve outcomes and influence practice.

Keywords: telemedicine, healthcare utilisation, digital interventions, environmental impact, sustainable healthcare

Procedia PDF Downloads 57
25371 Intelligent Parking Systems for Quasi-Close Communities

Authors: Ayodele Adekunle Faiyetole, Olumide Olawale Jegede

Abstract:

This paper presents the experimental design and needs justification for a localised intelligent parking system (L-IPS), ideal for quasi-close communities whose increasing vehicular volume depends on limited or constant parking facilities. With a constant supply of parking facilities, the demand from an increasing vehicular volume can lead to poor time conservation or extended travel time, traffic congestion or impeded mobility, and safety issues. Increased negative environmental and economic externalities are further downsides of this disparity between demand and supply. The L-IPS is built from a microcontroller, ultrasonic sensors, and LED indicators, such that the current status, in terms of parking spot availability, can be read on an LCD screen at the main entrance to the community or parking zone. As an advanced traffic management system (ATMS), the L-IPS is designed to address infrastructure-to-driver (I2D) communication and parking detection issues. The L-IPS can thus save users time by informing them of parking spot availability, providing timely, informed routing to the next preference or seamless berthing on an available spot at a nearby facility, as the case may be. Its use could also improve safety and mobility and reduce fuel consumption and costs, thereby reducing the negative environmental and economic externalities of transportation systems.
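
The availability logic of such a system can be sketched in a few lines (the sensor threshold is illustrative, not a value from the paper): each ultrasonic sensor reports a distance, and a short reading means a vehicle occupies the spot.

```python
# Count free parking spots from ultrasonic distance readings: a
# reading below the threshold means a vehicle is parked over the spot.
OCCUPIED_BELOW_CM = 50  # hypothetical mounting-dependent threshold

def free_spots(distances_cm):
    return sum(1 for d in distances_cm if d >= OCCUPIED_BELOW_CM)
```

The resulting count would be pushed to the LCD at the entrance, giving drivers the I2D status before they enter the zone.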

Keywords: intelligent parking systems, localized intelligent parking system, intelligent transport systems, advanced traffic management systems, infrastructure-to-drivers communication

Procedia PDF Downloads 171
25370 Integration Process and Analytic Interface of Different Environmental Open Data Sets with Java/Oracle and R

Authors: Pavel H. Llamocca, Victoria Lopez

Abstract:

The main objective of our work is the comparative analysis of environmental data from Open Data bases belonging to different governments, which requires integrating data from several different sources. Nowadays, many governments intend to publish thousands of data sets for people and organisations to use, and the number of applications based on Open Data is therefore increasing. However, each government has its own procedures for publishing its data, which leads to a variety of data set formats, as there are no international standards specifying the formats of Open Data sets. Because of this variety, we must build a data integration process able to handle all kinds of formats. Some software tools have been developed to support the integration process, e.g. Data Tamer and Data Wrangler. The problem with these tools is that they require a data scientist to take part in the final step of the integration process. In our case we do not want to depend on a data scientist, because environmental data are usually similar and these processes can be automated by programming. The main idea of our tool is to build Hadoop procedures adapted to the data sources of each government in order to achieve automated integration. Our work focuses on environmental data such as temperature, energy consumption, air quality, solar radiation, and wind speed. For the past two years, the government of Madrid has been publishing its Open Data bases on environmental indicators in real time. Likewise, other governments (such as Andalucía or Bilbao) have published Open Data sets on the environment. All of these data sets have different formats, and our solution is able to integrate all of them; furthermore, it allows the user to perform and visualise analyses over the real-time data.
Once the integration task is done, the data from every government share the same format and the analysis process can proceed in a computationally better way. The tool presented in this work therefore has two goals: 1. an integration process; and 2. a graphic and analytic interface. As a first approach, the integration process was developed using Java and Oracle, and the graphic and analytic interface with Java (JSP). In order to open up our software tool, as a second approach we also developed an implementation in the R language, a mature open-source technology. R is a powerful open-source programming language that allows us to process and analyse a huge amount of data with high performance, and R libraries such as shiny support building a graphic interface. A performance comparison between both implementations was made and no significant differences were found. In addition, our work provides an official real-time integrated data set of environmental data in Spain, so that any developer can build their own applications on it.
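
The per-source adapter idea behind the integration process can be sketched as follows; the field names and formats here are hypothetical, for illustration only, since each government publishes its own schema:

```python
import csv
import io
import json

# Each source gets an adapter that maps its native format into one
# common record schema; once adapted, all sources can be analysed
# together.
def from_madrid_csv(text):
    # Hypothetical CSV schema with Spanish column names.
    return [{"city": "Madrid", "ts": r["fecha"], "temp_c": float(r["temperatura"])}
            for r in csv.DictReader(io.StringIO(text))]

def from_bilbao_json(text):
    # Hypothetical JSON schema.
    return [{"city": "Bilbao", "ts": r["time"], "temp_c": float(r["temp"])}
            for r in json.loads(text)]
```

In the authors' tool the adapters are Hadoop procedures rather than in-memory functions, but the design choice is the same: isolate format knowledge per source, emit one shared schema.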

Keywords: open data, R language, data integration, environmental data

Procedia PDF Downloads 315
25369 Histopathological Analysis of Vital Organs in Cattle Infected with Lumpy Skin Disease in Rajasthan, India

Authors: Manisha, Manisha Mathur, Jay K. Desai, Shesh Asopa, Manisha Mehra

Abstract:

The present study was carried out to provide a comprehensive analysis of lumpy skin disease (LSD) in cattle and to elucidate the histopathology of vital organs in natural outbreaks. LSD is a viral infection that primarily affects cattle; it is caused by a capripoxvirus and is characterised by the formation of skin nodules or lesions. For this study, postmortem examinations were conducted on 20 cows that died of LSD in different regions of Rajasthan. The aim was to examine each cow's external and internal organs to confirm whether LSD was the cause of death. Accurate diagnosis is essential for improving disease surveillance, understanding the disease's progression, and informing control measures. Pathological examinations reveal virus-induced changes across organs, while histopathological analyses provide crucial insights into the disease's pathogenesis, aiding the development of advanced diagnostics and effective prevention strategies. Histopathological examination of nodular skin lesions revealed edema, hyperemia, acanthosis, severe hydropic/ballooning degeneration, and hyperkeratosis in the epidermis. In the lungs, congestion, edema, emphysema, and atelectasis were observed grossly. Microscopic changes were suggestive of interstitial pneumonia, suppurative pneumonia, bronchopneumonia, post-pneumonic fibrosis, and the stage of resolution. Grossly, the liver showed congestion and necrotic foci; microscopically, in most cases the liver showed acute viral hepatitis. In the kidneys, multifocal interstitial nephritis was observed microscopically, with marked interstitial inflammation and zonal fibrosis with cystically dilated tubules and Bowman's capsules. Microscopically, most sections of heart tissue showed normal histology with a few sarcocysts between the cardiac muscle fibres. In some cases, loss of cross-striation, sarcoplasmic vacuolation, fragmentation, and disintegration of cardiac fibres were observed.
The present study revealed characteristic gross and histopathological changes in different organs in natural cases of lumpy skin disease. The disease was further confirmed by molecular diagnosis and transmission electron microscopy of capripox infection in the affected cattle in the study area.

Keywords: capripoxvirus, lumpy skin disease, polymerase chain reaction, transmission electron microscopy

Procedia PDF Downloads 25
25368 Experimental Evaluation of UDP in Wireless LAN

Authors: Omar Imhemed Alramli

Abstract:

Like the Transmission Control Protocol (TCP), the User Datagram Protocol (UDP) is a transport-layer protocol in the Open Systems Interconnection (OSI) model and the TCP/IP model of networks. Unlike TCP, however, UDP behaviour could not be evaluated with the pcattcp tool on the Windows operating system platform. A study was therefore carried out to find a tool that supports the evaluation of UDP behaviour. After collecting information about different tools, the iperf tool was chosen and run under Cygwin, installed both on a Windows XP platform and on Windows XP in a VirtualBox virtual machine on a single computer. Iperf is used to evaluate UDP experimentally and to observe what happens while packets are sent between the host and the guest in wired and wireless networks. Many test scenarios were run, and the major UDP metrics, such as jitter, packet loss, and throughput, were evaluated.
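
The packet-loss and jitter figures that such a measurement produces can be approximated from the sequence numbers and arrival times of received datagrams; this is a simplified sketch (iperf itself uses the smoothed jitter estimator of RFC 3550 rather than the mean absolute deviation shown here):

```python
# Compute UDP packet loss (percent) and a simple jitter estimate (mean
# absolute deviation of inter-arrival gaps) from received datagrams.
def udp_metrics(packets_sent, received):
    """received: list of (sequence_number, arrival_time) pairs."""
    loss_pct = 100.0 * (packets_sent - len(received)) / packets_sent
    gaps = [t2 - t1 for (_, t1), (_, t2) in zip(received, received[1:])]
    mean_gap = sum(gaps) / len(gaps)
    jitter = sum(abs(g - mean_gap) for g in gaps) / len(gaps)
    return loss_pct, jitter
```

Loss follows from sequence-number gaps because UDP, unlike TCP, never retransmits; jitter captures the variability of the inter-arrival gaps that a constant-rate sender would otherwise keep uniform.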

Keywords: TCP, UDP, iperf, wireless LAN

Procedia PDF Downloads 354
25367 Additive Manufacturing of Microstructured Optical Waveguides Using Two-Photon Polymerization

Authors: Leonnel Mhuka

Abstract:

Background: The field of photonics has witnessed substantial growth, with an increasing demand for miniaturized and high-performance optical components. Microstructured optical waveguides have gained significant attention due to their ability to confine and manipulate light at the subwavelength scale. Conventional fabrication methods, however, face limitations in achieving intricate and customizable waveguide structures. Two-photon polymerization (TPP) emerges as a promising additive manufacturing technique, enabling the fabrication of complex 3D microstructures with submicron resolution. Objectives: This experiment aimed to utilize two-photon polymerization to fabricate microstructured optical waveguides with precise control over geometry and dimensions. The objective was to demonstrate the feasibility of TPP as an additive manufacturing method for producing functional waveguide devices with enhanced performance. Methods: A femtosecond laser system operating at a wavelength of 800 nm was employed for two-photon polymerization. A custom-designed CAD model of the microstructured waveguide was converted into G-code, which guided the laser focus through a photosensitive polymer material. The waveguide structures were fabricated layer by layer, with each layer formed by localized polymerization induced by non-linear absorption of the laser light. Characterization of the fabricated waveguides included optical microscopy, scanning electron microscopy, and optical transmission measurements. The optical properties, such as mode confinement and propagation losses, were evaluated to assess the performance of the additively manufactured waveguides. Results: The experiment successfully demonstrated the additive manufacturing of microstructured optical waveguides using two-photon polymerization. Optical microscopy and scanning electron microscopy revealed intricate 3D structures with submicron resolution.
The measured optical transmission indicated efficient light propagation through the fabricated waveguides. The waveguides exhibited well-defined mode confinement and relatively low propagation losses, showcasing the potential of TPP-based additive manufacturing for photonics applications. The experiment highlighted the advantages of TPP in achieving high-resolution, customized, and functional microstructured optical waveguides. Conclusion: This experiment substantiates the viability of two-photon polymerization as an innovative additive manufacturing technique for producing complex microstructured optical waveguides. The successful fabrication and characterization of these waveguides open the door to further advancements in the field of photonics, enabling the development of high-performance integrated optical devices for various applications.

Keywords: additive manufacturing, microstructured optical waveguides, two-photon polymerization, photonics applications

Procedia PDF Downloads 101
25366 Transforming Data into Knowledge: Mathematical and Statistical Innovations in Data Analytics

Authors: Zahid Ullah, Atlas Khan

Abstract:

The rapid growth of data in various domains has created a pressing need for effective methods to transform this data into meaningful knowledge. In this era of big data, mathematical and statistical innovations play a crucial role in unlocking insights and facilitating informed decision-making in data analytics. This abstract aims to explore the transformative potential of these innovations and their impact on converting raw data into actionable knowledge. Drawing upon a comprehensive review of existing literature, this research investigates the cutting-edge mathematical and statistical techniques that enable the conversion of data into knowledge. By evaluating their underlying principles, strengths, and limitations, we aim to identify the most promising innovations in data analytics. To demonstrate the practical applications of these innovations, real-world datasets will be utilized through case studies or simulations. This empirical approach will showcase how mathematical and statistical innovations can extract patterns, trends, and insights from complex data, enabling evidence-based decision-making across diverse domains. Furthermore, a comparative analysis will be conducted to assess the performance, scalability, interpretability, and adaptability of different innovations. By benchmarking against established techniques, we aim to validate the effectiveness and superiority of the proposed mathematical and statistical innovations in data analytics. Ethical considerations surrounding data analytics, such as privacy, security, bias, and fairness, will be addressed throughout the research. Guidelines and best practices will be developed to ensure the responsible and ethical use of mathematical and statistical innovations in data analytics. 
The expected contributions of this research include advancements in mathematical and statistical sciences, improved data analysis techniques, enhanced decision-making processes, and practical implications for industries and policymakers. The outcomes will guide the adoption and implementation of mathematical and statistical innovations, empowering stakeholders to transform data into actionable knowledge and drive meaningful outcomes.

Keywords: data analytics, mathematical innovations, knowledge extraction, decision-making

Procedia PDF Downloads 75
25365 FCNN-MR: A Parallel Instance Selection Method Based on Fast Condensed Nearest Neighbor Rule

Authors: Lu Si, Jie Yu, Shasha Li, Jun Ma, Lei Luo, Qingbo Wu, Yongqi Ma, Zhengji Liu

Abstract:

Instance selection (IS) techniques are used to reduce data size and improve the performance of data mining methods. Recently, to process very large data sets, several proposed methods divide the training set into disjoint subsets and apply IS algorithms independently to each subset. In this paper, we analyze the limitation of these methods and give our viewpoint on how to divide and conquer in the IS procedure. Then, based on the fast condensed nearest neighbor (FCNN) rule, we propose an instance selection method for large data sets using the MapReduce framework. Besides ensuring prediction accuracy and reduction rate, it has two desirable properties: first, it reduces the workload in the aggregation node; second and most important, it produces the same result as the sequential version, which other parallel methods cannot achieve. We evaluate the performance of FCNN-MR on one small data set and two large data sets. The experimental results show that it is effective and practical.
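
The condensation idea that FCNN accelerates can be illustrated with Hart's original condensed nearest neighbour rule (pure Python, 1-NN, toy data); FCNN itself reaches the same kind of consistent subset with far fewer distance computations:

```python
# Hart's condensed nearest neighbour rule: keep only the instances
# that the current condensed subset misclassifies, repeating until the
# subset classifies every training instance correctly (a "consistent"
# subset).
def nearest(point, stored):
    return min(stored,
               key=lambda s: sum((a - b) ** 2 for a, b in zip(s[0], point)))

def condense(data):
    """data: list of (feature_tuple, label) pairs."""
    stored = [data[0]]
    changed = True
    while changed:
        changed = False
        for x, y in data:
            if nearest(x, stored)[1] != y:
                stored.append((x, y))
                changed = True
    return stored
```

The condensed subset is much smaller than the training set yet classifies all of it correctly with 1-NN, which is the reduction-rate versus accuracy trade-off the abstract refers to.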

Keywords: instance selection, data reduction, MapReduce, kNN

Procedia PDF Downloads 253
25364 Sensitivity of Acanthamoeba castellanii-Grown Francisella to Three Different Disinfectants

Authors: M. Knezevic, V. Marecic, M. Ozanic, I. Kelava, M. Mihelcic, M. Santic

Abstract:

Francisella tularensis is a highly infectious, gram-negative intracellular bacterium and the causative agent of tularemia. The bacterium has been isolated from more than 250 wild species, including protozoan cells. Since Francisella is highly virulent and persists in the environment for years, the aim of this study was to investigate whether Acanthamoeba castellanii-grown F. novicida exhibits altered resistance to disinfectants. It has been shown for other intracellular pathogens, including Legionella pneumophila, that bacteria grown in amoebae are more resistant to disinfectants. However, there are no data on the viability of Francisella after an intracellular life cycle in A. castellanii. In this study, bacterial suspensions of A. castellanii-grown or in vitro-grown Francisella were treated with three different disinfectants, and bacterial viability after the disinfection treatment was determined by colony-forming unit (CFU) counting, transmission electron microscopy (TEM), and fluorescence microscopy, as well as by the leakage of intracellular fluid. Our results show that didecyldimethylammonium chloride (DDAC) combined with isopropyl alcohol was the most effective at killing the bacteria; all in vitro-grown and A. castellanii-grown F. novicida were killed after only 10 s. Surprisingly, compared with in vitro-grown bacteria, A. castellanii-grown F. novicida was more sensitive to decontamination by benzalkonium chloride combined with DDAC and formic acid, and by polyhexamethylene biguanide (PHMB). We conclude that the tested disinfectants exhibit antimicrobial activity by causing a loss of structural organization and integrity of the Francisella cell wall and membrane and subsequent leakage of the intracellular contents. Finally, the results of this study clearly demonstrate that Francisella grown in A. castellanii became more susceptible to many disinfectants.

Keywords: Acanthamoeba, disinfectant, Francisella, sensitivity

Procedia PDF Downloads 101
25363 A Design Framework for an Open Market Platform of Enriched Card-Based Transactional Data for Big Data Analytics and Open Banking

Authors: Trevor Toy, Josef Langerman

Abstract:

Around a quarter of the world’s data is generated by the financial sector, with global non-cash transactions estimated at 708.5 billion. With Open Banking still a rapidly developing concept within the financial industry, there is an opportunity to create a secure mechanism for connecting its stakeholders to openly, legitimately, and consensually share the data required to enable it. Integration and sharing of anonymised transactional data still operate in silos, centralised among the large corporate entities in the ecosystem that have the resources to do so. Smaller fintechs generating data and businesses looking to consume data are largely excluded from the process. There is therefore a growing demand for accessible transactional data, both for analytical purposes and to support the rapid global adoption of Open Banking. This research provides a solution framework that aims to offer a secure, decentralised marketplace for 1) data providers to list their transactional data, 2) data consumers to find and access that data, and 3) data subjects (the individuals making the transactions that generate the data) to manage and sell the data that relates to them. The platform also provides an integrated system for downstream transactional-related data from merchants, enriching the data product available to build a comprehensive view of a data subject’s spending habits. A robust and sustainable data market can be developed by providing a more accessible mechanism for data producers to monetise their data investments and by encouraging data subjects to share their data through the same financial incentives. At the centre of the platform is the market mechanism that connects the data providers and their data subjects to the data consumers.
This core component of the platform is developed as a decentralised blockchain contract with a market layer that manages the transaction, user, pricing, payment, tagging, contract, control, and lineage features pertaining to user interactions on the platform. One of the platform’s key features is enabling the participation and management of personal data by the individuals from whom the data is generated. The framework was demonstrated with a proof of concept on the Ethereum blockchain, where an individual can securely manage access to their own personal data and to that individual’s identifiable relationship to the card-based transaction data provided by financial institutions. This gives data consumers access to a complete view of transactional spending behaviour correlated with key demographic information. This platform solution can ultimately support the growth, prosperity, and development of economies, businesses, communities, and individuals by providing accessible and relevant transactional data for big data analytics and open banking.
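The consent-gated purchase flow described above can be sketched in plain Python. This is a hypothetical illustration of the market-layer logic only; the actual platform implements it as an Ethereum smart contract, and all class, method, and participant names here are assumptions.

```python
# Sketch of the marketplace's consent logic: a data subject must grant
# consent before a consumer's purchase of a listing can succeed, and can
# revoke consent at any time. Names and data structures are illustrative.

class DataMarketplace:
    def __init__(self):
        self.listings = {}      # listing_id -> (provider, subject, price)
        self.consents = set()   # (subject, listing_id) pairs with consent
        self.access = set()     # (consumer, listing_id) pairs with access

    def list_data(self, listing_id, provider, subject, price):
        self.listings[listing_id] = (provider, subject, price)

    def grant_consent(self, subject, listing_id):
        if self.listings[listing_id][1] != subject:
            raise PermissionError("only the data subject may consent")
        self.consents.add((subject, listing_id))

    def revoke_consent(self, subject, listing_id):
        self.consents.discard((subject, listing_id))

    def purchase(self, consumer, listing_id):
        provider, subject, price = self.listings[listing_id]
        if (subject, listing_id) not in self.consents:
            raise PermissionError("data subject has not consented")
        self.access.add((consumer, listing_id))
        return price  # would trigger a payment split: provider/subject
```

A purchase attempted before consent fails, which is the financial incentive lever the framework relies on: the subject shares only because they are paid from the same transaction.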

Keywords: big data markets, open banking, blockchain, personal data management

Procedia PDF Downloads 73
25362 Experimental Evaluation of Succinct Ternary Tree

Authors: Dmitriy Kuptsov

Abstract:

Tree data structures, such as binary or, in general, k-ary trees, are essential in computer science. The applications of these data structures range from data search and retrieval to sorting and ranking algorithms. Naive implementations of these data structures can consume prohibitively large volumes of random access memory, limiting their applicability in certain solutions. In these cases, a more advanced representation of these data structures is essential. In this paper, we present the design of a compact version of the ternary tree data structure and report the results of an experimental evaluation using the static dictionary problem. We compare these results with results for binary and regular ternary trees. The evaluation study shows that our design, in the best case, consumes up to 12 times less memory (for the dictionary used in our experimental evaluation) than a regular ternary tree and, in certain configurations, shows performance comparable to regular ternary trees. We evaluated the performance of the algorithms on both 32- and 64-bit operating systems.
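To make the idea of a compact representation concrete, here is a minimal sketch of a succinct *cardinal* ternary tree: each node is encoded as 3 bits marking which children exist, and navigation uses rank queries over the bitvector. This illustrates the general technique, not the authors' exact design; the naive `rank1` stands in for the O(1) rank structures real succinct trees use.

```python
# Succinct cardinal ternary tree sketch: 3 bits per node (BFS order),
# child navigation via rank over the bitvector.

def encode(tree):
    """tree: nested dict {slot: subtree}, slots in {0, 1, 2}. BFS encoding."""
    bits, queue = [], [tree]
    while queue:
        node = queue.pop(0)
        for s in range(3):
            child_node = node.get(s)
            bits.append(1 if child_node is not None else 0)
            if child_node is not None:
                queue.append(child_node)
    return bits

def rank1(bits, pos):
    """Number of 1-bits in bits[0..pos] inclusive (naive; real succinct
    structures answer this in O(1) with o(n) extra bits)."""
    return sum(bits[: pos + 1])

def child(bits, node, slot):
    """BFS index of `node`'s child in `slot`, or None if absent."""
    pos = 3 * node + slot
    return rank1(bits, pos) if bits[pos] else None
```

The tree occupies 3 bits per node plus the rank overhead, versus three machine-word pointers per node in a naive implementation, which is where savings of the order reported above come from.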

Keywords: algorithms, data structures, succinct ternary tree, performance evaluation

Procedia PDF Downloads 160
25361 Predicting Data Center Resource Usage Using Quantile Regression to Conserve Energy While Fulfilling the Service Level Agreement

Authors: Ahmed I. Alutabi, Naghmeh Dezhabad, Sudhakar Ganti

Abstract:

Data centers have been growing continuously in size and demand over the last two decades. Planning for the deployment of resources has been shallow and has always resorted to over-provisioning: data center operators try to maximize the availability of their services by allocating multiples of the needed resources. One resource that has been wasted, with little thought, is energy. In recent years, programmable resource allocation has paved the way for more efficient and robust data centers. In this work, we examine the predictability of resource usage in a data center environment. We use a number of models that cover a wide spectrum of machine learning categories. We then establish a framework to guarantee the client service level agreement (SLA). Our results show that using prediction can cut energy loss by up to 55%.
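The intuition behind quantile-based provisioning can be sketched in a few lines: instead of allocating a fixed multiple of peak demand, predict the next interval's demand as a high quantile of a recent usage window, so the SLA is met with high probability while idle capacity (and hence energy) shrinks. The window size, quantile level, and capacity figure below are illustrative assumptions, not the paper's models.

```python
# Quantile-based provisioning sketch: allocate the q-quantile of recent
# usage rather than a fixed over-provisioned multiple.

def quantile(xs, q):
    """Linear-interpolation sample quantile, 0 <= q <= 1."""
    s = sorted(xs)
    idx = q * (len(s) - 1)
    lo, hi = int(idx), min(int(idx) + 1, len(s) - 1)
    frac = idx - lo
    return s[lo] * (1 - frac) + s[hi] * frac

def provision(usage_history, window=24, q=0.95, capacity=100.0):
    """Allocate the q-quantile of the recent window, capped at capacity."""
    recent = usage_history[-window:]
    return min(quantile(recent, q), capacity)
```

With a mostly flat history containing one spike, the 95th-percentile allocation stays near the typical load instead of the spike, which is exactly the energy saving the abstract quantifies.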

Keywords: machine learning, artificial intelligence, prediction, data center, resource allocation, green computing

Procedia PDF Downloads 108
25360 Prosperous Digital Image Watermarking Approach by Using DCT-DWT

Authors: Prabhakar C. Dhavale, Meenakshi M. Pawar

Abstract:

Every day, tons of data are embedded in digital media or distributed over the internet. The data are so widely distributed that they can easily be replicated without error, putting the rights of their owners at risk. Even when encrypted for distribution, data can easily be decrypted and copied. One way to discourage illegal duplication is to insert information, known as a watermark, into potentially valuable data in such a way that it is impossible to separate the watermark from the data. These challenges have motivated researchers to carry out intense research in the field of watermarking. A watermark is a form, image, or text impressed onto paper that provides evidence of its authenticity; digital watermarking is an extension of the same concept. There are two types of watermarks: visible and invisible. In this project, we have concentrated on embedding a watermark in an image. The main consideration for any watermarking scheme is its robustness to various attacks.
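A simplified sketch of transform-domain watermarking in the spirit of the paper: a one-level Haar DWT (standing in for the DWT stage; the DCT stage is omitted here) with quantization index modulation to hide bits in the low-frequency LL band. The block size, quantization step Q, and the Haar choice are assumptions for illustration, not the authors' DCT-DWT scheme.

```python
# Hide watermark bits in the LL band of a one-level 2D Haar transform.

def haar2d(img):
    """One-level 2D Haar transform (rows, then columns) of an even-sized matrix."""
    rows = [[(r[2*i] + r[2*i+1]) / 2 for i in range(len(r) // 2)] +
            [(r[2*i] - r[2*i+1]) / 2 for i in range(len(r) // 2)] for r in img]
    n, m = len(rows), len(rows[0])
    out = [[0.0] * m for _ in range(n)]
    for c in range(m):
        for i in range(n // 2):
            out[i][c] = (rows[2*i][c] + rows[2*i+1][c]) / 2
            out[i + n//2][c] = (rows[2*i][c] - rows[2*i+1][c]) / 2
    return out

def ihaar2d(coef):
    """Inverse of haar2d (columns, then rows)."""
    n, m = len(coef), len(coef[0])
    tmp = [[0.0] * m for _ in range(n)]
    for c in range(m):
        for i in range(n // 2):
            a, d = coef[i][c], coef[i + n//2][c]
            tmp[2*i][c], tmp[2*i+1][c] = a + d, a - d
    out = [[0.0] * m for _ in range(n)]
    for r in range(n):
        for i in range(m // 2):
            a, d = tmp[r][i], tmp[r][i + m//2]
            out[r][2*i], out[r][2*i+1] = a + d, a - d
    return out

def embed(img, bits, Q=16):
    """Quantization index modulation: snap LL coefficients to Q/4 or 3Q/4 offsets."""
    coef = haar2d(img)
    m2 = len(coef[0]) // 2
    for k, b in enumerate(bits):
        i, j = divmod(k, m2)
        base = Q * (coef[i][j] // Q)
        coef[i][j] = base + (0.75 * Q if b else 0.25 * Q)
    return ihaar2d(coef)

def extract(img, nbits, Q=16):
    """Read the bits back from the fractional part of LL / Q."""
    coef = haar2d(img)
    m2 = len(coef[0]) // 2
    out = []
    for k in range(nbits):
        i, j = divmod(k, m2)
        frac = coef[i][j] / Q - coef[i][j] // Q
        out.append(1 if frac > 0.5 else 0)
    return out
```

Because the hidden bits sit in coarse low-frequency coefficients, moderate pixel-level tampering leaves them recoverable, which is the robustness property the abstract emphasizes.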

Keywords: watermarking, digital, DCT-DWT, security

Procedia PDF Downloads 422
25359 Machine Learning Data Architecture

Authors: Neerav Kumar, Naumaan Nayyar, Sharath Kashyap

Abstract:

Most companies see an increase in the adoption of machine learning (ML) applications across internal and external-facing use cases. ML applications vend output in either batch or real-time patterns. A complete batch ML pipeline architecture comprises data sourcing, feature engineering, model training, model deployment, and model output vending into a data store for downstream applications. Due to unclear role expectations, we have observed that scientists specializing in building and optimizing models invest significant effort into building the other components of the architecture, which we do not believe is the best use of scientists’ bandwidth. We propose a system architecture, created using AWS services, that brings industry best practices to managing the workflow and simplifies the process of model deployment and end-to-end data integration for an ML application. This narrows the scope of scientists’ work to model building and refinement, while specialized data engineers take over deployment, pipeline orchestration, data quality, the data permission system, etc. The pipeline infrastructure is built and deployed as code (using Terraform, CDK, CloudFormation, etc.), which makes it easy to replicate and/or extend the architecture to other models used in an organization.
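The stage separation argued for above can be sketched schematically: each architectural stage is its own function, so the scientist-owned stage (`train`) is isolated from the engineering-owned stages. The function names and the dict-based "data store" are illustrative stand-ins, not an AWS API.

```python
# Schematic batch ML pipeline: sourcing -> features -> training ->
# deployment -> output vending. Only `train` is scientist-owned.

def source_data(store):
    return store["raw"]

def engineer_features(rows):
    return [{"x": r["x"]} for r in rows]

def train(features):
    # Scientist-owned stage; a trivial "model" (mean of x) for illustration.
    return {"weights": sum(f["x"] for f in features) / len(features)}

def deploy(model, store):
    store["model"] = model

def vend(model, features, store):
    store["predictions"] = [model["weights"] * f["x"] for f in features]

def run_batch_pipeline(store):
    rows = source_data(store)
    feats = engineer_features(rows)
    model = train(feats)
    deploy(model, store)
    vend(model, feats, store)
```

In the proposed architecture each of these functions would map to a managed service step defined as infrastructure-as-code, so replacing `train` with a new model leaves the rest of the pipeline untouched.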

Keywords: data pipeline, machine learning, AWS, architecture, batch machine learning

Procedia PDF Downloads 64
25358 To Design an Architectural Model for On-Shore Oil Monitoring Using Wireless Sensor Network System

Authors: Saurabh Shukla, G. N. Pandey

Abstract:

In recent times, oil exploration and monitoring in on-shore areas have gained much importance, considering the fact that oil accounts for 62 percent of India's total imports. Thus, an architectural model such as a wireless sensor network to monitor on-shore deep oil wells is being developed to obtain better estimates of oil prospects. Very few unexplored oil-bearing areas are left today; countries like India do not have large areas and resources for oil, and this holds for most countries, so oil exploration in on-shore areas has become a major problem, further aggravated by the increase in oil prices. The relative simplicity, small size, and affordable cost of wireless sensor nodes permit dense deployment in on-shore places for monitoring oil wells, and deployment of a wireless sensor network over large areas is very cost-effective. The objective of this system is to send real-time oil-monitoring information to the regulatory and welfare authorities so that suitable action can be taken. The system architecture is composed of a sensor network, a processing/transmission unit, and a server. This wireless sensor network system can remotely monitor real-time oil exploration and monitoring conditions in the identified areas. Wireless sensor network systems are wireless, have scarce power, operate in real time, utilize sensors and actuators as interfaces, and have dynamically changing sets of resources; their aggregate behaviour is important, and location is critical. In this system, communication takes place between the server and remotely placed sensors, and the server provides real-time oil exploration and monitoring conditions to the welfare authorities.
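Since sensor nodes are power-scarce, the node-to-server data path typically uses a compact fixed binary frame. The sketch below shows one such frame for an oil-well reading; the field layout (`node id, pressure, temperature, flow` as a 14-byte frame) is an assumption for illustration, and a real deployment would follow the radio stack's framing.

```python
# Pack/unpack a sensor reading into a compact little-endian binary frame
# that the processing/transmission unit forwards to the server.
import struct

FRAME = "<Hfff"  # node id (uint16), pressure, temperature, flow (float32)

def pack_reading(node_id, pressure, temperature, flow):
    return struct.pack(FRAME, node_id, pressure, temperature, flow)

def unpack_reading(frame):
    node_id, pressure, temperature, flow = struct.unpack(FRAME, frame)
    return {"node": node_id, "pressure": pressure,
            "temperature": temperature, "flow": flow}
```

At 14 bytes per reading (versus a verbose text payload), the frame keeps radio airtime, and hence node energy consumption, low.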

Keywords: sensor, wireless sensor network, oil, on-shore level

Procedia PDF Downloads 446
25357 Experimental Investigation of Mechanical Friction Influence in Semi-Hydraulic Clutch Actuation System Over Mileage

Authors: Abdul Azarrudin M. A., Pothiraj K., Kandasamy Satish

Abstract:

In the current automobile scenario, there is a demand for more sophistication and a more comfortable drive feel in the passenger segment. Clutch pedal effort is one such customer touch point in manual transmission vehicles, where the driver operates the clutch pedal continuously throughout driving maneuvers. Hence, optimum pedal effort must be ensured both in the green condition and over mileage for fatigue-free driving. Friction is one of the predominant factors, and it tends to undermine this target by causing functional degradation. One such semi-hydraulic system shows a load efficiency of about 70-75% over its lifetime solely due to the increase in friction, which leads to increased pedal effort and causes fatigue to the vehicle driver. This work deals with the study of friction at different interfaces and its influence on the fulcrum points over mileage, with the objective of understanding the trend over mileage and determining alternative ways of resolving it. One approach is the reduction of friction through experimental investigation of various friction-reducing interfaces, such as the metal-to-metal interface, which has been tried out and is detailed further. Specific attention has been paid to the fulcrum load and its contact interfaces throughout this study. The main results of the experimental data with the influence of three different contact interfaces are presented, with the ultimate aim of achieving less fatigue and a longer, more consistent pedal effort, thus smoothing operation for the end user. Experimental validation has been carried out through a rig-level test setup to depict performance under static conditions, and, in parallel, vehicle-level tests have been performed to record any additional influences.

Keywords: automobile, clutch, friction, fork

Procedia PDF Downloads 124
25356 Performance Evaluation of DSR and OLSR Routing Protocols in MANET Using Varying Pause Time

Authors: Yassine Meraihi, Dalila Acheli, Rabah Meraihi

Abstract:

MANET (Mobile Ad hoc NETwork) is a collection of wireless mobile nodes that communicate with each other without using any existing infrastructure, access point, or centralized administration. Due to high mobility and limited radio transmission range, routing is an important issue in ad hoc networks; in order to quickly establish a reliable and efficient route between two communicating nodes, an appropriate routing protocol is needed. In this paper, we present a performance analysis of two mobile ad hoc network routing protocols, namely DSR and OLSR, using NS2.34. Performance is determined on the basis of packet delivery ratio, throughput, average jitter, and end-to-end delay with varying pause time.
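The four reported metrics can be computed from per-packet send/receive timestamps, as one would extract them from an NS2 trace file. The event format below (dicts of packet id to timestamp) is an illustrative simplification of the trace-parsing step.

```python
# Compute packet delivery ratio, throughput, average end-to-end delay,
# and average jitter from per-packet send/receive timestamps.

def metrics(sent, received):
    """sent/received: dicts packet_id -> time in seconds."""
    delays = [received[p] - sent[p] for p in received]
    pdr = len(received) / len(sent)
    duration = max(received.values()) - min(sent.values())
    throughput = len(received) / duration          # packets per second
    avg_delay = sum(delays) / len(delays)
    jitter = (sum(abs(delays[i] - delays[i - 1])
                  for i in range(1, len(delays)))
              / max(len(delays) - 1, 1))           # mean delay variation
    return pdr, throughput, avg_delay, jitter
```

Varying the pause time of the mobility model changes which packets make it into `received`, which is exactly how the comparison between DSR and OLSR is driven.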

Keywords: DSR, OLSR, quality of service, routing protocols, MANET

Procedia PDF Downloads 552
25355 A Cooperative, Autonomous, and Continuously Operating Drone System Offered to Railway and Bridge Industry: The Business Model Behind

Authors: Paolo Guzzini, Emad Samuel M. Ebeid

Abstract:

Bridges and railways are critical infrastructures. Ensuring the safety of transport using such assets is a primary goal, as it directly impacts people's lives. However, improving safety may require increased investment in O&M, so optimizing resource usage for asset maintenance becomes crucial. Drones4Safety (D4S), a European project funded under the H2020 Research and Innovation Action (RIA) program, aims to increase the safety of European civil transport by building a system that relies on three main pillars: • drones operating autonomously in swarm mode; • drones able to recharge themselves using the inductive fields produced by transmission lines near the bridge and railway assets to be inspected; • acquired data analyzed with AI-empowered algorithms for defect detection. This paper describes the business model behind this disruptive project. The business model is structured in two parts: • the first part focuses on the design of the business model canvas, to explain the value provided by the Drones4Safety project; • the second part defines a detailed financial analysis, with the target of calculating the IRR (internal rate of return) and the NPV (net present value) of the investment over a 7-year plan (2 years to run the project + 5 years post-implementation).
For the financial analysis, two different points of view are assumed: • the point of view of the Drones4Safety company in charge of designing, producing, and selling the new system; • the point of view of the utility company that will adopt the new system in its O&M practices. From the point of view of the Drones4Safety company, three scenarios were considered: • selling the drones > revenues produced by drone sales; • renting the drones > revenues produced by drone rental (with a time-based model); • selling the data acquisition service > revenues produced by the sale of pictures acquired by drones. From the point of view of a utility adopting the D4S system, a fourth scenario was analyzed, taking into account the cost reductions arising from the change in operation and maintenance practices. The paper shows, for both companies, which parameters most affect the business model and which scenarios are sustainable.
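The financial core of such an analysis, NPV over the plan horizon and IRR as the discount rate at which NPV crosses zero, can be sketched as below. The cash-flow figures in the usage example are placeholders, not the project's actual numbers, and the bisection bounds are assumptions.

```python
# Net present value over a stream of yearly cash flows, and internal rate
# of return found by bisection on the NPV function.

def npv(rate, cashflows):
    """cashflows[0] at t=0 (e.g. the build phase), then one per year."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-9):
    """Bisection on npv; assumes a single sign change between lo and hi."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(lo, cashflows) * npv(mid, cashflows) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2
```

For instance, an investment of 100 returning 110 after one year has an IRR of exactly 10%, and a positive NPV at any discount rate below that.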

Keywords: a swarm of drones, AI, bridges, railways, drones4safety company, utility companies

Procedia PDF Downloads 141
25354 A Comparison of Image Data Representations for Local Stereo Matching

Authors: André Smith, Amr Abdel-Dayem

Abstract:

The stereo matching problem, while having been present for several decades, continues to be an active area of research. The goal of this research is to find correspondences between elements found in a set of stereoscopic images. With these pairings, it is possible to infer the distance of objects within a scene relative to the observer. Advancements in this field have led to experimentation with various techniques, from graph-cut energy minimization to artificial neural networks. At the basis of these techniques is a cost function, which is used to evaluate the likelihood of a particular match between points in each image. While, at its core, the cost is based on comparing image pixel data, there is a general lack of consistency as to which image data representation to use. This paper presents an experimental analysis comparing the effectiveness of the more common image data representations. The goal is to determine how effectively each data representation reduces the cost of the correct correspondence relative to other possible matches.
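The experiment's core can be sketched as a windowed sum-of-absolute-differences (SAD) cost, parameterised by the pixel representation (grayscale scalars versus per-channel colour tuples), with a winner-takes-all disparity pick. This is a generic local-matching sketch, not the authors' exact evaluation code.

```python
# Windowed SAD cost over a scanline; pixels may be scalars (grayscale)
# or tuples (e.g. RGB), which is the representation choice under study.

def sad(p, q):
    """Cost between two pixels; sums per-channel differences for tuples."""
    if isinstance(p, tuple):
        return sum(abs(a - b) for a, b in zip(p, q))
    return abs(p - q)

def best_disparity(left_row, right_row, x, max_disp, win=1):
    """Disparity minimising windowed SAD for pixel x of the left scanline."""
    best, best_cost = 0, float("inf")
    for d in range(max_disp + 1):
        if x - d - win < 0 or x + win >= len(right_row):
            continue
        cost = sum(sad(left_row[x + k], right_row[x - d + k])
                   for k in range(-win, win + 1))
        if cost < best_cost:
            best, best_cost = d, cost
    return best
```

Comparing how sharply the correct disparity's cost beats the runners-up under grayscale versus colour inputs is precisely the kind of measurement the paper reports.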

Keywords: colour data, local stereo matching, stereo correspondence, disparity map

Procedia PDF Downloads 370
25353 Study of a Fabry-Perot Resonator

Authors: F. Hadjaj, A. Belghachi, A. Halmaoui, M. Belhadj, H. Mazouz

Abstract:

A laser is essentially an optical oscillator consisting of a resonant cavity, an amplifying medium, and a pumping source. In semiconductor diode lasers, the cavity is created by the boundary between the cleaved face of the semiconductor crystal and air, which has reflective properties as a result of the differing refractive indices of the two media. For a GaAs-air interface, a reflectance of 0.3 is typical, and the length of the semiconductor junction therefore forms the resonant cavity. To prevent light from being emitted in unwanted directions from the junction, the sides perpendicular to the required direction are roughened. The objective of this work is to simulate the Fabry-Perot optical resonator and explore its main characteristics, such as the free spectral range (FSR), finesse, linewidth, and transmission, which describe the performance of the resonator.
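The quantities the simulation explores follow from the standard lossless Fabry-Perot relations: FSR = c/(2nL), finesse F = π√R/(1−R), linewidth = FSR/F, and the Airy transmission T = (1−R)²/((1−R)² + 4R sin²(φ/2)). A minimal sketch, with the GaAs-style parameter values in the test chosen purely for illustration:

```python
# Standard lossless Fabry-Perot relations: free spectral range, finesse,
# linewidth, and the Airy transmission as a function of round-trip phase.
import math

def fsr(n, L, c=3.0e8):
    """Free spectral range in Hz for refractive index n, cavity length L (m)."""
    return c / (2 * n * L)

def finesse(R):
    """Finesse of a symmetric cavity with mirror reflectance R."""
    return math.pi * math.sqrt(R) / (1 - R)

def linewidth(n, L, R):
    """FWHM linewidth in Hz: free spectral range divided by finesse."""
    return fsr(n, L) / finesse(R)

def transmission(phi, R):
    """Airy transmission vs. round-trip phase phi (lossless mirrors)."""
    return (1 - R) ** 2 / ((1 - R) ** 2 + 4 * R * math.sin(phi / 2) ** 2)
```

Note that with R = 0.3, as for the cleaved GaAs facet above, the finesse is only about 2.5, so diode-laser cavities have broad, shallow resonances compared to high-reflectivity etalons.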

Keywords: Fabry-Perot resonator, laser diode, reflectance, semiconductor

Procedia PDF Downloads 352
25352 Morphological Process of Villi Detachment Assessed by Computer-Assisted 3D Reconstruction of Intestinal Crypt from Serial Ultrathin Sections of Rat Duodenum Mucosa

Authors: Lise P. Labéjof, Ivna Mororó, Raquel G. Bastos, Maria Isabel G. Severo, Arno H. de Oliveira

Abstract:

This work presents an alternative mode of intestinal mucosa renewal that may help to explain the total loss of villi after irradiation. A morphological method of 3D reconstruction using micrographs of serial sections of rat duodenum was tested. We used hundreds of sections of each duodenum specimen, placed on glass slides and examined under a light microscope. Those containing the detachment, approximately a dozen, were chosen for observation under a transmission electron microscope (TEM). Each of these sections was glued onto a block of Epon resin and recut into about a hundred 60 nm-thick sections. Ribbons of these ultrathin sections were distributed on a series of copper grids in the same order as they were produced during microtomy. They were then stained with solutions of uranyl and lead salts and observed under a TEM. The sections were photographed, and the electron micrographs showing signs of cell detachment were transferred into two software packages: ImageJ, to align the cellular structures, and Reconstruct, to perform the 3D reconstruction. Epithelial cells exhibiting all the signs of programmed cell death were detected at the villus-crypt junction. Their nuclei were irregular in shape, with chromatin condensed in clumps. Their cytoplasm was darker than that of neighboring cells and contained many swollen mitochondria. In some places in the sections, we could see intercellular spaces enlarged by the presence of shrunken cells, which displayed irregularly shaped plasma membranes, as if the cell interdigitations were drawing away from each other. The three-dimensional reconstruction of the crypts allowed us to observe a gradual loss of intercellular contacts between crypt cells in the longitudinal plane of the duodenal mucosa. In the transverse direction, there was a gradual increase in the intercellular space, as if these cells were moving away from one another.
This observation suggests that the gradual drawing apart of the cells at the villus-crypt junction is the beginning of mucosa detachment. Thus, the shrinking of cells due to apoptosis is the way in which they detach from the mucosa, and progressively the villi do also. These results are in agreement with our initial hypothesis and demonstrate that the villi become detached from the mucosa at the villus-crypt junction by the programmed cell death process. This type of loss of the entire villus helps explain the rapid denudation of the intestinal mucosa in the case of irradiation.

Keywords: 3D reconstruction, transmission electron microscopy, ionizing radiations, rat small intestine, apoptosis

Procedia PDF Downloads 378
25351 Business-Intelligence Mining of Large Decentralized Multimedia Datasets with a Distributed Multi-Agent System

Authors: Karima Qayumi, Alex Norta

Abstract:

The rapid generation of high-volume and broadly varied data from the application of new technologies poses challenges for the generation of business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods for the purpose of developing their business. The recently decentralized data-management environment therefore relies on a distributed computing paradigm. While data are stored in highly distributed systems, the implementation of distributed data-mining techniques is a challenge. The aim of these techniques is to gather knowledge from every domain and from all the datasets stemming from distributed resources. As agent technologies offer significant contributions to managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets.
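The distributed data-mining pattern described above can be illustrated with a toy sketch: each agent mines only its local partition (here, simple item counts) and ships a small local model to a coordinator that merges them, so raw data never leaves its source. The agent/coordinator names and the Counter-based "model" are assumptions for illustration, not the paper's AOM artifact.

```python
# Toy distributed data mining: local mining per agent, global merge at a
# coordinator; only local models (counts) cross node boundaries.
from collections import Counter

class MinerAgent:
    def __init__(self, local_data):
        self.local_data = local_data      # stays on this node

    def mine(self):
        return Counter(self.local_data)   # ship a local model, not raw data

def coordinate(agents):
    merged = Counter()
    for agent in agents:
        merged += agent.mine()
    return merged
```

The same merge step works for richer local models (frequent itemsets, partial classifiers), which is what makes the agent decomposition attractive for business-intelligence mining over decentralized multimedia datasets.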

Keywords: agent-oriented modeling (AOM), business intelligence model (BIM), distributed data mining (DDM), multi-agent system (MAS)

Procedia PDF Downloads 432
25350 Timing and Noise Data Mining Algorithm and Software Tool in Very Large Scale Integration (VLSI) Design

Authors: Qing K. Zhu

Abstract:

Very Large Scale Integration (VLSI) design has become very complex due to the continuous integration of millions of gates on one chip, following Moore’s law. Designers encounter numerous report files during design iterations using timing and noise analysis tools. This paper presents our work using data mining techniques, combined with HTML tables, to extract and represent critical timing/noise data. When applying this data-mining tool in real applications, running speed is important; the software employs table look-up techniques to achieve reasonable running speed, based on performance testing results. We added several advanced features for the application in one industrial chip design.
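The extraction step can be sketched with the standard-library HTML parser: pull rows out of a report rendered as an HTML table. The report format below (a simple `<table>` of timing paths with slack values) is an assumption for illustration; the paper's tool additionally applies table look-up techniques for speed.

```python
# Extract rows from an HTML timing-report table using the standard library.
from html.parser import HTMLParser

class TimingTableParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_cell = [], None, False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell:
            self._row.append(data.strip())

def parse_report(html):
    """Return the table rows (header row first) from an HTML report."""
    parser = TimingTableParser()
    parser.feed(html)
    return parser.rows
```

Once rows are in memory, mining critical paths (e.g. filtering on negative slack) is a simple scan, and a dictionary keyed by path name gives the constant-time look-ups the abstract mentions.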

Keywords: VLSI design, data mining, big data, HTML forms, web, EDA, timing, noise

Procedia PDF Downloads 254