Search results for: 99.95% IoT data transmission savings
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26845

25555 New Analytical Current-Voltage Model for GaN-based Resonant Tunneling Diodes

Authors: Zhuang Guo

Abstract:

In simulations of GaN-based resonant tunneling diodes (RTDs), the traditional Tsu-Esaki formalism fails to predict the peak currents and peak voltages of the simulated current-voltage (J-V) characteristics. The main reason is that, due to the strong internal polarization fields, a two-dimensional electron gas (2DEG) accumulates at the emitter, producing 2D-2D resonant tunneling currents that dominate the total J-V characteristics. Because it is based on a 3D-2D resonant tunneling mechanism, the traditional Tsu-Esaki formalism cannot predict these characteristics correctly. To overcome this shortcoming, we develop a new analytical model for the 2D-2D resonant tunneling currents generated in GaN-based RTDs. Compared with the Tsu-Esaki formalism, the new model makes the following modifications: first, invoking the Heisenberg uncertainty principle, it corrects the expression for the density of states around the 2DEG eigenenergy levels at the emitter, so that it can predict the half width at half maximum (HWHM) of the resonant tunneling currents; second, by accounting for the effect of bias on the wave vectors at the collector, it modifies the expression for the transmission coefficients, bringing the predicted peak currents closer to experimental data than the Tsu-Esaki formalism. The new analytical model successfully predicts the J-V characteristics of GaN-based RTDs and reveals more detailed mechanisms of the resonant tunneling occurring in these devices, which helps in designing and fabricating high-performance GaN RTDs.
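The link between level broadening and the HWHM of the resonant current can be illustrated with a Lorentzian lineshape, a common model for a broadened density of states around an eigenenergy. The sketch below is purely illustrative and not the paper's model; the eigenenergy E0 and broadening gamma are invented values:

```python
import math

def lorentzian(E, E0, gamma):
    """Lorentzian lineshape modelling level broadening (arbitrary units)."""
    return gamma / ((E - E0) ** 2 + gamma ** 2)

E0, gamma = 0.35, 0.02  # hypothetical eigenenergy and broadening (eV)
peak = lorentzian(E0, E0, gamma)

# scan upward from E0 for the energy where the lineshape falls to half its peak
E = E0
while lorentzian(E, E0, gamma) > peak / 2:
    E += 1e-5
hwhm = E - E0
print(round(hwhm, 4))  # ≈ gamma: the HWHM equals the broadening parameter
```

For a Lorentzian, the HWHM recovered numerically equals the broadening parameter itself, which is why correcting the density-of-states expression lets a model reproduce the measured current linewidth.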

Keywords: GaN-based resonant tunneling diodes, Tsu-Esaki formalism, 2D-2D resonant tunneling, Heisenberg uncertainty

Procedia PDF Downloads 76
25554 Extreme Temperature Forecast in Mbonge, Cameroon Through Return Level Analysis of the Generalized Extreme Value (GEV) Distribution

Authors: Nkongho Ayuketang Arreyndip, Ebobenow Joseph

Abstract:

In this paper, temperature extremes are forecast by applying the block maxima method of the generalized extreme value (GEV) distribution to temperature data from the Cameroon Development Corporation (CDC). Considering two data sets (raw and simulated) and two models of the GEV distribution (stationary and non-stationary), a return level analysis is carried out. In the stationary model, the return levels are constant over time for the raw data, while for the simulated data they show an increasing trend with an upper bound. In the non-stationary model, the return levels of both the raw and simulated data show an increasing trend with an upper bound. This clearly shows that although temperatures in the tropics are expected to increase in the future, there is a maximum temperature that will not be exceeded. The results of this paper are vital for agricultural and environmental research.
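The "increasing trend with an upper bound" behaviour follows directly from the GEV return level formula when the shape parameter is negative (Weibull type). A minimal sketch, with invented parameter values standing in for a fitted model:

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """Return level z_T exceeded on average once every T blocks (xi != 0):
    z_T = mu + (sigma/xi) * [(-ln(1 - 1/T))^(-xi) - 1]."""
    y = -math.log(1.0 - 1.0 / T)      # reduced variate
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)

# hypothetical GEV parameters for annual maximum temperature (deg C)
mu, sigma, xi = 33.0, 1.2, -0.15      # xi < 0: Weibull type, bounded above
levels = [gev_return_level(mu, sigma, xi, T) for T in (10, 50, 100)]
upper_bound = mu - sigma / xi          # finite upper end-point when xi < 0
print([round(z, 2) for z in levels], round(upper_bound, 2))
```

With xi < 0, the 10-, 50- and 100-year return levels increase with the return period but remain below the finite upper end-point mu - sigma/xi, matching the qualitative behaviour described in the abstract.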

Keywords: forecasting, generalized extreme value (GEV), meteorology, return level

Procedia PDF Downloads 478
25553 Impact of Stack Caches: Locality Awareness and Cost Effectiveness

Authors: Abdulrahman K. Alshegaifi, Chun-Hsi Huang

Abstract:

Treating data according to its location in memory has received much attention in recent years because the different properties of stack and non-stack data offer important opportunities for cache utilization. Stack data and non-stack data may interfere with each other's locality in the data cache, and an important property of stack data is its high spatial and temporal locality. In this work, we simulate a non-unified cache design that splits the data cache into a stack cache and a non-stack cache, keeping the two kinds of data separate. We observe that the overall hit rate of the non-unified design is sensitive to the size of the non-stack cache. We then investigate the appropriate size and associativity for the stack cache needed to achieve a high hit ratio, given that over 99% of stack accesses are directed to it. The results show that, on average, more than 99% stack cache accuracy is achieved with 2KB of capacity and 1-way associativity. Further, we analyze the improvement in hit rate when a small, fixed-size stack cache is added at level 1 to a unified cache architecture. The results show that adding a 1KB stack cache improves the overall hit rate of the unified design by approximately 3.9% on average for the Rijndael benchmark. The stack cache is simulated using the SimpleScalar toolset.
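The intuition behind the 2KB/1-way result is that stack accesses concentrate on a few cache lines near the stack top, so even a tiny direct-mapped cache captures almost all of them. The toy simulator below is not SimpleScalar; the trace and parameters are invented to illustrate the effect:

```python
def simulate_cache(addresses, size_bytes=2048, line_bytes=32, ways=1):
    """Simulate a small direct-mapped (1-way) cache and return its hit rate."""
    n_sets = size_bytes // (line_bytes * ways)
    sets = [None] * n_sets            # one tag per set in a 1-way cache
    hits = 0
    for addr in addresses:
        line = addr // line_bytes
        idx, tag = line % n_sets, line // n_sets
        if sets[idx] == tag:
            hits += 1
        else:
            sets[idx] = tag           # fill on miss
    return hits / len(addresses)

# stack accesses are highly local: repeated push/pop near the stack top
stack_trace = [0x7FF00 + (i % 64) * 4 for i in range(10_000)]
print(simulate_cache(stack_trace))    # near 1.0 after compulsory misses
```

Only the first touches of each line miss; everything afterwards hits, which is why a 2KB, 1-way stack cache suffices in the study.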

Keywords: hit rate, locality of program, stack cache, stack data

Procedia PDF Downloads 303
25552 Assessing Transition to Renewable Energy for Transportation in Indonesia through Drop-in Biofuel Utilization

Authors: Maslan Lamria, Ralph E. H. Sims, Tatang H. Soerawidjaja

Abstract:

To increase its self-sufficiency in transportation fuel, Indonesia is currently developing commercial production and use of drop-in biofuel (DBF) from vegetable oil. To maximize the likelihood of success, it is necessary to understand how the implementation would develop and which factors matter most. This study assessed the dynamics of the transition from the existing fossil fuel system to a renewable fuel system, including the transition from existing biodiesel to the projected DBF. A system dynamics approach was applied, and a model was developed to simulate the dynamics of the liquid biofuel transition. The use of palm oil feedstock was taken as a case study to assess the projected DBF implementation by 2045. The model indicators include liquid fuel self-sufficiency, liquid biofuel share, foreign exchange savings, and greenhouse gas emissions reduction. The model outputs showed that support for DBF investment and use plays an important role in the transition progress. Under assumptions that include applying the maximum level of support over time, liquid fuel self-sufficiency would still not be achieved, with palm-based biofuel contributing only 0.2; thus, other feedstocks such as algae and oil crops grown on marginal lands need to be developed in synergy. Regarding support for DBF use, this study recommends that removing the fossil fuel subsidy would be necessary before a carbon tax policy can be applied effectively.

Keywords: biofuel, drop-in biofuel, energy transition, liquid fuel

Procedia PDF Downloads 145
25551 An Efficient Algorithm for Solving the Transmission Network Expansion Planning Problem Integrating Machine Learning with Mathematical Decomposition

Authors: Pablo Oteiza, Ricardo Alvarez, Mehrdad Pirnia, Fuat Can

Abstract:

To effectively combat climate change, many countries around the world have committed to decarbonising their electricity supply and to promoting large-scale integration of renewable energy sources (RES). While this trend represents a unique opportunity, achieving a sound and cost-efficient energy transition towards low-carbon power systems poses significant challenges for the multi-year Transmission Network Expansion Planning (TNEP) problem. The objective of the multi-year TNEP is to determine the network infrastructure needed to supply the projected demand in a cost-efficient way, considering the evolution of the generation mix, including the integration of RES. The rapid integration of large-scale RES increases the variability and uncertainty of power system operation, which in turn increases short-term flexibility requirements. To meet these requirements, flexible generating technologies such as energy storage systems must also be considered within the TNEP, along with proper models for capturing the operational challenges of future power systems. As a consequence, TNEP formulations are becoming more complex and difficult to solve, especially when applied to realistically sized power system models. To meet these challenges, there is an increasing need for efficient algorithms capable of solving the TNEP problem with reasonable computational time and resources. In this regard, a promising research area is the use of artificial intelligence (AI) techniques for solving large-scale mixed-integer optimization problems such as the TNEP; in particular, combining AI with decomposition-based mathematical optimization has shown great potential. In this context, this paper presents an efficient algorithm for solving the multi-year TNEP problem that combines AI techniques with Column Generation, a traditional decomposition-based mathematical optimization method.
One challenge of using Column Generation for the TNEP problem is that the subproblems are of mixed-integer nature, so solving them requires significant time and resources. Hence, in this proposal we solve a linearly relaxed version of the subproblems and train a binary classifier that determines the values of the binary variables based on the results of the linearized version. A key feature of the proposal is that the binary classifier is integrated into the optimization algorithm in such a way that the optimality of the solution can be guaranteed. The results of a case study based on the HRP 38-bus test system show that the binary classifier has an accuracy above 97% in estimating the values of the binary variables. Since the linearly relaxed subproblems can be solved in significantly less time than their integer programming counterparts, integrating the binary classifier into the Column Generation algorithm allowed us to reduce the computational time required to solve the problem by 50%. The final version of this paper will contain a detailed description of the proposed algorithm, the AI-based binary classifier and its integration into the CG algorithm. To demonstrate the capabilities of the proposal, we evaluate the algorithm in case studies with different scenarios, as well as in other power system models.
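The step of predicting subproblem binary values from features of the relaxed solution can be sketched with a toy perceptron. The features (LP relaxation value plus a reduced-cost-like proxy), the synthetic data, and the classifier itself are invented for illustration; the paper's actual classifier and feature set are not specified in the abstract:

```python
import random

random.seed(7)

def train_perceptron(samples, epochs=20, lr=0.1):
    """Fit w.x + b > 0 => variable rounds to 1 (plain perceptron rule)."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in samples:
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = y - pred
            w = [w[0] + lr * err * x[0], w[1] + lr * err * x[1]]
            b += lr * err
    return w, b

# synthetic training set: (LP relaxation value, reduced-cost proxy) -> MIP value
data = []
for _ in range(600):
    frac = random.random()
    if 0.4 < frac < 0.6:
        continue                      # keep a margin so the toy data is separable
    proxy = frac + random.uniform(-0.1, 0.1)
    data.append(([frac, proxy], 1 if frac > 0.5 else 0))

w, b = train_perceptron(data)
correct = sum((1 if w[0]*x[0] + w[1]*x[1] + b > 0 else 0) == y for x, y in data)
print(correct / len(data))            # training accuracy of the rounding rule
```

The point of the design, as the abstract notes, is that such a predictor only replaces the expensive integer solve; embedding it so that optimality is still guaranteed requires verifying (or repairing) the predicted values inside the Column Generation loop.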

Keywords: integer optimization, machine learning, mathematical decomposition, transmission planning

Procedia PDF Downloads 85
25550 Autonomic Threat Avoidance and Self-Healing in Database Management System

Authors: Wajahat Munir, Muhammad Haseeb, Adeel Anjum, Basit Raza, Ahmad Kamran Malik

Abstract:

Databases are key components of software systems. Due to the exponential growth of data, ensuring that data remain accurate and available is a central concern. The data in databases are vulnerable to internal and external threats, especially in applications containing sensitive data, such as medical or military systems. Whenever data are changed with malicious intent, analysis of those data may lead to disastrous decisions. Autonomic self-healing in computer systems is modeled on the autonomic nervous system of the human body. To guarantee the accuracy and availability of data, we propose a technique that, on a priority basis, tries to prevent any malicious transaction from executing; if a malicious transaction does affect the system, the technique heals the system in an isolated mode so that system availability is not compromised. Using this autonomic approach, the management cost and time of DBAs can be minimized. Finally, we test our model and present the findings.

Keywords: autonomic computing, self-healing, threat avoidance, security

Procedia PDF Downloads 504
25549 Information Extraction Based on Search Engine Results

Authors: Mohammed R. Elkobaisi, Abdelsalam Maatuk

Abstract:

Search engines are large-scale information retrieval tools for the Web that are currently freely available to all. This paper explains how to convert the raw result counts returned by search engines into useful information, representing a new method of data gathering compared with traditional approaches. Submitting queries for many keywords one at a time takes considerable time and effort, so we developed a user interface program that searches automatically, taking multiple keywords at once and collecting the desired data without supervision. The collected raw data are then processed using mathematical and statistical methods to eliminate unwanted values and convert the remainder into usable data.
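The collect-then-filter pipeline might look like the sketch below. The search function here is a stand-in with a hard-coded table (the paper's actual engine interface is not described), and the outlier rule is a simple z-score cut, one plausible choice among the "mathematical and statistical theories" the abstract mentions:

```python
import statistics

def mock_search_count(query):
    """Stand-in for a search-engine API call that returns a raw hit count."""
    fake_index = {"iot": 120_000, "lorawan": 8_000, "zigbee": 9_500,
                  "sigfox": 7_200, "spam-term": 9_000_000}
    return fake_index.get(query.lower(), 0)

def collect_counts(keywords, z_cutoff=1.5):
    """Query every keyword, then drop counts more than z_cutoff sigmas out."""
    raw = {k: mock_search_count(k) for k in keywords}
    mean = statistics.mean(raw.values())
    sd = statistics.pstdev(raw.values()) or 1.0
    return {k: c for k, c in raw.items() if abs(c - mean) / sd <= z_cutoff}

cleaned = collect_counts(["IoT", "LoRaWAN", "Zigbee", "Sigfox", "spam-term"])
print(sorted(cleaned))                # the implausible count has been removed
```

In a real deployment the mock function would be replaced by an HTTP call to the chosen engine, and the filter threshold tuned to the distribution of counts observed.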

Keywords: search engines, information extraction, agent system

Procedia PDF Downloads 430
25548 A Horn Antenna Loaded with FSS of Crossed Dipoles

Authors: Ibrahim Mostafa El-Mongy, Abdelmegid Allam

Abstract:

This article presents an analysis and investigation of the effect of loading a horn antenna with a finite-size frequency selective surface (FSS) of crossed dipoles. The FSS is fabricated on Rogers RO4350 (lossy) with a relative permittivity of 3.33, a thickness of 1.524 mm, and a loss tangent of 0.004. It is applied for filtering and for minimizing interference and noise in the desired band. The filtering is carried out using a finite FSS of crossed dipoles with overall dimensions of 98x58 mm², and is demonstrated by the reduction of the transmission bandwidth from 4 GHz (8-12 GHz) to 0.25 GHz (10.75-11 GHz). The structure is simulated using CST MWS and measured using a network analyzer, with good agreement between the simulated and measured results.

Keywords: antenna, filtenna, frequency selective surface (FSS), horn

Procedia PDF Downloads 458
25547 Epidemiology of Hepatitis B and Hepatitis C Viruses Among Pregnant Women at Queen Elizabeth Central Hospital, Malawi

Authors: Charles Bijjah Nkhata, Memory Nekati Mvula, Milton Masautso Kalongonda, Martha Masamba, Isaac Thom Shawa

Abstract:

Viral hepatitis is a serious public health concern globally, with an estimated 1.4 million deaths annually due to liver fibrosis, cirrhosis, and hepatocellular carcinoma. Hepatitis B and C are the most common viruses that cause liver damage; however, the majority of infected individuals are unaware of their serostatus. Viral hepatitis has contributed to maternal and neonatal morbidity and mortality, and there are no updated data on the epidemiology of hepatitis B and C among pregnant women in Malawi. The aim of this study was to assess the epidemiology of hepatitis B and C viruses among pregnant women at Queen Elizabeth Central Hospital (QECH). Specific objectives:
• To determine the sero-prevalence of HBsAg and anti-HCV in pregnant women at QECH.
• To investigate risk factors associated with HBV and HCV infection in pregnant women.
• To determine the distribution of HBsAg and anti-HCV infection among pregnant women of different age groups.
A descriptive cross-sectional study was conducted among pregnant women at QECH in the last quarter of 2021. Of the 114 pregnant women, 96 consented and were enrolled using a convenience sampling technique; 12 participants dropped out for various reasons, so 84 completed the study. A semi-structured questionnaire was used to collect socio-demographic and behavioural characteristics to assess the risk of exposure. Serum was processed from venous blood samples and tested for HBsAg and anti-HCV markers using rapid screening assays for screening and enzyme-linked immunosorbent assay for confirmation. Of the 84 participants, 1.2% (n=1/84) tested positive for HBsAg and none had detectable anti-HCV antibodies. There was no significant association between HBV or HCV status and any of the socio-demographic characteristics or putative risk factors. The findings indicate a viral hepatitis prevalence lower than the range set by the WHO, suggesting that HBV and HCV are rare among pregnant women at QECH.
Nevertheless, accessible screening should be provided for all pregnant women; preventing mother-to-child transmission is key to reducing the global burden of chronic viral hepatitis.
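A point estimate of 1/84 carries wide uncertainty at this sample size; a Wilson score interval makes that explicit. The sketch below is illustrative and not part of the study's reported analysis:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

lo, hi = wilson_interval(1, 84)    # one HBsAg-positive of 84 participants
print(round(lo * 100, 2), round(hi * 100, 2))  # interval in percent
```

The interval spans well above and below 1.2%, which is why conclusions about rarity from a single positive in 84 participants should be drawn cautiously.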

Keywords: viral hepatitis, hepatitis B, hepatitis C, pregnancy, malawi, liver disease, mother to child transmission

Procedia PDF Downloads 169
25546 Implementation and Performance Analysis of Data Encryption Standard and RSA Algorithm with Image Steganography and Audio Steganography

Authors: S. C. Sharma, Ankit Gambhir, Rajeev Arya

Abstract:

In today’s era, data security is an important and demanding concern because it is essential for people using online banking, e-shopping, reservations, etc. The two major techniques used for secure communication are cryptography and steganography. Cryptographic algorithms scramble data so that an intruder cannot recover them, whereas steganography hides the data in a cover file so that the very presence of communication is concealed. This paper presents implementations of the Rivest-Shamir-Adleman (RSA) algorithm with image and audio steganography and of the Data Encryption Standard (DES) algorithm with image and audio steganography. Both algorithms were implemented in MATLAB, and the combined techniques are observed to perform better than the individual techniques, alleviating the risk of unauthorized access to a certain extent. These techniques could be used in banks, intelligence agencies such as RAW, and other settings where highly confidential data are transferred. Finally, the two techniques are compared in tabular form.
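The steganographic half of such a pipeline is commonly least-significant-bit (LSB) embedding: ciphertext bits replace the LSBs of cover samples, changing each pixel or audio sample by at most one level. A minimal Python roundtrip sketch (the paper's MATLAB implementation and exact embedding scheme are not given, so this is a generic illustration):

```python
def embed(cover, payload):
    """Hide payload bytes in the least significant bits of cover bytes."""
    bits = [(b >> i) & 1 for b in payload for i in range(8)]  # LSB-first
    assert len(cover) >= len(bits), "cover too small for payload"
    return [(c & ~1) | bit for c, bit in zip(cover, bits)] + list(cover[len(bits):])

def extract(stego, n_bytes):
    """Recover n_bytes of payload from the LSBs of the stego bytes."""
    out = []
    for i in range(n_bytes):
        out.append(sum((stego[8 * i + j] & 1) << j for j in range(8)))
    return bytes(out)

cover = list(range(256))             # stand-in for image pixel bytes
secret = b"DES key"                  # e.g. ciphertext from the cipher stage
stego = embed(cover, secret)
print(extract(stego, len(secret)))   # b'DES key'
```

Because every cover byte changes by at most 1, the stego image or audio is perceptually indistinguishable from the original, which is the property the combined cipher-plus-steganography scheme relies on.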

Keywords: audio steganography, data security, DES, image steganography, intruder, RSA, steganography

Procedia PDF Downloads 290
25545 Data Monetisation by E-commerce Companies: A Need for a Regulatory Framework in India

Authors: Anushtha Saxena

Abstract:

This paper examines the process of data monetisation by e-commerce companies operating in India. Data monetisation is the collecting, storing, and analysing of consumers’ data in order to use the generated data for profit, revenue, etc. It enables e-commerce companies to find better business opportunities, offer innovative products and services, gain a competitive edge, and generate millions in revenue. This paper analyses the issues and challenges raised by data monetisation, some of which pertain to the right to privacy and the protection of e-commerce consumers’ data. At the same time, data monetisation cannot be prohibited, but it can be regulated and monitored by stringent laws and regulations. The right to privacy is a fundamental right guaranteed to the citizens of India through Article 21 of the Constitution of India, and the Supreme Court of India recognized it as such in the landmark judgment of Justice K.S. Puttaswamy (Retd) and Another v. Union of India. This paper highlights how e-commerce businesses violate individuals’ right to privacy by using the data they collect and store for economic gain and monetisation. The researcher focuses mainly on e-commerce companies such as online shopping websites to analyse the legal issues of data monetisation. In the age of the Internet of Things and digital commerce, people have shifted to online shopping because it is convenient, easy, flexible, comfortable, and time-saving. At the same time, e-commerce companies store their consumers’ data and use it by selling it to third parties or deriving further data from what they hold. This violates individuals’ right to privacy, because consumers know little about what happens to the data they provide online, and data are often collected without individuals’ consent.
Data, whether structured or unstructured, is used by analytics for monetisation. Indian legislation such as the Information Technology Act, 2000 does not effectively protect e-consumers with respect to their data and the way e-commerce businesses monetise it to generate revenue. The paper also examines the draft Data Protection Bill, 2021, pending in the Parliament of India, and how this Bill could make a significant impact on data monetisation. It further studies the European Union General Data Protection Regulation and how that legislation could be instructive in the Indian scenario concerning e-commerce businesses and data monetisation.

Keywords: data monetization, e-commerce companies, regulatory framework, GDPR

Procedia PDF Downloads 120
25544 Awareness regarding Radiation Protection among the Technicians Practicing in Bharatpur, Chitwan, Nepal

Authors: Jayanti Gyawali, Deepak Adhikari, Mukesh Mallik, Sanjay Sah

Abstract:

Radiation is defined as the emission or transmission of energy in the form of waves or particles through space or a material medium. The major imaging tools used in diagnostic radiology are based on ionizing radiation. A cross-sectional study was carried out during July-August 2015 among technicians in 15 different hospitals of Bharatpur, Chitwan, Nepal to assess awareness regarding radiation protection and current practice. The researcher was directly engaged in data collection using a self-administered, semi-structured questionnaire. The findings are presented as socio-demographic characteristics of respondents, current practice of respondents, and knowledge regarding radiation protection. The results demonstrate that, despite the importance of radiation and its associated hazards, the level of knowledge among technicians is only 60.23%, while their current practice scores 76.84%. The difference between the mean knowledge and practice scores may result from technicians' routine work combined with a lack of updated training. The study also revealed no significant (p>0.05) difference in the knowledge levels of technicians practicing in different hospitals, but the mean difference in practice scores between hospitals is significant (p<0.05): the cancer hospital, with its large volume of routine radiological cases and radiation therapy for cancer treatment, has better practice than the other hospitals. Deficient knowledge among technicians might offset the expected benefits relative to the risks involved and can cause erroneous medical diagnoses and radiation hazards. Therefore, this study emphasizes the need for all technicians to keep their knowledge and practice up to date regarding both ionizing and non-ionizing radiation.

Keywords: technicians, knowledge, Nepal, radiation

Procedia PDF Downloads 331
25543 Cooperative Scheme Using Adjacent Base Stations in Wireless Communication

Authors: Young-Min Ko, Seung-Jun Yu, Chang-Bin Ha, Hyoung-Kyu Song

Abstract:

In a wireless communication system, the failure of a base station can disrupt communication throughout its cell. This paper proposes a way to deal with base station failure in an OFDM-based wireless communication system: cooperative communication by the adjacent base stations. High performance is obtained by configuring the transmission signals with a cyclic delay diversity (CDD) scheme in the cooperative transmission. The cooperative scheme can be an effective solution in this particular situation.
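In CDD, each cooperating transmitter sends a cyclically delayed copy of the same time-domain OFDM symbol; at the receiver this appears as extra frequency selectivity (per-subcarrier phase rotation) rather than a separate stream, so no additional receiver processing is needed. A toy numerical sketch of the underlying DFT property (the symbol and delay values are arbitrary):

```python
import cmath

def dft(x):
    """Naive DFT, adequate for a toy-sized symbol."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def cyclic_delay(x, d):
    """Cyclic shift of the time-domain OFDM symbol by d samples."""
    return x[-d:] + x[:-d]

symbol = [complex(1, 0), complex(0, 1), complex(-1, 0), complex(0, -1)] * 4
delayed = cyclic_delay(symbol, 3)

# a cyclic delay only rotates the phase of each subcarrier; |X_k| is unchanged
X, Xd = dft(symbol), dft(delayed)
print(all(abs(abs(a) - abs(b)) < 1e-9 for a, b in zip(X, Xd)))
```

Because only per-subcarrier phases differ between the copies, the superposition at the receiver looks like transmission over a more frequency-selective channel, which is the source of the diversity gain the abstract refers to.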

Keywords: base station, CDD, OFDM, diversity gain, MIMO

Procedia PDF Downloads 485
25542 The Regulation on Human Exposure to Electromagnetic Fields for Brazilian Power System

Authors: Hugo Manoel Olivera Da Silva, Ricardo Silva Thé Pontes

Abstract:

This work presents an analysis of the Brazilian regulation on human exposure to electromagnetic fields, which sets limits for electric, magnetic, and electromagnetic fields. Regulation of the electricity sector is the responsibility of the Agência Nacional de Energia Elétrica (ANEEL), the Brazilian Electricity Regulatory Agency, which implemented the limits through Normative Resolution Nº 398/2010, resulting in a series of obligations for agents of the electricity sector, especially in the areas of generation, transmission, and distribution.

Keywords: adverse effects, electric energy, electric and magnetic fields, human health, regulation

Procedia PDF Downloads 607
25541 Experiments on Weakly-Supervised Learning on Imperfect Data

Authors: Yan Cheng, Yijun Shao, James Rudolph, Charlene R. Weir, Beth Sahlmann, Qing Zeng-Treitler

Abstract:

Supervised predictive models require labeled data for training. Complete and accurate labeled data, i.e., a ‘gold standard’, is not always available, and imperfectly labeled data may need to serve as an alternative. An important question is whether the accuracy of the labeled data creates a performance ceiling for the trained model. In this study, we trained several models to recognize the presence of delirium in clinical documents using data with annotations that are not completely accurate (i.e., weakly-supervised learning). In the external evaluation, the support vector machine model with a linear kernel performed best, achieving an area under the curve of 89.3% and an accuracy of 88%, surpassing the 80% accuracy of the training sample. We then generated a set of simulated data and carried out a series of experiments demonstrating that models trained on imperfect data can (but do not always) outperform the accuracy of the training data; e.g., the area under the curve for some models exceeds 80% when trained on data with an error rate of 40%. Our experiments also showed that the error resistance of linear modeling is associated with larger sample size, error type, and linearity of the data (all p-values < 0.001). In conclusion, this study sheds light on the usefulness of imperfect data in clinical research via weakly-supervised learning.
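That a model can beat the accuracy of its own labels is easy to demonstrate with symmetric label noise: flipping 40% of labels at random moves the two class centroids toward each other but leaves the decision boundary essentially where it was. The sketch below uses a nearest-centroid classifier rather than the paper's SVM, with invented 2-D data:

```python
import random

random.seed(42)

def make_point(label):
    """Two well-separated 2-D clusters: label 0 near (0,0), label 1 near (4,4)."""
    c = (0.0, 0.0) if label == 0 else (4.0, 4.0)
    return (c[0] + random.gauss(0, 1), c[1] + random.gauss(0, 1))

points = [(make_point(y), y) for y in [i % 2 for i in range(2000)]]
noisy = [(x, y if random.random() > 0.4 else 1 - y) for x, y in points]  # 40% flips

def centroid(pts):
    xs, ys = zip(*pts)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# nearest-centroid classifier trained on the *noisy* labels
c0 = centroid([x for x, y in noisy if y == 0])
c1 = centroid([x for x, y in noisy if y == 1])
dist2 = lambda a, b: (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
predict = lambda x: 0 if dist2(x, c0) < dist2(x, c1) else 1

# accuracy against the *true* labels far exceeds the 60% label accuracy
acc = sum(predict(x) == y for (x, y) in points) / len(points)
print(acc)
```

Symmetric flips shrink the distance between the estimated centroids but preserve their direction and midpoint, so the learned boundary, and hence the accuracy, is close to what clean labels would give.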

Keywords: weakly-supervised learning, support vector machine, prediction, delirium, simulation

Procedia PDF Downloads 199
25540 Transforming Healthcare Data Privacy: Integrating Blockchain with Zero-Knowledge Proofs and Cryptographic Security

Authors: Kenneth Harper

Abstract:

Blockchain technology offers solutions for managing healthcare data, addressing critical challenges in privacy, integrity, and access. This paper explores how privacy-preserving technologies such as zero-knowledge proofs (ZKPs) and homomorphic encryption (HE) enhance decentralized healthcare platforms by enabling secure computations and patient data protection. We examine the mathematical foundations of these methods, their practical applications, and how they meet the evolving demands of healthcare data security. Using real-world examples, this research highlights industry-leading implementations and offers a roadmap for future applications in secure, decentralized healthcare ecosystems.
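One concrete ZKP building block is Schnorr's identification protocol: a prover demonstrates knowledge of a discrete-log secret without revealing it. The sketch below uses deliberately tiny, insecure parameters purely to show the commit-challenge-response algebra; it is a generic illustration, not the paper's construction:

```python
import random

random.seed(1)

# toy Schnorr zero-knowledge identification (insecurely small parameters!)
p, q, g = 2039, 1019, 4          # g generates the order-q subgroup mod p
x = random.randrange(1, q)       # prover's secret key
y = pow(g, x, p)                 # public key y = g^x mod p

# one round: commit, challenge, respond -- proves knowledge of x without revealing it
r = random.randrange(1, q)
t = pow(g, r, p)                 # prover's commitment t = g^r
c = random.randrange(1, q)       # verifier's random challenge
s = (r + c * x) % q              # prover's response

print(pow(g, s, p) == (t * pow(y, c, p)) % p)  # True: g^s == t * y^c (mod p)
```

The verification holds because g^s = g^(r + c·x) = g^r · (g^x)^c, while the response s leaks nothing about x on its own since r is fresh and random; production systems use the same structure over elliptic-curve groups with 256-bit orders.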

Keywords: blockchain, cryptography, data privacy, decentralized data management, differential privacy, healthcare, healthcare data security, homomorphic encryption, privacy-preserving technologies, secure computations, zero-knowledge proofs

Procedia PDF Downloads 18
25539 Operating Speed Models on Tangent Sections of Two-Lane Rural Roads

Authors: Dražen Cvitanić, Biljana Maljković

Abstract:

This paper presents models for predicting operating speeds on tangent sections of two-lane rural roads, developed from continuous speed data. The data correspond to 20 drivers of different ages and driving experience, driving their own cars along an 18 km section of a state road. The data were first used to determine maximum operating speeds on tangents and to compare them with speeds at the midpoints of tangents, i.e., the speed data used in most operating speed studies. Analysis of the continuous speed data indicated that spot speed data are not reliable indicators of the relevant speeds. Operating speed models for tangent sections were then developed. There was no significant difference between models developed using speeds in the middle of tangent sections and models developed using maximum operating speeds on tangent sections, and all developed models have a higher coefficient of determination than models developed on spot speed data. It can therefore be concluded that the method of measurement has a more significant impact on the quality of an operating speed model than the location of measurement.
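Operating speed models of this kind are typically simple regressions of speed on geometric variables, judged by their coefficient of determination. A self-contained least-squares sketch on invented data (the paper's actual variables and coefficients are not given in the abstract):

```python
def ols_fit(xs, ys):
    """Least-squares line y = a + b*x and coefficient of determination R^2."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1 - ss_res / ss_tot

# hypothetical data: tangent length (m) vs. maximum operating speed (km/h)
lengths = [100, 200, 300, 450, 600, 800]
v_max = [68, 74, 81, 88, 93, 97]
a, b, r2 = ols_fit(lengths, v_max)
print(round(b, 3), round(r2, 3))   # speed gain per metre of tangent, fit quality
```

Comparing such fits across data sources (continuous maxima vs. spot speeds) is precisely how the paper ranks the measurement methods: the source yielding the higher R² gives the more useful model.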

Keywords: operating speed, continuous speed data, tangent sections, spot speed, consistency

Procedia PDF Downloads 452
25538 Analyzing the Impact of Migration on HIV and AIDS Incidence Cases in Malaysia

Authors: Ofosuhene O. Apenteng, Noor Azina Ismail

Abstract:

The human immunodeficiency virus (HIV), which causes acquired immune deficiency syndrome (AIDS), remains a global cause of morbidity and mortality and has caused alarm since its emergence. The relationship between migration and HIV/AIDS has become complex, and in the absence of prospectively designed studies, dynamic mathematical models that take migration into account can provide very useful information. We have explored the utility of mathematical models in understanding the transmission dynamics of HIV and AIDS and in assessing the magnitude of migration's impact on the disease. The model was calibrated to HIV and AIDS incidence data from the Malaysian Ministry of Health for the period 1986 to 2011, using Bayesian analysis combined with a Markov chain Monte Carlo (MCMC) approach to estimate the model parameters. From the estimated parameters, the basic reproduction number was estimated at 22.5812. The rate at which susceptible individuals move into the HIV compartment has the highest sensitivity value, more significant than the remaining parameters; thus, the disease becomes unstable. This is a major concern from the public health point of view, since the aim is to stabilize the epidemic at the disease-free equilibrium. These results suggest that the government, as policy maker, should make further efforts to curb illegal activities performed by migrants. Our models reflect considerably the dynamic behavior of the HIV/AIDS epidemic in Malaysia and could eventually be used strategically for other countries.
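The abstract does not give the model equations, but the general idea of a compartmental model with migration can be sketched with a generic Euler-stepped SIR system in which a constant inflow of migrants, some fraction of them infected, enters the population. All parameter values here are invented for illustration:

```python
def run(days, inflow_infected):
    """Euler-stepped SIR with constant migrant inflow; returns the epidemic peak."""
    S, I, R = 0.99, 0.01, 0.0          # proportions of the initial population
    beta, gamma, inflow = 0.3, 0.1, 0.001  # toy transmission/recovery/inflow rates
    peak = I
    for _ in range(days):
        N = S + I + R
        new_inf = beta * S * I / N      # new infections this step
        rec = gamma * I                 # recoveries this step
        S, I, R = (S + inflow * (1 - inflow_infected) - new_inf,
                   I + inflow * inflow_infected + new_inf - rec,
                   R + rec)
        peak = max(peak, I)
    return peak

# an infected share in the migrant inflow raises the epidemic peak
print(run(365, 0.0), run(365, 0.5))
```

Calibrating such a system to incidence data with MCMC means treating beta, gamma, and the migration terms as unknowns and sampling their posterior given the observed case counts; the sensitivity ranking reported in the abstract then follows from perturbing each estimated parameter in turn.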

Keywords: epidemic model, reproduction number, HIV, MCMC, parameter estimation

Procedia PDF Downloads 366
25537 A Neural Network Based Clustering Approach for Imputing Multivariate Values in Big Data

Authors: S. Nickolas, Shobha K.

Abstract:

The treatment of incomplete data is an important step in data pre-processing. Missing values create a noisy environment in all applications and are an unavoidable problem in big data management and analysis. Researchers have introduced numerous techniques for handling missing data, such as discarding rows with missing values, mean imputation, expectation maximization, neural networks with evolutionary or optimized algorithms, and hot deck imputation. Among these, imputation techniques play a positive role in filling in missing values when it is necessary to use all records in the data rather than discarding records with missing values. In this paper, we propose a novel artificial neural network based clustering algorithm, Adaptive Resonance Theory-2 (ART2), for imputing missing values in mixed-attribute data sets. ART2 can recognize learned models quickly and adapt to new objects rapidly; it carries out model-based clustering using competitive learning and a self-stabilizing mechanism in dynamic environments without supervision. The proposed approach not only imputes the missing values but also provides information about handling outliers.
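The core imputation idea, assigning an incomplete record to its nearest learned cluster using only its observed attributes, then filling the gaps from that cluster's prototype, can be sketched without the ART2 machinery itself. The two prototypes below are hypothetical stand-ins for what the clustering stage would learn:

```python
def dist_obs(row, centroid):
    """Squared distance using only the observed (non-None) coordinates."""
    return sum((r - c) ** 2 for r, c in zip(row, centroid) if r is not None)

def impute(rows, centroids):
    """Replace each None with the matching coordinate of the nearest centroid."""
    out = []
    for row in rows:
        c = min(centroids, key=lambda cen: dist_obs(row, cen))
        out.append([c[i] if v is None else v for i, v in enumerate(row)])
    return out

# two hypothetical cluster prototypes learned beforehand (e.g. by ART2)
centroids = [(1.0, 2.0, 3.0), (10.0, 20.0, 30.0)]
data = [[1.1, None, 2.9], [9.5, 21.0, None]]
print(impute(data, centroids))  # → [[1.1, 2.0, 2.9], [9.5, 21.0, 30.0]]
```

A record whose observed distance to every prototype exceeds a vigilance-style threshold could instead be flagged, which is the sense in which cluster-based imputation also yields outlier information.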

Keywords: ART2, data imputation, clustering, missing data, neural network, pre-processing

Procedia PDF Downloads 274
25536 Investigation of Cost Effective Double Layered Slab for γ-Ray Shielding

Authors: Kulwinder Singh Mann, Manmohan Singh Heer, Asha Rani

Abstract:

The safe storage of radioactive materials has become an important issue. Nuclear engineering necessitates the safe handling of radioactive materials emitting high-energy gamma-rays, and the hazards involved demand suitably shielded enclosures. With the growing use of nuclear energy to meet the increasing demand for power, there is a need to investigate the shielding behavior of cost-effective shielded enclosures (CESE) made from clay bricks (CB) and fire bricks (FB). In comparison with lead bricks (the conventional shielding), CESE are the preferred choice in nuclear waste management. The objective of the present investigation is to evaluate the double-layered transmission exposure buildup factors (DLEBF) for gamma-rays for CESE in the energy range 0.5-3 MeV. For the necessary computations of shielding parameters, two computer programs (GRIC-toolkit and BUF-toolkit) have been designed using the extensive existing data on gamma-ray interaction parameters for all elements of the periodic table. It has been found that two-layered slabs provide more effective shielding for gamma-rays in the orientation CB followed by FB than in the reverse, and it has been concluded that the arrangement FB followed by CB reduces the leakage of scattered gamma-rays from the radioactive source.
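Why layer order matters at all is worth spelling out: narrow-beam attenuation through two slabs is order-independent (the exponentials commute), so any order dependence comes entirely from the buildup of scattered photons. The sketch below uses invented attenuation coefficients and single-layer buildup factors, and a crude "buildup of the outer layer" approximation, purely to illustrate the mechanism:

```python
import math

# hypothetical linear attenuation coefficients at ~1 MeV (cm^-1)
MU = {"clay_brick": 0.12, "fire_brick": 0.10}
# hypothetical single-layer exposure buildup factors for 10 cm slabs
BUILDUP = {"clay_brick": 2.1, "fire_brick": 1.8}

def transmitted(order, thickness_cm=10.0):
    """Broad-beam transmission through two slabs: the order-independent
    narrow-beam term times an effective buildup taken (crudely) from the
    outer layer. A sketch only; real DLEBF calculations are more involved."""
    narrow = math.prod(math.exp(-MU[m] * thickness_cm) for m in order)
    return BUILDUP[order[-1]] * narrow

print(transmitted(("clay_brick", "fire_brick")),
      transmitted(("fire_brick", "clay_brick")))
```

With these made-up numbers the narrow-beam factor is identical for both orders, and the difference in transmitted exposure comes solely from which material's scatter buildup dominates at the exit face, which is the effect the DLEBF evaluation quantifies properly.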

Keywords: buildup factor, clay bricks, fire bricks, nuclear waste management, radiation protective double layered slabs

Procedia PDF Downloads 407
25535 The Effect That the Data Assimilation of Qinghai-Tibet Plateau Has on a Precipitation Forecast

Authors: Ruixia Liu

Abstract:

The Qinghai-Tibet Plateau has an important influence on the precipitation of its lower reaches. Remote sensing data has its own advantages, and a numerical prediction model that assimilates RS data performs better than one that does not. We obtained assimilation data from MHS, terrestrial, and sounding observations through GSI, introduced the result into WRF, and produced relative humidity (RH) and precipitation forecasts. By comparing the results at 1 h, 6 h, 12 h, and 24 h, we found that assimilating MHS, terrestrial, and sounding data made the forecast of precipitation amount, area, and center more accurate. Analysis of the differences in the initial field showed that data assimilation over the Qinghai-Tibet Plateau influences the forecast for its lower reaches by affecting the initial temperature and RH.

Keywords: Qinghai-Tibet Plateau, precipitation, data assimilation, GSI

Procedia PDF Downloads 234
25534 Positive Affect, Negative Affect, Organizational and Motivational Factor on the Acceptance of Big Data Technologies

Authors: Sook Ching Yee, Angela Siew Hoong Lee

Abstract:

Big data technologies have become a trend for exploiting business opportunities and providing valuable business insights through the analysis of big data. However, many organizations have yet to adopt big data technologies, especially small and medium enterprises (SMEs). This study uses the technology acceptance model (TAM), examining several of its constructs together with four additional constructs: positive affect, negative affect, organizational factor, and motivational factor. The conceptual model proposed in the study will be tested on the relationship and influence of positive affect, negative affect, organizational factor, and motivational factor on the intention to use big data technologies. The study takes an empirical approach, collecting data through a survey.

Keywords: big data technologies, motivational factor, negative affect, organizational factor, positive affect, technology acceptance model (TAM)

Procedia PDF Downloads 362
25533 Spin-Resolved Electronic Behavior of ZnO Nanoribbons

Authors: Serkan Caliskan

Abstract:

The aim of this study is to understand the spin-resolved properties of ZnO armchair and zigzag nanoribbons. Spin polarization can be induced either by the geometry of the nanoribbons or by ferromagnetic electrodes; hence, spin-dependent behavior emerges in these nanostructures even in the absence of an external magnetic field. Both the electronic structure and the magnetic properties of the nanoribbons are analyzed, employing first-principles calculations based on Density Functional Theory. The relevant properties are elucidated through the spin-dependent band structure, conductance, transmission, density of states, and magnetic moment. These results can be used to describe nanoscale structures and to stimulate experimental work.

Keywords: first principles, spin polarized transport, ZnO device, ZnO nanoribbons

Procedia PDF Downloads 194
25532 Big Data Analysis with Rhipe

Authors: Byung Ho Jung, Ji Eun Shin, Dong Hoon Lim

Abstract:

Rhipe, which integrates R with the Hadoop environment, makes it possible to process and analyze massive amounts of data in a distributed processing environment. In this paper, we implemented multiple regression analysis using Rhipe on actual data of various sizes. Experimental results comparing the performance of Rhipe with the stats and biglm packages based on bigmemory showed that Rhipe was faster than the other packages, owing to parallel processing in which the number of map tasks increases with the size of the data. We also compared the computing speeds of the pseudo-distributed and fully-distributed modes for configuring a Hadoop cluster. The results showed that the fully-distributed mode was faster than the pseudo-distributed mode, and its computing speed increased further as the number of data nodes increased.
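The map-reduce decomposition that makes such a regression parallel is straightforward to sketch. The toy below is pure Python rather than R/Rhipe, with list chunks standing in for Hadoop map tasks: each "map task" accumulates the sufficient statistics X'X and X'y on its chunk, and the "reduce" step combines them and solves the normal equations:

```python
def map_task(chunk):
    """Partial X'X and X'y for one chunk of (x_row, y) pairs."""
    p = len(chunk[0][0])
    xtx = [[0.0] * p for _ in range(p)]
    xty = [0.0] * p
    for x, y in chunk:
        for i in range(p):
            xty[i] += x[i] * y
            for j in range(p):
                xtx[i][j] += x[i] * x[j]
    return xtx, xty

def reduce_task(parts):
    """Combine partial sums and solve (X'X) b = X'y by Gaussian elimination."""
    p = len(parts[0][1])
    xtx = [[sum(pt[0][i][j] for pt in parts) for j in range(p)] for i in range(p)]
    xty = [sum(pt[1][i] for pt in parts) for i in range(p)]
    # Gaussian elimination without pivoting; adequate for this toy system
    a = [row[:] + [xty[i]] for i, row in enumerate(xtx)]
    for i in range(p):
        for j in range(i + 1, p):
            f = a[j][i] / a[i][i]
            for c in range(i, p + 1):
                a[j][c] -= f * a[i][c]
    b = [0.0] * p
    for i in reversed(range(p)):
        b[i] = (a[i][p] - sum(a[i][j] * b[j] for j in range(i + 1, p))) / a[i][i]
    return b

# y = 2 + 3x, split across two "map tasks"
data = [([1.0, x], 2 + 3 * x) for x in [0.0, 1.0, 2.0, 3.0]]
coefs = reduce_task([map_task(data[:2]), map_task(data[2:])])
print(coefs)
```

Because X'X and X'y are simple sums over rows, adding map tasks splits the work without changing the answer, which is why the computing speed scales with the number of data nodes.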

Keywords: big data, Hadoop, parallel regression analysis, R, Rhipe

Procedia PDF Downloads 497
25531 Injection Practices among Private Medical Practitioners of Karachi Pakistan

Authors: Mohammad Tahir Yousafzai, Nighat Nisar, Rehana Khalil

Abstract:

The aim of this study is to assess the frequency of sharp injuries and the factors leading to them among medical practitioners in slum areas of Karachi, Pakistan. A cross-sectional study was conducted in slum areas of Landhi Town, Karachi. All medical practitioners (317) running private clinics in these areas were asked to participate in the study. Data were collected on self-administered, pre-tested, structured questionnaires. The frequency, with percentage and 95% confidence interval, was calculated for at least one sharp injury (SI) in the last one year. The factors leading to sharp injuries were assessed using multiple logistic regression. About 80% of the private medical practitioners consented to participate; among these, 87% were male and 13% were female. The mean age was 38±11 years and the mean work experience was 12±9 years. The frequency of at least one sharp injury in the last one year was 27% (95% CI: 22.2-32). Almost 47% of sharp injuries were caused by needle recapping. Less work experience, fewer than 14 years of schooling, seeing more than 20 patients per day, administering more than 30 injections per day, reuse of syringes, and needle recapping after use were significantly associated with sharp injuries. Injection practices were found to be inadequate among private medical practitioners in slum areas of Karachi, and the frequency of sharp injuries was high. The risk of occupational transmission of blood-borne infections among these practitioners warrants the urgent launch of awareness and training on standard precautions for private medical practitioners in the slum areas of Karachi.
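The reported interval can be reproduced approximately with a standard Wald confidence interval for a proportion. The counts below are back-calculated assumptions (about 80% of 317 consenting, 27% injured), not the paper's raw data, and the authors may have used a different interval method:

```python
import math

def prop_ci(successes, n, z=1.96):
    """Wald 95% confidence interval for a proportion: p +/- z*sqrt(p(1-p)/n)."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, p - half, p + half

# ~80% of 317 practitioners consented (~254); 27% reported at least
# one sharp injury -- assumed counts for illustration only
p, lo, hi = prop_ci(round(0.27 * 254), 254)
print(f"{p:.1%} (95% CI: {lo:.1%}-{hi:.1%})")
```

Under these assumed counts the sketch yields roughly 21.7%-32.6%, in the same ballpark as the reported 22.2-32.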

Keywords: injection practices, private practitioners, sharp injuries, blood borne infections

Procedia PDF Downloads 421
25530 Design of Circular Patch Antenna in Terahertz Band for Medical Applications

Authors: Moulfi Bouchra, Ferouani Souheyla, Ziani Kerarti Djalal, Moulessehoul Wassila

Abstract:

The wireless body area network (WBAN) is among the most interesting networks today, especially with the appearance of contagious illnesses such as COVID-19, which require surveillance at home. In this article, we have designed a circular microstrip antenna. Gold is the material used for both the patch and the ground plane, and gallium arsenide (εr = 12.94) is chosen as the dielectric substrate. The dimensions of the antenna are 82.10 × 62.84 μm², operating at a frequency of 3.85 THz. The proposed antenna has a return loss of -46.046 dB and a gain of 3.74 dBi, and it can serve sensors that measure various physiological parameters, helping in the overall monitoring of an individual's health condition.
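A first-order sanity check on such a design can be made with the simple cavity model of a circular patch, which ignores fringing fields; the formula and the resulting radius are a textbook approximation, not the authors' full-wave design procedure:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def patch_radius(f_hz, eps_r):
    """Radius of a circular patch resonating in the dominant TM11 mode,
    from the simple cavity model f = 1.8412 * c / (2*pi*a*sqrt(eps_r)).
    Fringing-field corrections are ignored in this sketch."""
    return 1.8412 * C / (2 * math.pi * f_hz * math.sqrt(eps_r))

a = patch_radius(3.85e12, 12.94)
print(f"patch radius ~ {a * 1e6:.2f} um")
```

The model gives a patch radius of a few micrometres at 3.85 THz with εr = 12.94, comfortably inside the quoted 82.10 × 62.84 μm² footprint, which suggests the quoted dimensions describe the substrate/ground plane rather than the patch itself.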

Keywords: circular patch antenna, Terahertz transmission, WBAN applications, real-time monitoring

Procedia PDF Downloads 307
25529 Mathematical Reconstruction of an Object Image Using X-Ray Interferometric Fourier Holography Method

Authors: M. K. Balyan

Abstract:

The main principles of the X-ray interferometric Fourier holography method are discussed. The object image is reconstructed mathematically by Fourier transformation. Three methods are presented: the approximation method, the iteration method, and the step-by-step method. As an example, the reconstruction of the complex amplitude transmission coefficient of a beryllium wire is considered. The results of the three methods are compared, and the best results are obtained with the step-by-step method.
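The reconstruction-by-Fourier-transformation step can be illustrated with a one-dimensional toy: the hologram is the intensity of the summed far fields of a point reference and the object, and its inverse transform contains the object (plus its twin) beside an autocorrelation term. This sketch ignores the dynamical-diffraction physics of the actual experiment:

```python
import cmath

def dft(xs, inverse=False):
    """Naive 1-D discrete Fourier transform (O(N^2), fine for a sketch)."""
    n = len(xs)
    sign = 1 if inverse else -1
    out = [sum(x * cmath.exp(sign * 2j * cmath.pi * k * m / n)
               for m, x in enumerate(xs)) for k in range(n)]
    return [v / n for v in out] if inverse else out

n = 64
obj = [0.0] * n
obj[20] = 1.0          # toy "object": a single transmitting point
ref = [0.0] * n
ref[0] = 1.0           # point reference at the origin

field = dft([o + r for o, r in zip(obj, ref)])      # far field
hologram = [abs(f) ** 2 for f in field]             # recorded intensity
recon = dft(hologram, inverse=True)                 # Fourier reconstruction
peak = max(range(1, n // 2), key=lambda i: abs(recon[i]))
print("object recovered at offset", peak)
```

The twin image and the on-axis autocorrelation term are exactly what the paper's approximation, iteration, and step-by-step methods must separate when recovering a complex transmission coefficient rather than a point.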

Keywords: dynamical diffraction, hologram, object image, X-ray holography

Procedia PDF Downloads 394
25528 The Effect of Six-Weeks of Elastic Exercises with Reactionary Ropes on Nerve Conduction Velocity and Balance in Females with Multiple Sclerosis

Authors: Mostafa Sarabzadeh, Masoumeh Helalizadeh, Seyyed Mahmoud Hejazi

Abstract:

Multiple sclerosis (MS) is a chronic, progressive disease of the central nervous system that impairs sensory and motor function. Because the balance problems of these patients are related to disordered nerve conduction from the central nervous system to the limbs, and because elastic bands can induce changes at neuromuscular junctions through their momentary reactive actions, the aim of this research was to evaluate the effect of elastic training with reactionary ropes on nerve conduction velocity (in the lower and upper limbs) and functional balance in female patients with multiple sclerosis. This was a quasi-experimental study with a pre- and post-test design. The study population consisted of 16 women with MS, aged 25-40 years, at low and intermediate levels of disease severity, EDSS 1-4 (Expanded Disability Status Scale), who were divided randomly into elastic and control groups. The training program of the experimental group lasted six weeks, with 3 sessions per week of elastic exercises with reactionary ropes. Electroneurography parameters (nerve conduction velocity and latency) of the upper and lower limb nerves (median, tibial, sural, peroneal), along with balance, were measured with an electroneurography (ENG) system and the Timed Up and Go (TUG) functional test before and after the training period. Dependent and independent t-tests were used to analyze the data (significance level p < 0.05). The results showed a significant increase in the nerve conduction velocity of the sural (p = 0.001), peroneal (p = 0.01), and median (p = 0.03) nerves, but not the tibial, and an improvement in the latency times of the tibial, peroneal, and median nerves (all p < 0.001), but not the sural. The TUG test also showed a significant decrease in execution time (p = 0.001).
Overall, the obtained data indicate that training with elastic bands can contribute to enhanced nerve conduction velocity and balance in patients with MS, thereby reducing their problems, promoting mobility, and ultimately improving life expectancy in these patients.

Keywords: balance, elastic bands, multiple sclerosis, nerve conduction, velocity

Procedia PDF Downloads 216
25527 Survival Data with Incomplete Missing Categorical Covariates

Authors: Madaki Umar Yusuf, Mohd Rizam B. Abubakar

Abstract:

Survival-censored data with incomplete covariates are a common occurrence in many studies in which the outcome is survival time. When the missing covariates are categorical, a useful technique for obtaining parameter estimates is the EM algorithm by the method of weights. The approach is applied to survival outcomes within the class of generalized linear models and requires estimating the parameters of the distribution of the covariates. In this paper, we apply the method to clinical trial data with five covariates, four of which have some missing values, in data that are also subject to censoring.
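The EM-by-the-method-of-weights idea can be sketched with a deliberately simplified model: a binary covariate x missing for some subjects, and a Gaussian outcome standing in for the paper's Weibull survival model. In the E-step, each incomplete record is expanded into weighted pseudo-records, one per covariate level, with weights equal to the posterior probability of that level; the M-step is then a weighted maximum-likelihood fit:

```python
import math

def norm_pdf(y, mu):
    return math.exp(-0.5 * (y - mu) ** 2) / math.sqrt(2 * math.pi)

def em_weights(data, iters=50):
    """EM by the method of weights for a binary covariate x (None when
    missing). Assumed toy model: x ~ Bernoulli(pi), y | x ~ Normal(mu[x], 1),
    a stand-in for the paper's Weibull survival outcome."""
    pi, mu = 0.5, [0.0, 1.0]
    for _ in range(iters):
        # E-step: expand each record into weighted pseudo-records (w, x, y)
        pseudo = []
        for x, y in data:
            if x is not None:
                pseudo.append((1.0, x, y))
            else:
                p1 = pi * norm_pdf(y, mu[1])
                p0 = (1 - pi) * norm_pdf(y, mu[0])
                pseudo.append((p0 / (p0 + p1), 0, y))
                pseudo.append((p1 / (p0 + p1), 1, y))
        # M-step: weighted maximum likelihood for pi and the level means
        n = sum(w for w, _, _ in pseudo)
        pi = sum(w for w, x, _ in pseudo if x == 1) / n
        for lvl in (0, 1):
            wl = [(w, y) for w, x, y in pseudo if x == lvl]
            mu[lvl] = sum(w * y for w, y in wl) / sum(w for w, _ in wl)
    return pi, mu

data = [(0, -0.1), (0, 0.2), (1, 4.9), (1, 5.1), (None, 5.0), (None, 0.0)]
pi_hat, mu_hat = em_weights(data)
print(pi_hat, mu_hat)
```

Validity of this expansion rests on the missingness being ignorable (missing at random), the same assumption invoked in the paper's keywords.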

Keywords: EM algorithm, incomplete categorical covariates, ignorable missing data, missing at random (MAR), Weibull Distribution

Procedia PDF Downloads 405
25526 A Study of Blockchain Oracles

Authors: Abdeljalil Beniiche

Abstract:

A limitation of smart contracts is that they cannot access external data that might be required to control the execution of business logic. Oracles can be used to provide this external data to smart contracts. An oracle is an interface that delivers data from sources outside the blockchain to a smart contract to consume, and it can deliver different types of data depending on the industry and requirements. In this paper, we study and describe the widely used blockchain oracles. We then elaborate on their potential roles, technical architectures, and design patterns. Finally, we discuss the human oracle and its key role in solving the truth problem by reaching a consensus about a certain inquiry or task.
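The request-response pattern common to such oracles can be sketched with toy in-memory classes; this is not a real blockchain nor any specific oracle network's API. The contract records a request "event", and an off-chain oracle node observes it, fetches the value from its external feed, and calls back:

```python
# Minimal request-response oracle pattern: the contract cannot fetch
# external data itself, so it emits a request; an off-chain node
# watches for requests and delivers the answer via a callback.

class SmartContract:
    def __init__(self):
        self.pending = []      # stands in for emitted OracleRequest events
        self.price = None

    def request_price(self, symbol):
        self.pending.append(symbol)

    def fulfill(self, symbol, value):
        # callback; on a real chain this would be restricted to the oracle
        self.price = value
        self.pending.remove(symbol)

class OracleNode:
    def __init__(self, feed):
        self.feed = feed       # stands in for an external data source/API

    def poll(self, contract):
        for symbol in list(contract.pending):
            contract.fulfill(symbol, self.feed[symbol])

contract = SmartContract()
contract.request_price("ETH/USD")
OracleNode({"ETH/USD": 3150.0}).poll(contract)
print(contract.price)
```

The human oracle discussed in the paper replaces the single `feed` lookup with a consensus among multiple respondents, which is what addresses the truth problem of trusting any one data source.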

Keywords: blockchain, oracles, oracles design, human oracles

Procedia PDF Downloads 136