Search results for: 99.95% IoT data transmission savings
22069 Accelerating Side Channel Analysis with Distributed and Parallelized Processing
Authors: Kyunghee Oh, Dooho Choi
Abstract:
Although a cryptographic algorithm may have no theoretical weakness, Side Channel Analysis can extract secret data from the physical implementation of a cryptosystem. The analysis exploits extra information such as timing, power consumption, electromagnetic leakage, or even sound to break the system. Differential Power Analysis (DPA) is one of the most popular of these analyses; it computes the statistical correlations between secret keys and power consumption traces. It usually requires processing huge amounts of data and takes a long time; for some devices with countermeasures, it may take several weeks. We suggest and evaluate methods to shorten the time needed to analyze cryptosystems, including distributed computing and parallelized processing.
Keywords: DPA, distributed computing, parallelized processing, side channel analysis
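The distributed step the abstract describes can be sketched as a map-reduce over power traces: each node accumulates the sufficient statistics for the Pearson correlation on its chunk, and a combiner merges them. This is a minimal illustration with synthetic data, not the authors' implementation; the leakage model and trace values are made up.

```python
import numpy as np

def partial_sums(x, y):
    """Per-node accumulator for a chunk of traces: enough to rebuild Pearson r."""
    return np.array([len(x), x.sum(), y.sum(),
                     (x * x).sum(), (y * y).sum(), (x * y).sum()])

def combine(parts):
    """Merge the partial sums from all nodes and compute the global correlation."""
    n, sx, sy, sxx, syy, sxy = np.sum(parts, axis=0)
    num = n * sxy - sx * sy
    den = np.sqrt(n * sxx - sx * sx) * np.sqrt(n * syy - sy * sy)
    return num / den

rng = np.random.default_rng(0)
leakage = rng.normal(size=1000)                             # hypothetical power model
traces = 0.8 * leakage + rng.normal(scale=0.5, size=1000)   # measured power (synthetic)

# Split the workload across 4 "nodes", then merge the partial results.
chunks = np.array_split(np.arange(1000), 4)
parts = [partial_sums(leakage[c], traces[c]) for c in chunks]
r_dist = combine(parts)
r_full = np.corrcoef(leakage, traces)[0, 1]
print(r_dist, r_full)
```

Because the accumulators are additive, the distributed result matches a single-machine computation, which is what makes the correlation step of DPA easy to parallelize.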
Procedia PDF Downloads 428
22068 Monitoring of Hydrological Parameters in the Alexandra Jukskei Catchment in South Africa
Authors: Vhuhwavho Gadisi, Rebecca Alowo, German Nkhonjera
Abstract:
It has been noted that technical programming for handling groundwater resources is not accessible. The lack of such systems hinders the groundwater management processes necessary for decision-making through monitoring and evaluation of the Jukskei River of the Crocodile River (West) Basin in Johannesburg, South Africa. Several challenges concerning groundwater management have been identified in South Africa's Jukskei Catchment, among them: gaps in data records; a need for training and equipping of monitoring staff; a lack of formal accreditation of monitoring capacities and equipment; and no access to regulation terms (e.g., meters). Taking into account human requirements at typical densities in various regions of South Africa, several groundwater level monitoring stations need to be constructed in a given segment; the available raw groundwater level data should be converted into consumable products, for example, short reports on delicate areas (e.g., dolomite compartments, wetlands, aquifers, and sole-source aquifers); and, given increasing civil unrest, there has been vandalism and theft of groundwater monitoring infrastructure. GIS was employed at the catchment level to plot the relationship between the identified groundwater parameters in the catchment area and the identified borehole. GIS-based maps were designed for groundwater monitoring and pretested on one borehole in the Jukskei catchment. These data will be used to establish changes in the borehole compared to changes in the catchment area according to the identified parameters.
Keywords: GIS, monitoring, Jukskei, catchment
Procedia PDF Downloads 94
22067 Transportation Mode Classification Using GPS Coordinates and Recurrent Neural Networks
Authors: Taylor Kolody, Farkhund Iqbal, Rabia Batool, Benjamin Fung, Mohammed Hussaeni, Saiqa Aleem
Abstract:
The rising threat of climate change has led to an increase in public awareness of and care about our collective and individual environmental impact. A key component of this impact is our use of cars and other polluting forms of transportation, but it is often difficult for an individual to know how severe this impact is. While there are applications that offer this feedback, they require manual entry of the transportation mode used for a given trip, which can be burdensome. To alleviate this shortcoming, data from the 2016 TRIPlab datasets have been used to train a variety of machine learning models to automatically recognize the mode of transportation. An accuracy of 89.6% is achieved using a single deep neural network model with a Gated Recurrent Unit (GRU) architecture applied directly to trip data points over four primary classes, namely walking, public transit, car, and bike. These results are comparable in accuracy to results achieved by others using ensemble methods and require far less computation when classifying new trips. The lack of trip context data, e.g., bus routes, bike paths, etc., and the need for only a single set of weights make this an appropriate methodology for applications hoping to reach a broad demographic and provide responsive feedback.
Keywords: classification, gated recurrent unit, recurrent neural network, transportation
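For readers unfamiliar with the GRU architecture mentioned above, a single recurrent step can be sketched in NumPy as follows. The weights are random and untrained, and the three input features standing in for GPS-derived trip points are assumptions, so this only illustrates the forward pass over a trip, not the paper's trained model.

```python
import numpy as np

def gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU step: update gate z, reset gate r, candidate state h~."""
    z = 1 / (1 + np.exp(-(Wz @ x + Uz @ h)))      # update gate
    r = 1 / (1 + np.exp(-(Wr @ x + Ur @ h)))      # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))      # candidate state
    return (1 - z) * h + z * h_tilde

rng = np.random.default_rng(1)
n_in, n_hid, n_cls = 3, 8, 4                      # e.g. (lat, lon, speed) -> 4 modes
Ws = [rng.normal(scale=0.1, size=s) for s in
      [(n_hid, n_in), (n_hid, n_hid)] * 3]        # Wz, Uz, Wr, Ur, Wh, Uh
Wout = rng.normal(scale=0.1, size=(n_cls, n_hid))

trip = rng.normal(size=(20, n_in))                # one trip: 20 GPS-derived points
h = np.zeros(n_hid)
for x in trip:                                    # fold the whole trip into h
    h = gru_step(x, h, *Ws)

logits = Wout @ h
probs = np.exp(logits) / np.exp(logits).sum()     # softmax over the 4 modes
print(probs.round(3), int(probs.argmax()))
```

The final hidden state summarizes the whole trip, which is why a single set of weights suffices to classify trips of any length.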
Procedia PDF Downloads 137
22066 Data Mining to Capture User-Experience: A Case Study in Notebook Product Appearance Design
Authors: Rhoann Kerh, Chen-Fu Chien, Kuo-Yi Lin
Abstract:
In the era of a rapidly growing notebook market, consumer electronics manufacturers face a highly dynamic and competitive environment. In particular, the product's appearance is the first thing a user notices when distinguishing the product from those of other brands. A notebook product should differ in its appearance to engage users and contribute to the user experience (UX). UX evaluation compares various product concepts to find the design that meets user needs; in addition, it helps the designer further understand the product appearance preferences of different market segments. However, few studies have explored the relationship between consumer background and reactions to product appearance. This study proposes a data mining framework to capture user information and the important relations among product appearance factors. The proposed framework consists of problem definition and structuring, data preparation, rule generation, and results evaluation and interpretation. An empirical study was conducted in Taiwan that recruited 168 subjects from different backgrounds to experience the appearance of 11 different portable computers. The results assist designers in developing product strategies based on the characteristics of consumers and the product concepts related to the UX, which helps launch products to the right customers and increase market share. The results demonstrate the practical feasibility of the proposed framework.
Keywords: consumers decision making, product design, rough set theory, user experience
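Since the framework's rule generation rests on rough set theory (see the keywords), a minimal sketch of its core construct, lower and upper approximations over an indiscernibility relation, may help. The attribute names and the toy decision table are invented for illustration and are not the study's survey data.

```python
from collections import defaultdict

# Toy decision table: consumer-background attributes -> appearance preference.
# Attribute names and values are illustrative, not from the study's data.
table = [
    ({"age": "young", "gender": "m"}, "sleek"),
    ({"age": "young", "gender": "m"}, "sleek"),
    ({"age": "young", "gender": "f"}, "sleek"),
    ({"age": "old",   "gender": "f"}, "classic"),
    ({"age": "old",   "gender": "f"}, "sleek"),   # inconsistent with the row above
]

def approximations(table, target):
    """Lower/upper approximation of the concept {rows labelled `target`}."""
    blocks = defaultdict(set)                     # indiscernibility classes
    for i, (attrs, _) in enumerate(table):
        blocks[tuple(sorted(attrs.items()))].add(i)
    concept = {i for i, (_, d) in enumerate(table) if d == target}
    lower = set().union(*([b for b in blocks.values() if b <= concept] or [set()]))
    upper = set().union(*([b for b in blocks.values() if b & concept] or [set()]))
    return lower, upper

lower, upper = approximations(table, "sleek")
print(sorted(lower), sorted(upper))   # boundary rows = upper - lower
```

Certain rules can be read off the lower approximation, while the boundary region (upper minus lower) marks consumer groups whose preferences are inconsistent and need further attributes to resolve.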
Procedia PDF Downloads 313
22065 Audit of TPS Photon Beam Dataset for Small Field Output Factors Using OSLDs against RPC Standard Dataset
Authors: Asad Yousuf
Abstract:
Purpose: The aim of the present study was to audit a treatment planning system (TPS) beam dataset for small field output factors against the standard dataset produced by the Radiological Physics Center (RPC) in a multicenter study. Such data are crucial for the validity of special techniques, i.e., IMRT or stereotactic radiosurgery. Materials/Method: In this study, multiple small field size output factor datasets were measured and calculated for 6 to 18 MV x-ray beams using the RPC-recommended methods. These beam datasets were measured at 10 cm depth for 10 × 10 cm² to 2 × 2 cm² field sizes, defined by collimator jaws at 100 cm. The measurements were made with Landauer's nanoDot OSLDs, whose volume is small enough to gather a full ionization reading even for the 1 × 1 cm² field size. At our institute, the beam data, including output factors, had been commissioned at 5 cm depth with an SAD setup. For comparison with the RPC data, the output factors were converted to an SSD setup using tissue phantom ratios. The SSD setup also ensures coverage of the ion chamber in the 2 × 2 cm² field size. The measured output factors were also compared with those calculated by Eclipse™ treatment planning software. Result: The measured and calculated output factors are in agreement with the RPC dataset within 1% and 4%, respectively. The larger discrepancies in the TPS reflect the increased challenge of converting measured data into a commissioned beam model for very small fields. Conclusion: OSLDs are a simple, durable, and accurate tool to verify doses delivered using small photon beam fields down to a 1 × 1 cm² field size. The study emphasizes that the treatment planning system should always be evaluated for small field output factors to ensure accurate dose delivery in the clinical setting.
Keywords: small field dosimetry, optically stimulated luminescence, audit treatment, radiological physics center
Procedia PDF Downloads 327
22064 Nonlinear Multivariable Analysis of CO2 Emissions in China
Authors: Hsiao-Tien Pao, Yi-Ying Li, Hsin-Chia Fu
Abstract:
This paper addresses the impacts of energy consumption, economic growth, financial development, and population size on environmental degradation using grey relational analysis (GRA) for China, where foreign direct investment (FDI) inflows are the proxy variable for financial development. More recent historical data, covering the period 2004–2011, are used, because very old data may not be suitable for rapidly developing countries. The results of the GRA indicate that the linkage effects of energy consumption–emissions and GDP–emissions are ranked first and second, respectively. This reveals that energy consumption and economic growth are strongly correlated with emissions: higher economic growth requires more energy consumption and increases environmental pollution, and likewise, more efficient energy use requires a higher level of economic development. Therefore, policies that improve energy efficiency and create a low-carbon economy can reduce emissions without hurting economic growth. The FDI–emissions linkage is ranked third, indicating that China does not apply weak environmental regulations to attract inward FDI; furthermore, China's government should strengthen environmental policy when attracting inward FDI. The population–emissions linkage effect is ranked fourth, implying that population size does not directly affect CO2 emissions, even though China has the world's largest population and Chinese people use energy-related products very economically. Overall, energy conservation, efficiency improvement, demand management, and financial development, which aim at curtailing the waste of energy and reducing both energy consumption and emissions without loss of the country's competitiveness, can be adopted by developing economies. GRA is well suited to building a dynamic analysis model from a small amount of data.
Keywords: China, CO₂ emissions, foreign direct investment, grey relational analysis
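A minimal sketch of grey relational analysis, the method used above, is given below. The series are illustrative stand-ins, not the paper's 2004–2011 Chinese data, and the distinguishing coefficient rho = 0.5 is the conventional default.

```python
import numpy as np

def grey_relational_grades(reference, factors, rho=0.5):
    """GRA: score how closely each factor series tracks the reference series."""
    series = np.vstack([reference] + factors).astype(float)
    # Normalise each series to [0, 1] so units do not matter.
    mn = series.min(axis=1, keepdims=True)
    mx = series.max(axis=1, keepdims=True)
    norm = (series - mn) / (mx - mn)
    delta = np.abs(norm[1:] - norm[0])            # deviation from the reference
    dmin, dmax = delta.min(), delta.max()
    coeff = (dmin + rho * dmax) / (delta + rho * dmax)
    return coeff.mean(axis=1)                     # grey relational grade per factor

# Illustrative series only: emissions as reference, two candidate drivers.
emissions = [5.0, 5.6, 6.3, 7.1, 7.8, 8.4, 9.1, 9.9]
energy    = [10.0, 11.2, 12.5, 14.1, 15.6, 16.9, 18.3, 19.8]   # tracks closely
pop       = [13.0, 13.2, 13.1, 13.4, 13.2, 13.5, 13.3, 13.6]   # noisier link

grades = grey_relational_grades(emissions, [energy, pop])
print(grades)   # the higher grade marks the stronger linkage
```

Ranking the grades reproduces the kind of ordering the paper reports (energy consumption closest to emissions, population weakest), and the method works even on short series, which is GRA's main appeal here.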
Procedia PDF Downloads 403
22063 Estimation of Longitudinal Dispersion Coefficient Using Tracer Data
Authors: K. Ebrahimi, Sh. Shahid, M. Mohammadi Ghaleni, M. H. Omid
Abstract:
The longitudinal dispersion coefficient is a crucial parameter for 1-D water quality analysis of riverine flows. So far, different types of empirical equations for estimating the coefficient have been developed, based on various case studies. The main objective of this paper is to develop an empirical equation for estimating the coefficient for a riverine flow. For this purpose, a set of salt-tracer experiments was conducted at three sections located downstream in a lengthy canal. Tracer data were measured at three mixing lengths along the canal: 45, 75, and 100 m. According to the results, the coefficients obtained from the newly developed empirical equation show an encouraging level of agreement with the theoretical values.
Keywords: coefficients, dispersion, river, tracer, water quality
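One standard way to estimate the longitudinal dispersion coefficient from tracer curves at two stations is the method of temporal moments, K = U²(σ₂² − σ₁²) / (2(t̄₂ − t̄₁)). The sketch below applies it to synthetic Gaussian concentration curves; the paper's own empirical equation is not reproduced here, and all numbers are illustrative.

```python
import numpy as np

def integ(y, t):
    """Trapezoidal integration, written out explicitly for portability."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(t)) / 2)

def temporal_moments(t, c):
    """Centroid and temporal variance of a concentration-time curve."""
    m0 = integ(c, t)
    tbar = integ(t * c, t) / m0
    var = integ((t - tbar) ** 2 * c, t) / m0
    return tbar, var

def dispersion_coefficient(t, c_up, c_down, U):
    """Method-of-moments estimate: K = U^2 (var2 - var1) / (2 (tbar2 - tbar1))."""
    tb1, v1 = temporal_moments(t, c_up)
    tb2, v2 = temporal_moments(t, c_down)
    return U ** 2 * (v2 - v1) / (2 * (tb2 - tb1))

# Synthetic Gaussian tracer clouds at two stations (illustrative numbers only).
t = np.linspace(0.0, 600.0, 2001)                       # s
c_up = np.exp(-(t - 100.0) ** 2 / (2 * 10.0 ** 2))      # upstream, sigma_t = 10 s
c_down = np.exp(-(t - 300.0) ** 2 / (2 * 25.0 ** 2))    # downstream, sigma_t = 25 s
U = 0.5                                                 # mean velocity, m/s

K = dispersion_coefficient(t, c_up, c_down, U)
print(K)                                                # m^2/s
```

The growth of the temporal variance between stations is what carries the dispersion information, which is why measurements at several mixing lengths (here 45, 75, and 100 m in the experiments) are needed.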
Procedia PDF Downloads 389
22062 Game-Based Learning in a Higher Education Course: A Case Study with Minecraft Education Edition
Authors: Salvador Antelmo Casanova Valencia
Abstract:
This study documents the use of the Minecraft Education Edition application to explore immersive game-based learning environments. We analyze the contributions of fourth-year university students pursuing a degree in Administrative Computing at the Universidad Michoacana de San Nicolas de Hidalgo. Descriptive data and statistical inference are detailed in a quasi-experimental design using the Wilcoxon test, and the instruments provide data validation. Game-based learning in immersive environments necessarily implies greater student participation and commitment, resulting in significant improvements in study and motivation and promoting cooperation and autonomous learning.
Keywords: game-based learning, gamification, higher education, Minecraft
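The Wilcoxon signed-rank statistic used in the quasi-experimental design can be computed by hand as below. The pre/post scores are invented for illustration; in practice a statistics library (e.g. scipy.stats.wilcoxon) would normally be used.

```python
def wilcoxon_w(pre, post):
    """Wilcoxon signed-rank statistic W for paired pre/post scores.
    Ties in |d| receive average ranks; zero differences are dropped."""
    diffs = [b - a for a, b in zip(pre, post) if b != a]
    absd = sorted((abs(d), i) for i, d in enumerate(diffs))
    ranks = [0.0] * len(diffs)
    k = 0
    while k < len(absd):
        j = k
        while j < len(absd) and absd[j][0] == absd[k][0]:
            j += 1                      # j-k tied values share ranks k+1 .. j
        avg = (k + 1 + j) / 2
        for _, i in absd[k:j]:
            ranks[i] = avg
        k = j
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    return min(w_plus, w_minus)

# Illustrative pre/post test scores for one group (not the study's data):
pre  = [55, 60, 48, 70, 62, 58, 65, 50]
post = [68, 72, 60, 75, 70, 57, 80, 66]
print(wilcoxon_w(pre, post))
```

The computed W is then compared against a signed-rank critical-value table for the given number of pairs to decide whether the pre/post difference is significant.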
Procedia PDF Downloads 163
22061 Determining the Direction of Causality between Creating Innovation and Technology Market
Authors: Liubov Evstigneeva
Abstract:
In this paper, an attempt is made to establish causal nexuses between innovation and international trade in Russia. The topicality of this issue is determined by the necessity of choosing policy instruments for economic modernization and the transition to innovative development. A vector autoregression (VAR) model and the Granger test are applied to Russian monthly data from 2005 until the second quarter of 2015. Both lagged import and export at the national level cause innovation, while innovation begins to stimulate foreign trade only at more distant lags. Compared to the aggregate data, the results by patent category are more diverse. Importing technologies from foreign countries stimulates patent activity, while innovations created in Russia are a Granger cause only of imports to the Commonwealth of Independent States.
Keywords: export, import, innovation, patents
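A bare-bones version of the Granger test the paper applies can be written as two nested OLS regressions and an F-statistic: if adding lags of x significantly reduces the residual sum of squares when predicting y, then x "Granger-causes" y. The simulated series below are illustrative, not the Russian trade and patent data.

```python
import numpy as np

def granger_f(x, y, p=2):
    """F-statistic for 'x Granger-causes y' with p lags, via two OLS fits."""
    n = len(y)
    rows = range(p, n)
    Y = np.array([y[t] for t in rows])
    X_r = np.array([[1.0] + [y[t - k] for k in range(1, p + 1)] for t in rows])
    X_u = np.array([[1.0] + [y[t - k] for k in range(1, p + 1)]
                          + [x[t - k] for k in range(1, p + 1)] for t in rows])
    rss = lambda X: float(np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2))
    rss_r, rss_u = rss(X_r), rss(X_u)
    df2 = len(Y) - X_u.shape[1]
    return ((rss_r - rss_u) / p) / (rss_u / df2)

rng = np.random.default_rng(2)
T = 200
x = rng.normal(size=T)
y = np.zeros(T)
for t in range(2, T):                 # y depends on lagged x: x should "cause" y
    y[t] = 0.3 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()

f_xy = granger_f(x, y)                # expected large: x helps predict y
f_yx = granger_f(y, x)                # expected near 1: y does not help predict x
print(f_xy, f_yx)
```

The F-statistics are compared against the F(p, df2) distribution; testing both directions, as above, is how the paper determines the direction of causality between innovation and trade.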
Procedia PDF Downloads 321
22060 Using Inverted 4-D Seismic and Well Data to Characterise Reservoirs from Central Swamp Oil Field, Niger Delta
Authors: Emmanuel O. Ezim, Idowu A. Olayinka, Michael Oladunjoye, Izuchukwu I. Obiadi
Abstract:
Monitoring of reservoir properties prior to well placement and production is a requirement for optimized and efficient oil and gas production. This is usually done using well log analyses and 3-D seismic, which are often prone to errors. However, 4-D (time-lapse) seismic, which incorporates numerous 3-D seismic surveys of the same field with the same acquisition parameters and portrays the transient changes in the reservoir due to production effects over time, can be utilized because it generates better resolution. There is, however, a dearth of information on the applicability of this approach in the Niger Delta. This study was therefore designed to apply 4-D seismic, well-log, and geologic data to the monitoring of reservoirs in the EK field of the Niger Delta. It aimed at locating bypassed accumulations and ensuring effective reservoir management. The field (EK) covers an area of about 1200 km² belonging to the early (18 Ma) Miocene. Data covering two 4-D vintages acquired over a fifteen-year interval were obtained from oil companies operating in the field. The data were analyzed to determine the seismic structures, horizons, well-to-seismic tie (WST), and wavelets. Well logs and production-history data from fifteen selected wells were also collected from the oil companies. Formation evaluation, petrophysical analysis, and inversion, alongside geological data, were undertaken using Petrel, Shell-nDi, Techlog, and Jason software. Well-to-seismic ties, formation evaluation, and saturation monitoring using petrophysical and geological data and software were used to find bypassed hydrocarbon prospects. The seismic vintages were interpreted, and the amounts of change in the reservoir were defined by the differences in acoustic impedance (AI) inversions of the base and the monitor seismic. AI rock properties were estimated from all the seismic amplitudes using controlled sparse-spike inversion. The estimated rock properties were used to produce AI maps.
The structural analysis showed the dominance of NW-SE trending rollover collapsed-crest anticlines in EK, with hydrocarbons trapped northwards. There were good ties in wells EK 27 and 39. The analyzed wavelets revealed consistent amplitude and phase for the WST; hence, a good match between the inverted impedance and the well data. Evidence of large pay thickness, ranging from 2875 ms (11420 ft TVDSS) to about 2965 ms, was found around the EK 39 well, with good yield properties. The comparison between the base and the current monitor AI and the generated AI maps revealed zones of untapped hydrocarbons and assisted in determining fluid movement. The inverted sections through EK 27 and 39 (within 3101 m to 3695 m) indicated depletion in the reservoirs. The extent of the present non-uniform gas-oil and oil-water contact movements was from 3554 to 3575 m. The 4-D seismic approach led to better reservoir characterization, well development, and the location of deeper and bypassed hydrocarbon reservoirs.
Keywords: reservoir monitoring, 4-D seismic, well placements, petrophysical analysis, Niger delta basin
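The time-lapse logic described above, differencing acoustic impedance (AI = density × velocity) between the base and monitor surveys to flag changed zones, can be sketched on a toy grid. The numbers are invented; a real workflow would difference inverted AI volumes from the two vintages, not hand-built arrays.

```python
import numpy as np

# Toy 5x5 map view. Depletion (pressure drop, fluid substitution) lowers the
# velocity in the swept zone, so the monitor-minus-base AI difference is
# non-zero exactly where production has changed the reservoir.
rho = np.full((5, 5), 2200.0)                     # bulk density, kg/m^3
v_base = np.full((5, 5), 2500.0)                  # P-velocity, m/s, baseline survey
v_mon = v_base.copy()
v_mon[2:4, 2:4] -= 150.0                          # velocity drop in the depleted zone

ai_base = rho * v_base                            # acoustic impedance, base
ai_mon = rho * v_mon                              # acoustic impedance, monitor
diff = ai_mon - ai_base                           # 4-D (time-lapse) AI difference

changed = np.argwhere(np.abs(diff) > 0)
print(changed.tolist())                           # cells flagged as changed
```

Zones where the difference is near zero despite being inside the mapped reservoir are candidates for bypassed (unswept) hydrocarbons, which is the interpretation step the study performs on its AI maps.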
Procedia PDF Downloads 116
22059 Attributes That Influence Respondents When Choosing a Mate in Internet Dating Sites: An Innovative Matching Algorithm
Authors: Moti Zwilling, Srečko Natek
Abstract:
This paper presents an innovative predictive analytics approach to finding the best match between two consumers who strive to find a partner on internet dating sites. The methodology is based on an analysis of consumer preferences and involves data mining and machine learning search techniques. The study is composed of two parts. The first part examines, by means of descriptive statistics, the correlations between a set of parameters describing men and women who intend to meet each other through social media, usually the internet. In this part, several hypotheses were examined and statistical analyses were performed. Results show that there is a strong correlation between the affiliated attributes of men and women with respect to how they present themselves in a social medium such as Facebook. One interesting finding is the strong desire of most respondents to develop a serious relationship. In the second part, the authors used common data mining algorithms to search for and classify the most important attributes that affect the response rate of the other side. Results show that personal presentation and educational background are the most influential attributes for achieving a positive attitude toward one's profile from a potential mate.
Keywords: dating sites, social networks, machine learning, decision trees, data mining
Procedia PDF Downloads 293
22058 Analysis of Cardiovascular Diseases Using Artificial Neural Network
Authors: Jyotismita Talukdar
Abstract:
In this paper, a study has been made of the possibility and accuracy of early prediction of several heart diseases using an Artificial Neural Network (ANN). The study was conducted in both noise-free and noisy environments. The data collected for this analysis come from five hospitals: around 1500 heart patients' records were collected and studied. The data were analysed, and the results were compared with the doctors' diagnoses. It was found that, in a noise-free environment, the accuracy varies from 74% to 92%, and in a noisy environment (2 dB), the accuracy varies from 62% to 82%. In the present study, the four basic attributes considered are Blood Pressure (BP), Fasting Blood Sugar (FBS), Thalach (THAL), and Cholesterol (CHOL). The highest accuracy (93%) was achieved in the case of PPI (Post-Permanent-Pacemaker Implantation), around 79% in the case of CAD (Coronary Artery Disease), 87% in DCM (Dilated Cardiomyopathy), 89% in the case of RHD & MS (Rheumatic Heart Disease with Mitral Stenosis), 75% in the case of RBBB + LAFB (Right Bundle Branch Block + Left Anterior Fascicular Block), and 72% for CHB (Complete Heart Block). The lowest accuracy was obtained in the case of ICMP (Ischemic Cardiomyopathy), about 38%, and AF (Atrial Fibrillation), about 60 to 62%.
Keywords: coronary heart disease, chronic stable angina, sick sinus syndrome, cardiovascular disease, cholesterol, Thalach
Procedia PDF Downloads 174
22057 Damage Detection in a Cantilever Beam under Different Excitation and Temperature Conditions
Authors: A. Kyprianou, A. Tjirkallis
Abstract:
Condition monitoring of structures in service is very important, as it provides information about the risk of damage development. One of the essential constituents of structural condition monitoring is the damage detection methodology. In the context of condition monitoring of in-service structures, a damage detection methodology analyses data obtained from the structure while it is in operation. Usually, this means that the data could be affected by operational and environmental conditions in a way that could mask the effects of possible damage on the data. Depending on the damage detection methodology, this could lead either to false alarms or to missed damage. In this article, a damage detection methodology based on spatio-temporal continuous wavelet transform (SPT-CWT) analysis of a sequence of experimental time responses of a cantilever beam is proposed. The cantilever is subjected to white and pink noise excitation to simulate different operating conditions. In addition, to simulate changing environmental conditions, the cantilever is heated with a heat gun. The response of the cantilever beam is measured by a high-speed camera. Edges are extracted from the series of images of the beam response captured by the camera. Subsequent processing of the edges gives a series of time responses at 439 points on the beam. This sequence is then analyzed using the SPT-CWT to identify damage. The proposed algorithm was able to clearly identify damage under any condition when the structure was excited by white noise. In the case of white noise excitation, the analysis could also reveal the position of the heat gun when it was used to heat the structure, and it could distinguish the different operating conditions, i.e., between responses due to white noise excitation and responses due to pink noise excitation.
During pink noise excitation, although damage and changing temperature were identified, it was not possible to clearly separate the effect of damage from that of temperature. The methodology proposed in this article for damage detection enables the separation of the damage effect from the effects of temperature and excitation on data obtained from measurements of a cantilever beam. This methodology does not require information about the a priori state of the structure.
Keywords: spatiotemporal continuous wavelet transform, damage detection, data normalization, varying temperature
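A single-scale, one-dimensional sketch of the wavelet idea behind the SPT-CWT may clarify how damage is localized: a slope discontinuity in an extracted edge profile produces a peak in the wavelet response at the damage position. This is not the authors' spatio-temporal algorithm; the beam profile, kink position, and wavelet width are all assumed.

```python
import numpy as np

def ricker(n=101, width=10.0):
    """Mexican-hat (Ricker) wavelet, proportional to the 2nd derivative of a Gaussian."""
    t = np.linspace(-50, 50, n) / width
    return (1 - t ** 2) * np.exp(-t ** 2 / 2)

# Illustrative beam edge profile with a slope discontinuity at x = 0.6,
# mimicking the local stiffness change a crack introduces.
x = np.linspace(0, 1, 1001)
y = np.where(x < 0.6, x, 0.6 + 0.4 * (x - 0.6))

coeffs = np.convolve(y, ricker(), mode="same")
resp = np.abs(coeffs)
resp[:60] = 0.0            # suppress boundary effects of the finite signal
resp[-60:] = 0.0
print(x[int(np.argmax(resp))])     # peak response localises the kink
```

Because the wavelet has zero mean and zero first moment, smooth (linear) stretches of the profile give almost no response, so the maximum stands out at the discontinuity; repeating this over scales and over time frames is the essence of a spatio-temporal CWT analysis.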
Procedia PDF Downloads 279
22056 By-Line Analysis of the Determinants of Insurance Premiums: Evidence from the Tunisian Market
Authors: Nadia Sghaier
Abstract:
In this paper, we aim to identify the determinants of life and non-life insurance premiums across different lines for the Tunisian insurance market over a recent period, from 1997 to 2019. The empirical analysis is conducted using linear cointegration techniques in a panel data framework, which allow for both long- and short-run relationships. The results show evidence of a long-run relationship between premiums, losses, and financial variables (stock market indices and the interest rate). Furthermore, we find that the short-run effects of the explanatory variables differ across lines. This finding has important implications for insurance tarification and regulation.
Keywords: insurance premiums, lines, Tunisian insurance market, cointegration approach in panel data
Procedia PDF Downloads 198
22055 Development of a Methodology for Surgery Planning and Control: A Management Approach to Handle the Conflict of High Utilization and Low Overtime
Authors: Timo Miebach, Kirsten Hoeper, Carolin Felix
Abstract:
In times of competitive pressure and demographic change, hospitals have to reconsider their strategies as companies. Because operations are one of the main sources of income and, at the same time, one of the primary cost drivers, a process-oriented approach and an efficient use of resources seem to be the right way to achieve a consistent market position. Thus, efficient operating room occupancy planning is an important variable for the success and continued existence of these institutions. High utilization of resources is essential: a very high, but nevertheless sensible, capacity-oriented utilization of working systems that can be realized by avoiding downtime and by thoughtful occupancy planning. This engineering approach should help hospitals reach their break-even point. The first aim is to establish a strategy point that can be used to generate a planned throughput time. The second is to facilitate and accurately implement operation planning and control through the generation of time modules. More than 100,000 data records of the Hannover Medical School were analyzed. The records contain information about the type of operation conducted, the duration of the individual process steps, and other organization-specific data such as the operating room. Based on this database, a generally valid model was developed to define a strategy point that takes the conflict between high capacity utilization and low overtime into account. Furthermore, time modules were generated that allow simplified and flexible operation planning and control for the operating room manager. With the time modules, it is possible to reduce the high average idle times of the operating rooms and to minimize the idle-time spread.
Keywords: capacity, operating room, surgery planning and control, utilization
Procedia PDF Downloads 252
22054 The Effect of Knowledge Management in Lean Organization
Authors: Mehrnoosh Askarizadeh
Abstract:
In an ever-changing and globalized world, with new economic and global competitors competing for the same customers and resources, the pressure on organizations' competitiveness is increasing. In addition, organizations face further challenges due to an ever-growing amount of data and the ever-bigger challenge of analyzing that data and keeping it secure. Successful companies are characterized by exploiting their intellectual capital efficiently. Thus, the most valuable asset an organization has today is its employees' knowledge. To exploit it, there is a tool that supports easier handling and optimizes the use of knowledge: knowledge management. The theoretical framework and a careful review and analysis of interviews and observations resulted in six essential areas: structure, management, compensation, communication, trust, and motivation. The analysis showed that the scientific articles and literature take different perspectives, use different definitions, and are based on different theories, but in essence they all seem to arrive at the same result and conclusion, albeit from different viewpoints. Regardless of whether the focus is on management style, rewards, or communication, they all focus on the individual. The conclusion is that organizational culture affects knowledge management and the dissemination of information because of its direct impact on the individual. The largest and most important underlying factor in why we choose to participate in improvement work or share knowledge is our motivation: motivation is the reason for, and the reason behind, our actions.
Keywords: lean, lean production, knowledge management, information management, motivation
Procedia PDF Downloads 519
22053 Ficus Microcarpa Fruit Derived Iron Oxide Nanomaterials and Their Anti-bacterial, Antioxidant and Anticancer Efficacy
Authors: Fuad Abdullah Alatawi
Abstract:
Microbial infection-based diseases are a significant public health issue around the world, mainly when antibiotic-resistant bacterial strains evolve. In this research, we explored the antibacterial and anticancer potency of iron oxide (Fe₂O₃) nanoparticles prepared from F. microcarpa fruit extract. The chemical composition of the F. microcarpa fruit extract, which was used as a reducing and capping agent for nanoparticle synthesis, was examined by GC-MS/MS analysis. The prepared nanoparticles were then confirmed by various biophysical techniques, including X-ray powder diffraction (XRD), Fourier-transform infrared spectroscopy (FTIR), UV-Vis spectroscopy, Transmission Electron Microscopy (TEM), Energy Dispersive Spectroscopy (EDAX), and Dynamic Light Scattering (DLS). The antioxidant capacity of the fruit extract was determined through 2,2-diphenyl-1-picrylhydrazyl (DPPH), 2,2'-azino-bis(3-ethylbenzothiazoline-6-sulfonic acid) (ABTS), Ferric Reducing Antioxidant Power (FRAP), and Superoxide Dismutase (SOD) assays. Furthermore, the cytotoxic activity of the Fe₂O₃ NPs was determined using the (3-(4,5-dimethylthiazolyl-2)-2,5-diphenyltetrazolium bromide) (MTT) test on MCF-7 cells. In the antibacterial assay, lethal doses of the Fe₂O₃ NPs effectively inhibited the growth of gram-negative and gram-positive bacteria; surface damage, ROS production, and protein leakage are the antibacterial mechanisms of the Fe₂O₃ NPs. Concerning antioxidant activity, the fruit extract of F. microcarpa had strong antioxidant properties, which were confirmed by the DPPH, ABTS, FRAP, and SOD assays. In addition, the F. microcarpa-derived iron oxide nanomaterials greatly reduced the cell viability of MCF-7 cells. The GC-MS/MS analysis revealed the presence of 25 main bioactive compounds in the F. microcarpa extract. Overall, the findings of this research reveal that F. microcarpa-derived Fe₂O₃ nanoparticles could be employed as an alternative therapeutic agent to treat microbial infections and breast cancer in humans.
Keywords: ficus microcarpa, iron oxide, antibacterial activity, cytotoxicity
Procedia PDF Downloads 121
22052 Preparation of Indium Tin Oxide Nanoparticle-Modified 3-Aminopropyltrimethoxysilane-Functionalized Indium Tin Oxide Electrode for Electrochemical Sulfide Detection
Authors: Md. Abdul Aziz
Abstract:
Sulfide ion is water soluble, highly corrosive, toxic and harmful to the human beings. As a result, knowing the exact concentration of sulfide in water is very important. However, the existing detection and quantification methods have several shortcomings, such as high cost, low sensitivity, and massive instrumentation. Consequently, the development of novel sulfide sensor is relevant. Nevertheless, electrochemical methods gained enormous popularity due to a vast improvement in the technique and instrumentation, portability, low cost, rapid analysis and simplicity of design. Successful field application of electrochemical devices still requires vast improvement, which depends on the physical, chemical and electrochemical aspects of the working electrode. The working electrode made of bulk gold (Au) and platinum (Pt) are quite common, being very robust and endowed with good electrocatalytic properties. High cost, and electrode poisoning, however, have so far hindered their practical application in many industries. To overcome these obstacles, we developed a sulfide sensor based on an indium tin oxide nanoparticle (ITONP)-modified ITO electrode. To prepare ITONP-modified ITO, various methods were tested. Drop-drying of ITONPs (aq.) on aminopropyltrimethoxysilane-functionalized ITO (APTMS/ITO) was found to be the best method on the basis of voltammetric analysis of the sulfide ion. ITONP-modified APTMS/ITO (ITONP/APTMS/ITO) yielded much better electrocatalytic properties toward sulfide electro-οxidation than did bare or APTMS/ITO electrodes. The ITONPs and ITONP-modified ITO were also characterized using transmission electron microscopy and field emission scanning electron microscopy, respectively. Optimization of the type of inert electrolyte and pH yielded an ITONP/APTMS/ITO detector whose amperometrically and chronocoulοmetrically determined limits of detection for sulfide in aqueous solution were 3.0 µM and 0.90 µM, respectively. 
ITONP/APTMS/ITO electrodes displayed reproducible performance, were highly stable, and were not susceptible to interference by common contaminants. Thus, the developed electrode can be considered a promising tool for sensing sulfide.
Keywords: amperometry, chronocoulometry, electrocatalytic properties, ITO-nanoparticle-modified ITO, sulfide sensor
Procedia PDF Downloads 131
22051 A Study on Method for Identifying Capacity Factor Declination of Wind Turbines
Authors: Dongheon Shin, Kyungnam Ko, Jongchul Huh
Abstract:
The investigation of wind turbine degradation was carried out using nacelle wind data. Three Vestas V80-2MW wind turbines of the Sungsan wind farm on Jeju Island, South Korea were selected for this work. Five years of SCADA data from the wind farm were analyzed to draw the power curves of the turbines. The wind distribution is assumed to be a Rayleigh distribution in order to calculate the normalized capacity factor from the drawn power curve of each of the three wind turbines for each year. The results showed that the power output of the three wind turbines declined every year, with the normalized capacity factor decreasing by 0.12%/year on average.
Keywords: wind energy, power curve, capacity factor, annual energy production
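The normalized capacity factor described above can be sketched numerically: under a Rayleigh wind-speed distribution, the expected power output is the integral of the power curve weighted by the Rayleigh density, divided by rated power. The power curve below is a hypothetical simplified shape (cut-in, rated, and cut-out speeds are illustrative assumptions, not the V80's actual curve):

```python
import math

def rayleigh_pdf(v, v_mean):
    # Rayleigh wind-speed density parameterized by the mean wind speed
    return (math.pi * v / (2 * v_mean**2)) * math.exp(-math.pi * v**2 / (4 * v_mean**2))

def capacity_factor(power_curve, rated_kw, v_mean, dv=0.1, v_max=30.0):
    # Numerically integrate P(v) * f(v) dv and normalize by rated power
    energy = 0.0
    v = dv
    while v <= v_max:
        energy += power_curve(v) * rayleigh_pdf(v, v_mean) * dv
        v += dv
    return energy / rated_kw

def power_curve(v, rated_kw=2000.0):
    # Hypothetical simplified curve: cut-in 4 m/s, rated 13 m/s, cut-out 25 m/s
    if v < 4 or v > 25:
        return 0.0
    if v >= 13:
        return rated_kw
    return rated_kw * (v**3 - 4**3) / (13**3 - 4**3)

cf = capacity_factor(power_curve, 2000.0, v_mean=7.0)
```

Repeating the calculation with each year's fitted power curve, as in the study, lets the year-on-year change in `cf` be read off directly.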
Procedia PDF Downloads 433
22050 Water Quality Calculation and Management System
Authors: H. M. B. N Jayasinghe
Abstract:
Water is found almost everywhere on Earth, and water resources carry a great deal of pollution; some diseases can spread to living beings through water. To be drinkable, water must therefore undergo a number of treatments, so purification technology for wastewater is essential, and wastewater treatment plants play a major role in these issues. The procedures that follow the water treatment process have traditionally relied on manual calculations and recordings. Water purification plants may involve many manual processes, which makes the work time-consuming and delays the final evaluation and the chemical and biological treatment. To prevent these drawbacks, computerized calculation and analytical techniques are being introduced to the laboratory staff. An automated system offers a solution that guarantees rational selection. A decision support system is a way to model data and make quality decisions based upon it, and it is widely used around the world for various kinds of process automation. Decision support systems that just collect data and organize it effectively are usually called passive models; they do not suggest a specific decision but only reveal information. This web-based system adds global positioning data with map locations. Its most valuable feature is an SMS and e-mail alert service to inform the appropriate person of a critical issue. The technologies behind the system are HTML, MySQL, PHP, and other web development technologies. Existing computerized water chemistry analysis tools are not far advanced; one example is the swimming pool water quality calculator. The validity of the system has been verified by test runs and comparison with data from an existing plant.
The automated system will make the work easier, both in productivity and in quality.
Keywords: automated system, wastewater, purification technology, map location
Procedia PDF Downloads 247
22049 Zinc Oxide Nanoparticle-Doped Poly (8-Anilino-1-Napthalene Sulphonic Acid/Nat Nanobiosensors for TB Drugs
Authors: Rachel Fanelwa Ajayi, Anovuyo Jonnas, Emmanuel I. Iwuoha
Abstract:
Tuberculosis (TB) is an infectious disease caused by the bacterium Mycobacterium tuberculosis, which has a predilection for lung tissue due to its rich oxygen supply. The mycobacterial cell has a unique innate characteristic that allows it to resist human immune systems and drug treatments; hence, it is one of the most difficult of all bacterial infections to treat, let alone cure. At the same time, multi-drug-resistant TB (MDR-TB), caused by poorly managed TB treatment, is a growing problem and requires the administration of expensive and less effective second-line drugs that demand a much longer treatment duration than first-line drugs. Therefore, to address the issues of patients falling ill as a result of inappropriate dosing and inadequate treatment administration, a device with a fast response time coupled with enhanced performance and increased sensitivity is essential. This study involved the synthesis of electroactive platforms for application in the development of nano-biosensors suitable for the appropriate dosing of clinically diagnosed patients by promptly quantifying levels of the TB drug isoniazid. These nano-biosensor systems were developed on gold surfaces using the enzyme N-acetyltransferase 2 coupled to cysteamine-modified poly(8-anilino-1-napthalene sulphonic acid)/zinc oxide nanocomposites. The morphology of the ZnO nanoparticles, the PANSA/ZnO nanocomposite, and the nano-biosensor platforms was characterized using High-Resolution Transmission Electron Microscopy (HRTEM) and High-Resolution Scanning Electron Microscopy (HRSEM). The elemental composition of the developed nanocomposites and nano-biosensors was studied using Fourier Transform Infra-Red Spectroscopy (FTIR) and Energy Dispersive X-Ray spectroscopy (EDX).
The electrochemical studies showed an increase in electron conductivity for the PANSA/ZnO nanocomposite, an indication that it is suitable as a platform for biosensor development.
Keywords: N-acetyltransferase 2, isoniazid, tuberculosis, zinc oxide
Procedia PDF Downloads 373
22048 Analyzing the Effectiveness of a Bank of Parallel Resistors, as a Burden Compensation Technique for Current Transformer's Burden, Using LabVIEW™ Data Acquisition Tool
Authors: Dilson Subedi
Abstract:
Current transformers are an integral part of the power system because they provide a proportional, safe amount of current for protection and measurement applications. However, due to the upgrade of electromechanical relays to numerical relays and of electromechanical energy meters to digital meters, the connected burden, which defines some of the CT characteristics, has drastically reduced. This has led to the system experiencing high currents that damage the connected relays and meters. Since protection and metering equipment is designed to withstand only a certain amount of current with respect to time, these high currents pose a risk to man and equipment. During such instances, the CT saturation characteristics have a huge influence on the safety of both man and equipment and on the reliability of the protection and metering system. This paper shows the effectiveness of a bank of parallel-connected resistors, as a burden compensation technique, in compensating the burden of under-burdened CTs. The response of the CT in the case of failure of one or more resistors at different levels of overcurrent will be captured using LabVIEW™ data acquisition hardware (DAQ). The analysis is done on real-time data gathered using LabVIEW™. Variation of current transformer saturation characteristics with changes in burden will be discussed.
Keywords: accuracy limiting factor, burden, burden compensation, current transformer
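A minimal sketch of the burden arithmetic behind the bank: the equivalent resistance of N parallel resistors sets the connected burden, and the failure of one resistor raises the equivalent resistance (and hence the burden in VA at a given secondary current). The resistor values are illustrative, not taken from the paper's test setup:

```python
def parallel_resistance(resistors):
    # Equivalent resistance of resistors connected in parallel: 1/R = sum(1/R_i)
    return 1.0 / sum(1.0 / r for r in resistors)

def burden_va(r_burden_ohm, secondary_current_a):
    # CT burden in volt-amperes at a given secondary current: S = I^2 * R
    return secondary_current_a**2 * r_burden_ohm

bank = [10.0, 10.0, 10.0, 10.0]              # four 10-ohm resistors in parallel
healthy = parallel_resistance(bank)          # 2.5 ohm with all resistors intact
one_failed = parallel_resistance(bank[:3])   # equivalent resistance rises if one fails
```

With a 5 A secondary, `burden_va(healthy, 5.0)` gives 62.5 VA; losing a resistor pushes this up, which is the effect the DAQ captures.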
Procedia PDF Downloads 245
22047 Synthesis and Characterization of Sulfonated Aromatic Hydrocarbon Polymers Containing Trifluoromethylphenyl Side Chain for Proton Exchange Membrane Fuel Cell
Authors: Yi-Chiang Huang, Hsu-Feng Lee, Yu-Chao Tseng, Wen-Yao Huang
Abstract:
Proton exchange membranes, as a key component of fuel cells, have been widely studied over the past few decades. Proton exchange membranes should have several main characteristics, such as good mechanical properties, high oxidative stability, and high proton conductivity. In this work, trifluoromethyl groups were introduced on the polymer backbone and phenyl side chain, which allows densely located sulfonic acid group substitution and also promotes solubility and thermal and oxidative stability. Herein, a series of novel sulfonated aromatic hydrocarbon polyelectrolytes was synthesized by polycondensation of 4,4''''-difluoro-3,3''''- bis(trifluoromethyl)-2'',3''-bis(3-(trifluoromethyl)phenyl)-1,1':4',1'':4'',1''':4''',1''''-quinquephenyl with 2'',3''',5'',6''-tetraphenyl-[1,1':4',1'': 4'',1''':4''',1''''-quinquephenyl]-4,4''''-diol, and post-sulfonation with chlorosulfonic acid gave sulfonated polymers (SFC3-X) possessing ion exchange capacities of 1.93, 1.91, and 2.53 mmol/g. ¹H NMR and FT-IR spectroscopy were applied to confirm the structure and composition of the sulfonated polymers. The membranes exhibited considerable dimensional stability (10-27.8% change in length; 24-56.5% change in thickness) and excellent oxidative stability (weight remaining higher than 97%). The membranes demonstrated good tensile strength on account of the highly rigid multi-phenylated backbone; Young's moduli ranged from 0.65 to 0.77 GPa, much larger than that of Nafion 211 (0.10 GPa). Proton conductivities of the membranes ranged from 130 to 240 mS/cm at 80 °C under full humidification, comparable to or higher than that of Nafion 211 (150 mS/cm). The morphology of the membranes was investigated by transmission electron microscopy, which demonstrated a clear hydrophilic/hydrophobic phase separation with spherical ionic clusters in the size range of 5-20 nm.
A single fuel cell with SFC3-1.97 demonstrated a maximum power density of 1.08 W/cm², against 1.24 W/cm² for the Nafion 211 reference in this work. The results indicate that the SFC3-X polymers are good candidates for proton exchange membranes in fuel cell applications. Fuel cell tests of the other membranes are underway.
Keywords: fuel cells, polyelectrolyte, proton exchange membrane, sulfonated polymers
Procedia PDF Downloads 453
22046 Exploring Ways Early Childhood Teachers Integrate Information and Communication Technologies into Children's Play: Two Case Studies from the Australian Context
Authors: Caroline Labib
Abstract:
This paper reports on a qualitative study exploring the approaches teachers used to integrate computers or smart tablets into their program planning. Their aim was to integrate ICT into children's play, thereby supporting children's learning and development. Data was collected in preschool settings in Melbourne in 2016. Interviews with teachers, observations of teacher interactions with children, and copies of teachers' planning and observation documents informed the study. The paper looks closely at findings from two early childhood settings and focuses on exploring the differing approaches two EC teachers adopted when integrating iPads or computers into their settings. Data analysis revealed three key approaches, which have been labelled free digital play, guided digital play, and teacher-led digital use. Importantly, teacher decisions were influenced by the interplay between the opportunities the ICT tools offered, the teachers' prior knowledge and experience of ICT, and children's learning needs and contexts. This paper is a snapshot of two early childhood settings; further research will encompass data from six more early childhood settings in Victoria with the aim of exploring a wide range of motivating factors for early childhood teachers trying to integrate ICT into their programs.
Keywords: early childhood education (ECE), digital play, information and communication technologies (ICT), play, teachers' interaction approaches
Procedia PDF Downloads 212
22045 Maximum Likelihood Estimation Methods on a Two-Parameter Rayleigh Distribution under Progressive Type-II Censoring
Authors: Daniel Fundi Murithi
Abstract:
Data from economic, social, clinical, and industrial studies are often incomplete or incorrect due to censoring, and such data may have adverse effects if used in an estimation problem. We propose the use of Maximum Likelihood Estimation (MLE) under a progressive type-II censoring scheme to remedy this problem. In particular, maximum likelihood estimates (MLEs) for the location (µ) and scale (λ) parameters of the two-parameter Rayleigh distribution are obtained under a progressive type-II censoring scheme using the Expectation-Maximization (EM) and Newton-Raphson (NR) algorithms. These algorithms are compared because both iteratively produce satisfactory results in the estimation problem. The progressive type-II censoring scheme is used because it allows the removal of test units before the termination of the experiment. Approximate asymptotic variances and confidence intervals for the location and scale parameters are derived and constructed. The efficiency of the EM and NR algorithms is compared in terms of root mean squared error (RMSE), bias, and coverage rate. The simulation study showed that in most simulation cases, the estimates obtained using the Expectation-Maximization algorithm had smaller biases, smaller variances, narrower confidence intervals, and smaller root mean squared errors than those generated via the Newton-Raphson (NR) algorithm. Further, the analysis of a real-life data set (data from simple experimental trials) showed that the Expectation-Maximization (EM) algorithm performs better than the Newton-Raphson (NR) algorithm in all simulation cases under the progressive type-II censoring scheme.
Keywords: expectation-maximization algorithm, maximum likelihood estimation, Newton-Raphson method, two-parameter Rayleigh distribution, progressive type-II censoring
Procedia PDF Downloads 163
22044 Deepfake Detection for Compressed Media
Authors: Sushil Kumar Gupta, Atharva Joshi, Ayush Sonawale, Sachin Naik, Rajshree Khande
Abstract:
The use of artificially created videos and audio produced by deep learning is a major problem in the current media landscape, as it fuels misinformation and distrust. The objective of this work is to build a reliable deepfake detection model using deep learning that helps detect forged videos accurately. In this work, CelebDF v1, one of the largest deepfake benchmark datasets in the literature, is adopted to train and test the proposed models. The data includes authentic and synthetic videos of high quality, allowing an assessment of the model's performance against realistic distortions.
Keywords: deepfake detection, CelebDF v1, convolutional neural network (CNN), xception model, data augmentation, media manipulation
Procedia PDF Downloads 9
22043 The Impact of Financial Risk on Banks’ Financial Performance: A Comparative Study of Islamic Banks and Conventional Banks in Pakistan
Authors: Mohammad Yousaf Safi Mohibullah Afghan
Abstract:
This study of Islamic and conventional banks scrutinizes the effect of credit and liquidity risks on the profitability of Islamic and conventional banks that operate in Pakistan. Only 4 Islamic and 18 conventional banks have been selected, to enrich the comparison of Islamic banks' performance with that of conventional banks. The selection of banks for the panel is based on collecting quarterly unbalanced data ranging from the first quarter of 2007 to the last quarter of 2017. The data are collected from the banks' websites and the State Bank of Pakistan, and the analysis is carried out with the Delta-method test, which is used to derive the empirical results. In collecting data on the banks, return on assets and return on equity have been the major factors used as significant proxies in determining the profitability of the banks. Moreover, the loan loss provision to total loans and the ratio of liquid assets to total liabilities are used as proxies in measuring credit and liquidity risks. Meanwhile, with consideration of the previous literature, other variables such as bank size, bank capital, bank branches, and bank employees have been used to tentatively control for factors whose direct and indirect effects on profitability are understood. In conclusion, the study finds that credit risk affects return on assets and return on equity positively, and there is no significant difference in terms of credit risk between Islamic and conventional banks. Similarly, liquidity risk has a significant impact on bank profitability, though the marginal effect of liquidity risk is higher for Islamic banks than for conventional banks.
Keywords: islamic & conventional banks, performance return on equity, return on assets, pakistan banking sectors, profitability
Procedia PDF Downloads 163
22042 Development of Time Series Forecasting Model for Dengue Cases in Nakhon Si Thammarat, Southern Thailand
Authors: Manit Pollar
Abstract:
Identifying dengue epidemic periods early would help in taking the necessary actions to prevent dengue outbreaks. An accurate prediction of dengue epidemic seasons allows local authorities sufficient time to take the necessary decisions and actions to safeguard the situation. This study aimed to develop a forecasting model for the number of dengue incidences in Nakhon Si Thammarat Province, Southern Thailand, using time series analysis. We developed Seasonal Autoregressive Integrated Moving Average (SARIMA) models on the monthly data collected between 2003-2011 and validated the models using data collected between January-September 2012. The results revealed that the SARIMA(1,1,0)(1,2,1)12 model closely described the trends and seasons of dengue incidence and confirmed the existence of dengue fever cases in Nakhon Si Thammarat for the years 2003-2011. The study showed that the one-step approach to predicting dengue incidences provided significantly more accurate predictions than the twelve-step approach. The model, even if based purely on statistical data analysis, can provide a useful basis for the allocation of resources for disease prevention.
Keywords: SARIMA, time series model, dengue cases, Thailand
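The (1,1,0)(1,2,1)12 specification implies one regular difference and two seasonal differences at lag 12 before the AR and MA terms are fit. A minimal sketch of that differencing step in plain Python (the toy series is illustrative; fitting the AR/MA coefficients themselves is left to a statistics package):

```python
def difference(series, lag=1):
    # y_t - y_{t-lag}: lag=1 removes trend, lag=12 removes an annual season
    return [series[i] - series[i - lag] for i in range(lag, len(series))]

# Toy monthly series: linear annual trend plus a period-12 seasonal pattern
monthly = [float(i % 12 + i // 12) for i in range(48)]

d1 = difference(monthly, lag=1)                      # non-seasonal difference (d=1)
d1_s2 = difference(difference(d1, lag=12), lag=12)   # two seasonal differences (D=2)
```

For this toy series, trend and season are entirely removed by the differencing, leaving a zero series; on real monthly dengue counts the residual series is what the seasonal AR(1) and MA(1) terms then model.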
Procedia PDF Downloads 358
22041 Spatial Mapping of Variations in Groundwater of Taluka Islamkot Thar Using GIS and Field Data
Authors: Imran Aziz Tunio
Abstract:
Islamkot is an underdeveloped sub-district (Taluka) in the Tharparkar district of Sindh province, Pakistan, located between latitude 24°25'19.79"N to 24°47'59.92"N and longitude 70° 1'13.95"E to 70°32'15.11"E. Islamkot has an arid desert climate, and the region is generally devoid of perennial rivers, canals, and streams. It is highly dependent on rainfall, which is not considered a reliable surface water source, and groundwater has been the only key source of water for many centuries. To assess the groundwater potential, an electrical resistivity survey (ERS) was conducted in Islamkot Taluka. Groundwater investigations for 128 Vertical Electrical Soundings (VES) were collected to determine the groundwater potential and obtain qualitative and quantitative layered resistivity parameters. The PASI Model 16 GL-N Resistivity Meter was used with a Schlumberger electrode configuration, with half current electrode spacing (AB/2) ranging from 1.5 to 100 m and potential electrode spacing (MN/2) from 0.5 to 10 m. The data was acquired with a maximum current electrode spacing of 200 m. The data processing for the delineation of dune sand aquifers involved data inversion, and the interpretation of the inversion results was aided by forward modeling. The measured geo-electrical parameters were examined with Interpex IX1D software, and apparent resistivity curves and synthetic model layered parameters were mapped in the ArcGIS environment using the Inverse Distance Weighting (IDW) interpolation technique. Qualitative interpretation of the vertical electrical sounding (VES) data shows that the number of geo-electrical layers in the area varies from three to four, with different resistivity values detected. Out of the 128 VES model curves, 42 are three-layered and 86 are four-layered. The resistivity of the first subsurface layer (loose surface sand) varied from 16.13 Ωm to 3353.3 Ωm and its thickness from 0.046 m to 17.52 m.
The resistivity of the second subsurface layer (semi-consolidated sand) varied from 1.10 Ωm to 7442.8 Ωm and its thickness from 0.30 m to 56.27 m. The resistivity of the third subsurface layer (consolidated sand) varied from 0.00001 Ωm to 3190.8 Ωm and its thickness from 3.26 m to 86.66 m. The resistivity of the fourth subsurface layer (silt and clay) varied from 0.0013 Ωm to 16264 Ωm and its thickness from 13.50 m to 87.68 m. The Dar Zarrouk parameters are: longitudinal unit conductance S from 0.00024 to 19.91 mho; transverse unit resistance T from 7.34 to 40080.63 Ωm²; longitudinal resistivity RS from 1.22 to 3137.10 Ωm; and transverse resistivity RT from 5.84 to 3138.54 Ωm. The ERS data and Dar Zarrouk parameters were mapped, revealing that the study area has groundwater potential in the subsurface.
Keywords: electrical resistivity survey, GIS & RS, groundwater potential, environmental assessment, VES
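The Dar Zarrouk parameters quoted above follow directly from each sounding's layer thicknesses and resistivities: S = Σ hᵢ/ρᵢ, T = Σ hᵢρᵢ, RS = H/S, and RT = T/H, where H is the total thickness. A sketch with a hypothetical three-layer column (values are illustrative, not from the survey):

```python
def dar_zarrouk(layers):
    """Dar Zarrouk parameters from (thickness_m, resistivity_ohm_m) layer pairs."""
    H = sum(h for h, _ in layers)            # total column thickness
    S = sum(h / rho for h, rho in layers)    # longitudinal unit conductance (mho)
    T = sum(h * rho for h, rho in layers)    # transverse unit resistance (ohm-m^2)
    RS = H / S                               # longitudinal resistivity (ohm-m)
    RT = T / H                               # transverse resistivity (ohm-m)
    return S, T, RS, RT

# Hypothetical column: (thickness m, resistivity ohm-m) per layer
S, T, RS, RT = dar_zarrouk([(5.0, 100.0), (20.0, 50.0), (40.0, 10.0)])
```

Computing these per VES station yields the point values that IDW then interpolates into the GIS maps.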
Procedia PDF Downloads 110
22040 Precipitation Intensity: Duration Based Threshold Analysis for Initiation of Landslides in Upper Alaknanda Valley
Authors: Soumiya Bhattacharjee, P. K. Champati Ray, Shovan L. Chattoraj, Mrinmoy Dhara
Abstract:
The entire Himalayan range is globally renowned for rainfall-induced landslides. The prime focus of the study is to determine a rainfall-based threshold for the initiation of landslides that can be used as an important component of an early warning system alerting stakeholders. This research deals with the temporal dimension of slope failures due to extreme rainfall events along National Highway-58 from Karanprayag to Badrinath in the Garhwal Himalaya, India. Post-processed 3-hourly rainfall intensity data and the corresponding durations derived from daily rainfall data available from the Tropical Rainfall Measuring Mission (TRMM) were used as the prime source of rainfall data. Landslide event records from the Border Road Organization (BRO) and some ancillary landslide inventory data for 2013 and 2014 were used to determine the Intensity-Duration (ID) rainfall threshold. The derived governing threshold equation, I = 4.738D^(-0.025), has been considered for the prediction of landslides in the study region; it was validated against the landslides of August and September 2014 with an accuracy of 70%. From the obtained results and validation, it can be inferred that this equation can be used for the initiation of landslides in the study area as part of an early warning system. Results can improve significantly with ground-based rainfall estimates and a better database of landslide records. Thus, the study has demonstrated a very low-cost method to get first-hand information on the possibility of an impending landslide in any region, thereby providing alerts and better preparedness for landslide disaster mitigation.
Keywords: landslide, intensity-duration, rainfall threshold, TRMM, slope, inventory, early warning system
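The governing equation can be applied directly as a screening rule: for a rainfall event of duration D hours, the threshold intensity is I = 4.738·D^(-0.025), and an event at or above the curve flags a possible landslide. A minimal sketch (the alerting wrapper is an illustration, not the authors' system; intensity units as in the study):

```python
def threshold_intensity(duration_h):
    # Fitted intensity-duration threshold from the study: I = 4.738 * D^(-0.025)
    return 4.738 * duration_h ** -0.025

def exceeds_threshold(intensity, duration_h):
    # An event at or above the threshold curve is flagged as landslide-triggering
    return intensity >= threshold_intensity(duration_h)
```

The small negative exponent makes the curve nearly flat: the threshold intensity decreases only slowly as event duration grows, so even long events need intensities close to 4.7 to be flagged.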
Procedia PDF Downloads 273