Search results for: web usage data
23760 Functional and Efficient Query Interpreters: Principle, Application and Performances’ Comparison
Authors: Laurent Thiry, Michel Hassenforder
Abstract:
This paper presents a general approach to implementing efficient query interpreters in a functional programming language. Most of the standard tools currently available use an imperative and/or object-oriented language for the implementation (e.g., Java for Jena-Fuseki), but other paradigms are possible, potentially with better performance. The paper first explains how to model data structures and queries from a functional point of view. It then proposes a general methodology to measure performance (i.e., the number of computation steps needed to answer a query) and explains how to integrate optimization techniques (short-cut fusion and, more importantly, data transformations). Finally, it compares the proposed functional server to a standard tool (Fuseki), demonstrating that the former can be two to ten times faster at answering queries.
Keywords: data transformation, functional programming, information server, optimization
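As a rough illustration of the fusion idea described above (a sketch only; the paper itself works in a functional language over RDF-style data, and the data and query below are invented), the following Python fragment contrasts an unfused two-pass query with its fused one-pass equivalent:

```python
from typing import Callable, Iterable, Tuple

Triple = Tuple[str, str, str]

DATA: list = [
    ("alice", "knows", "bob"),
    ("bob", "knows", "carol"),
    ("alice", "likes", "music"),
]

def match(pred: str) -> Callable[[Triple], bool]:
    return lambda t: t[1] == pred

def subject(t: Triple) -> str:
    return t[0]

# Unfused: two passes over the data, one intermediate list.
def query_unfused(data: Iterable) -> list:
    kept = [t for t in data if match("knows")(t)]   # pass 1
    return [subject(t) for t in kept]               # pass 2

# Fused: one pass, no intermediate structure (short-cut fusion done by hand).
def query_fused(data: Iterable) -> list:
    return [subject(t) for t in data if match("knows")(t)]

assert query_unfused(DATA) == query_fused(DATA) == ["alice", "bob"]
```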
Procedia PDF Downloads 158
23759 Dimension Free Rigid Point Set Registration in Linear Time
Authors: Jianqin Qu
Abstract:
This paper proposes a rigid point set matching algorithm in arbitrary dimensions based on the idea of symmetric covariant functions. A group of functions of the points in the set is formulated using rigid invariants. Each of these functions computes a pair of correspondences from the given point set. The computed correspondences are then used to recover the unknown rigid transform parameters. Each computed point can be geometrically interpreted as the weighted mean center of the point set. The algorithm is compact, fast, and dimension free, without any optimization process. It either computes the desired transform for noiseless data in linear time or fails quickly in exceptional cases. Experimental results for synthetic data and 2D/3D real data are provided, which demonstrate potential applications of the algorithm to a wide range of problems.
Keywords: covariant point, point matching, dimension free, rigid registration
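The abstract's covariant-function construction is not reproduced here, but the final step it feeds into, recovering a rigid transform from computed correspondences, can be sketched with the classical least-squares (Kabsch) solution; the points and the 2D check below are invented:

```python
import numpy as np

def rigid_from_correspondences(P, Q):
    """Least-squares rigid transform (Kabsch) mapping point rows P onto Q,
    valid in any dimension."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)      # centroids
    H = (P - cp).T @ (Q - cq)                    # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    D = np.diag([1.0] * (P.shape[1] - 1) + [d])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

# Noiseless 2D check: rotate by 30 degrees and translate.
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
P = np.random.rand(10, 2)
Q = P @ R_true.T + np.array([1.0, 2.0])
R, t = rigid_from_correspondences(P, Q)
assert np.allclose(R, R_true) and np.allclose(t, [1.0, 2.0])
```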
Procedia PDF Downloads 168
23758 Evidence Theory Enabled Quickest Change Detection Using Big Time-Series Data from Internet of Things
Authors: Hossein Jafari, Xiangfang Li, Lijun Qian, Alexander Aved, Timothy Kroecker
Abstract:
Traditionally in sensor networks, and recently in the Internet of Things, numerous heterogeneous sensors are deployed in a distributed manner to monitor a phenomenon that can often be modeled by an underlying stochastic process. The big time-series data collected by the sensors must be analyzed to detect change in the stochastic process as quickly as possible with a tolerable false alarm rate. However, sensors may have different accuracy and sensitivity ranges, and they decay over time. As a result, the big time-series data collected by the sensors will contain uncertainties, and sometimes they are conflicting. In this study, we present a framework that takes advantage of the capabilities of Evidence Theory (a.k.a. Dempster-Shafer and Dezert-Smarandache Theories) for representing and managing uncertainty and conflict, in order to achieve fast change detection and effectively deal with complementary hypotheses. Specifically, Kullback-Leibler divergence is used as the similarity metric to calculate the distances between the estimated current distribution and the pre- and post-change distributions. Mass functions are then calculated, and the related combination rules are applied to combine the mass values among all sensors. Furthermore, we applied the method to estimate the minimum number of sensors needed for combination, so computational efficiency could be improved. A cumulative sum (CUSUM) test is then applied to the ratio of pignistic probabilities to detect and declare the change for decision-making purposes. Simulation results using both synthetic data and real data from an experimental setup demonstrate the effectiveness of the presented schemes.
Keywords: CUSUM, evidence theory, KL divergence, quickest change detection, time series data
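A minimal sketch of the CUSUM core of such a scheme (single sensor, known Gaussian pre- and post-change models; the evidence-theory fusion and pignistic step are omitted, and all numbers are illustrative):

```python
import numpy as np

def cusum_gaussian(x, mu0, mu1, sigma, h):
    """Classic CUSUM on the log-likelihood ratio between the pre-change
    N(mu0, sigma^2) and post-change N(mu1, sigma^2) models; an alarm is
    declared when the cumulative statistic exceeds the threshold h."""
    llr = (mu1 - mu0) / sigma**2 * (x - (mu0 + mu1) / 2.0)
    s = 0.0
    for k, step in enumerate(llr):
        s = max(0.0, s + step)   # reset at zero, accumulate evidence of change
        if s > h:
            return k             # first alarm index
    return None

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 200), rng.normal(1, 1, 200)])
print(cusum_gaussian(x, mu0=0.0, mu1=1.0, sigma=1.0, h=5.0))  # alarms shortly after sample 200
```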
Procedia PDF Downloads 334
23757 Analysis of Cascade Control Structure in Train Dynamic Braking System
Authors: B. Moaveni, S. Morovati
Abstract:
In recent years, the increasing use of railway transportation, especially in developing countries, has drawn more attention to the control systems of railway vehicles. Consequently, designing and implementing modern control systems to improve the operating performance of trains and locomotives has become one of the main concerns of researchers. The dynamic braking system is an important safety system that controls the amount of braking torque generated by the traction motors, keeping the adhesion coefficient between the wheel-sets and the railroad within an optimal bound. The adhesion force plays an important role in controlling the braking distance and preventing the wheels from slipping during the braking process. The cascade control structure is one of the best control methods for a wide range of industrial plants in the presence of disturbances and errors. This paper presents a cascade control structure based on two forward simple controllers with two feedback loops to control the slip ratio and braking torque. In this structure, the inner loop controls the angular velocity, and the outer loop controls the longitudinal velocity of the locomotive, whose dynamics are slower than those of the angular velocity. By controlling the torque of the DC traction motors, this control structure tracks the desired velocity profile to achieve the predefined braking distance and to control the slip ratio. Simulation results show the effectiveness of the introduced methodology for the dynamic braking system.
Keywords: cascade control, dynamic braking system, DC traction motors, slip control
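A toy sketch of the cascade structure (two PI loops; the first-order plant models, gains, and parameter values are invented, not the paper's locomotive model):

```python
class PI:
    """Simple proportional-integral controller."""
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt, self.integral = kp, ki, dt, 0.0
    def __call__(self, error):
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral

dt = 0.01
outer = PI(kp=2.0, ki=1.0, dt=dt)     # slow loop: longitudinal velocity
inner = PI(kp=200.0, ki=200.0, dt=dt) # fast loop: wheel angular velocity
r, J, b = 0.5, 50.0, 5.0              # wheel radius, inertia, friction (toy values)

v, w, v_ref = 20.0, 40.0, 5.0         # train speed (m/s), wheel speed (rad/s), target (m/s)
for _ in range(int(60 / dt)):
    w_ref = outer(v_ref - v)          # outer loop sets the wheel-speed target
    torque = inner(w_ref - w)         # inner loop sets the motor torque
    w += dt * (torque - b * w) / J    # toy wheel dynamics
    v += dt * 0.5 * (r * w - v)       # toy adhesion coupling to the train speed
print(round(v, 2))                    # approaches the 5 m/s braking target
```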
Procedia PDF Downloads 366
23756 Application of KL Divergence for Estimation of Each Metabolic Pathway Genes
Authors: Shohei Maruyama, Yasuo Matsuyama, Sachiyo Aburatani
Abstract:
The development of methods to annotate unknown gene functions is an important task in bioinformatics. One of the approaches to annotation is the identification of the metabolic pathway that genes are involved in. Gene expression data have been utilized for this identification, since they reflect various intracellular phenomena. However, it has been difficult to estimate gene function with high accuracy. The low accuracy of the estimation is thought to be caused by the difficulty of accurately measuring gene expression: even when measured under the same conditions, gene expressions usually vary. In this study, we propose a feature extraction method focusing on the variability of gene expressions to estimate genes' metabolic pathways accurately. First, we estimate the distribution of each gene expression from replicate data. Next, we calculate the similarity between all gene pairs by KL divergence, a method for calculating the similarity between distributions. Finally, we use the similarity vectors as feature vectors and train a multiclass SVM to identify the genes' metabolic pathways. To evaluate the developed method, we applied it to budding yeast and trained a multiclass SVM to identify seven metabolic pathways. As a result, the accuracy obtained with our method was higher than that calculated from the raw gene expression data. Thus, our method, combined with KL divergence, is useful for identifying genes' metabolic pathways.
Keywords: metabolic pathways, gene expression data, microarray, Kullback-Leibler divergence, KL divergence, support vector machines, SVM, machine learning
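A compact sketch of the pipeline under simplifying assumptions (Gaussian expression distributions with the closed-form KL divergence; the replicates and pathway labels below are synthetic):

```python
import numpy as np
from sklearn.svm import SVC

def kl_gauss(m1, s1, m2, s2):
    """KL divergence KL( N(m1,s1^2) || N(m2,s2^2) ) in closed form."""
    return np.log(s2 / s1) + (s1**2 + (m1 - m2) ** 2) / (2 * s2**2) - 0.5

rng = np.random.default_rng(1)
# 60 toy genes, 5 replicate measurements each.
replicates = rng.normal(loc=rng.normal(0, 2, size=60)[:, None], scale=1.0, size=(60, 5))
labels = rng.integers(0, 7, size=60)          # 7 metabolic pathways (toy labels)

mu, sd = replicates.mean(axis=1), replicates.std(axis=1, ddof=1)
# Symmetrised KL similarity vector of each gene against all genes.
K = np.array([[kl_gauss(mu[i], sd[i], mu[j], sd[j]) +
               kl_gauss(mu[j], sd[j], mu[i], sd[i])
               for j in range(len(mu))] for i in range(len(mu))])

clf = SVC(kernel="rbf").fit(K, labels)        # multiclass SVM (one-vs-one)
print(clf.predict(K[:3]))
```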
Procedia PDF Downloads 404
23755 Development of Sustainable Farming Compartment with Treated Wastewater in Abu Dhabi
Authors: Jongwan Eun, Sam Helwany, Lakshyana K. C.
Abstract:
The United Arab Emirates (UAE) is significantly dependent on desalinated water and groundwater resources, which are expensive and highly energy intensive. Despite the scarce water resources, only 54% of the recycled water was reused in 2012, and due to the lack of infrastructure for reusing recycled water, this portion is expected to decrease with growing water usage. In this study, an "Oasis" complex comprised of Sustainable Farming Compartments (SFC) is proposed for reusing treated wastewater. The wastewater is used to decrease the ambient temperature of the SFC via an evaporative cooler. The SFC prototype was designed, built, and tested in an environmentally controlled laboratory and at a field site to evaluate the feasibility and effectiveness of the SFC under various climatic conditions in Abu Dhabi. Based on the experimental results, the temperature drop achieved in the SFC was 5 °C (from 22 °C) in the laboratory and 7-15 °C at the field site (from 33-45 °C down to an average of 28 °C at relative humidity < 50%). An energy simulation using TRNSYS was performed to extend and validate the experimental results; the simulation and the experiments show statistically close agreement. The total power consumption of the SFC system was approximately three and a half times lower than that of an electrical air conditioner. Therefore, by using treated wastewater, the SFC has a promising prospect of addressing Abu Dhabi's ecological concerns related to desertification and wind erosion.
Keywords: ecological farming system, energy simulation, evaporative cooling system, temperature, treated wastewater
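The evaporative-cooler stage can be approximated with the standard direct evaporative cooling relation T_out = T_db - eps * (T_db - T_wb); the effectiveness value and conditions below are illustrative, not the paper's measurements:

```python
def evap_cooler_outlet(t_db, t_wb, effectiveness=0.8):
    """Direct evaporative cooling: outlet dry-bulb temperature,
    T_out = T_db - eps * (T_db - T_wb), with eps the saturation effectiveness."""
    return t_db - effectiveness * (t_db - t_wb)

# Roughly field-like hot-dry conditions (values illustrative only):
print(evap_cooler_outlet(t_db=40.0, t_wb=25.0))  # -> 28.0 degrees C
```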
Procedia PDF Downloads 250
23754 Resin-coated Controlled Release Fertilizer (CRF) for Oil Palm: Laboratory and Main Nursery Evaluation
Authors: Umar Adli Amran, Tan Choon Chek, Mohd Shahkhirat Norizan, Then Kek Hoe
Abstract:
Controlled release fertilizer (CRF) enables regulated nutrient release for more efficient plant uptake compared to normal granular fertilizer. It reduces nutrient loss via surface run-off and leaching, hence promoting sustainable agriculture. Although the performance of CRF in providing a consistent and timely nutrient supply is well known, its high price limits its usage in large-scale plantations. This study was conducted to evaluate the properties and performance of a bio-based polyurethane (PU)-coated CRF via a laboratory evaluation and an oil palm main nursery trial. The CRF is produced by coating a normal commercial compound granular fertilizer from FGV Fertiliser Sdn. Bhd., namely Felda 10 (10.5-8-20-3+0.5B), and is designated CRF FGV10. Based on the laboratory evaluation, CRF FGV10 can sustain nutrient release for more than 6 months. Vegetative growth parameters such as girth size, palm height, third frond length, and the total number of fronds produced were recorded. Besides that, the dry biomass of the oil palm seedlings was also determined. The evaluation proved that at a 50% reduction of the nutrient application rate and with only two applications (T3), CRF FGV10 enabled the oil palm seedlings to achieve vegetative growth similar to the control samples (T1). It was also shown that only the PU-coated CRF FGV10 allowed the reduction of the fertilizer rate and the number of application rounds.
Keywords: nutrition, oil palm seedlings, polyurethane, sustainable manuring, vegetative growth
Procedia PDF Downloads 61
23753 Impact of Instagram Food Bloggers on Consumer (Generation Z) Decision Making Process in Islamabad, Pakistan
Authors: Tabinda Sadiq, Tehmina Ashfaq Qazi, Hoor Shumail
Abstract:
Recently, the advent of emerging technology has created a new generation of restaurant marketing. This study explores the aspects that influence customers' decision-making process in selecting a restaurant after reading food bloggers' reviews online. The motivation behind this research is to investigate the correlation between the credibility of the source and consumers' attitude toward restaurant visits. The researcher collected the data by distributing a survey questionnaire through Google Forms, employing source credibility theory. A non-probability purposive sampling technique was used to collect the data. The questionnaire used a pre-developed and validated scale by Ohanian to measure the relationship. The researcher collected data from 250 respondents in order to investigate the influence of food bloggers on Generation Z's decision-making process. SPSS version 26 was used for statistical testing and data analysis. The findings of the survey revealed a moderate positive correlation between the variables, so it can be concluded that food bloggers do have an impact on Generation Z's decision-making process.
Keywords: credibility, decision making, food bloggers, generation z, e-wom
Procedia PDF Downloads 73
23752 Performance Measurement of Logistics Systems for Thailand's Wholesales and Retails Industries by Data Envelopment Analysis
Authors: Pornpimol Chaiwuttisak
Abstract:
The study aims to compare the performance of logistics for Thailand's wholesale and retail trade industries (except motor vehicles, motorcycles, and stalls) by using data envelopment analysis (DEA). The Thailand Standard Industrial Classification in 2009 (TSIC-2009) categorizes these industries into sub-group no. 45: wholesale and retail trade (except for the repair of motor vehicles and motorcycles), sub-group no. 46: wholesale trade (except motor vehicles and motorcycles), and sub-group no. 47: retail trade (except motor vehicles and motorcycles). The data used in the study were collected by the National Statistical Office, Thailand. The study used four input factors: the number of companies, the number of personnel in logistics, the training cost in logistics, and outsourced logistics management. The output factor is the percentage of enterprises having inventory management. The results showed that the average relative efficiency equals 27.87 percent for small-sized enterprises and 49.68 percent for medium-sized enterprises.
Keywords: DEA, wholesale and retail, logistics, Thailand
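A sketch of the input-oriented CCR DEA model (multiplier form) that such studies solve once per enterprise, with invented toy data standing in for the survey figures:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j):
    """Input-oriented CCR efficiency of DMU j (multiplier form):
    maximize u.y_j subject to v.x_j = 1 and u.Y_i - v.X_i <= 0 for all i.
    X: inputs (n_dmus x m), Y: outputs (n_dmus x s); variables are [u, v]."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[j], np.zeros(m)])             # linprog minimizes, so negate
    A_ub = np.hstack([Y, -X])                            # u.Y_i - v.X_i <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[j]])[None, :]  # v.x_j = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun

# Toy data: 4 enterprises, 2 inputs (staff, training cost), 1 output.
X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0]])
Y = np.array([[1.0], [1.0], [1.0], [1.0]])
print([round(ccr_efficiency(X, Y, j), 3) for j in range(4)])
```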
Procedia PDF Downloads 416
23751 Event Data Representation Based on Time Stamp for Pedestrian Detection
Authors: Yuta Nakano, Kozo Kajiwara, Atsushi Hori, Takeshi Fujita
Abstract:
In association with the wave of electric vehicles (EV), low energy consumption systems have become more and more important. One of the key technologies for realizing low energy consumption is the dynamic vision sensor (DVS), also called an event sensor or neuromorphic vision sensor. This sensor has several notable features, such as high temporal resolution (up to 1 Mframe/s) and a high dynamic range (120 dB). However, the property that contributes most to low energy consumption is its sparsity: the sensor only captures pixels whose intensity changes. In other words, there is no signal in areas without any intensity change, which makes the sensor more energy efficient than conventional sensors such as RGB cameras, because redundant data are removed. On the other hand, the data are difficult to handle because the data format is completely different from an RGB image: the acquired signals are asynchronous and sparse, and each signal is composed of an x-y coordinate, a polarity (two values: +1 or -1), and a timestamp; it does not include intensity such as RGB values. Therefore, existing algorithms cannot be used straightforwardly, and a new processing algorithm must be designed to cope with DVS data. To overcome the difficulties caused by the data format differences, most prior art builds frame data and feeds it to deep learning models such as Convolutional Neural Networks (CNN) for object detection and recognition. However, even when the data can be fed this way, it is still difficult to achieve good performance due to the lack of intensity information. Although polarity is often used as intensity instead of an RGB pixel value, polarity information is clearly not rich enough. In this context, we propose using the timestamp information as the data representation fed to deep learning. Concretely, we first build frame data divided by a certain time period, then assign an intensity value according to the timestamp of each signal within the frame; for example, a high value is given to a recent signal. We expect this data representation to capture the features of moving objects in particular, because the timestamps represent movement direction and speed. Using this proposed method, we created our own dataset with a DVS fixed on a parked car, in order to develop an application for a surveillance system that can detect persons around the car. We consider the DVS one of the ideal sensors for surveillance purposes because it can run for a long time with low energy consumption in a mostly static scene. For comparison, we reproduced a state-of-the-art method as a benchmark, which builds frames in the same way as ours but feeds polarity information to the CNN. We then measured the object detection performance of the benchmark and our method on the same dataset. As a result, our method achieved an F1 score up to 7 points higher than the benchmark.
Keywords: event camera, dynamic vision sensor, deep learning, data representation, object recognition, low energy consumption
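A minimal sketch of the timestamp-based representation described above (one frame per time window, intensity rising linearly with recency; the resolution and events below are invented):

```python
import numpy as np

def events_to_time_frame(events, h, w, t_start, t_end):
    """Build one frame from DVS events (x, y, polarity, timestamp), where
    pixel intensity encodes the recency of the latest event in the window:
    1.0 for an event at t_end, falling linearly to 0.0 at t_start.
    Events are assumed time-ordered, so later events overwrite earlier ones."""
    frame = np.zeros((h, w), dtype=np.float32)
    for x, y, p, t in events:
        if t_start <= t < t_end:
            frame[y, x] = (t - t_start) / (t_end - t_start)
    return frame

# Toy events: (x, y, polarity, timestamp in microseconds)
events = [(2, 1, +1, 100), (2, 1, -1, 900), (5, 3, +1, 500)]
print(events_to_time_frame(events, h=4, w=8, t_start=0, t_end=1000))
```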
Procedia PDF Downloads 97
23750 Comparison of Different Reanalysis Products for Predicting Extreme Precipitation in the Southern Coast of the Caspian Sea
Authors: Parvin Ghafarian, Mohammadreza Mohammadpur Panchah, Mehri Fallahi
Abstract:
Synoptic patterns from the surface up to the tropopause are very important for forecasting the weather and atmospheric conditions, and there are many tools to prepare and analyze these maps. Reanalysis data, the outputs of numerical weather prediction models, satellite images, meteorological radar, and weather station data are used in world forecasting centers to predict the weather. Forecasting extreme precipitation on the southern coast of the Caspian Sea (CS) is a major challenge due to the complex topography; in addition, these areas contain several different climate types. In this research, we used two reanalysis datasets, the fifth-generation ECMWF reanalysis (ERA5) and the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis, for verification of the numerical model. ERA5 is the latest version of the ECMWF reanalysis; its temporal resolution is hourly, while that of NCEP/NCAR is every six hours. Atmospheric parameters such as mean sea level pressure, geopotential height, relative humidity, wind speed and direction, and sea surface temperature were selected and analyzed, and different types of precipitation (rain and snow) were considered. The results showed that NCEP/NCAR is better able to demonstrate the intensity of the atmospheric system, while ERA5 is suitable for extracting parameter values at a specific point. ERA5 is also appropriate for analyzing snowfall events over the CS (snow cover and snow depth). Sea surface temperature plays the main role in generating instability over the CS, especially when cold air passes over it; however, the sea surface temperature of the NCEP/NCAR product has low resolution near the coast. Both datasets were able to detect the meteorological synoptic patterns that led to heavy rainfall over the CS, but due to their time lag they are not suitable for forecast centers; their application is in research and in the verification of meteorological models. Finally, ERA5 has better resolution than the NCEP/NCAR reanalysis data, but the NCEP/NCAR data are available from 1948 and are appropriate for long-term research.
Keywords: synoptic patterns, heavy precipitation, reanalysis data, snow
Procedia PDF Downloads 123
23749 Application of Observational Medical Outcomes Partnership-Common Data Model (OMOP-CDM) Database in Nursing Health Problems with Prostate Cancer-a Pilot Study
Authors: Hung Lin-Zin, Lai Mei-Yen
Abstract:
Prostate cancer is the most commonly diagnosed male cancer in the U.S., with a prevalence of around 1 in 8. The etiology of prostate cancer is still unknown, but some predisposing factors, such as age, black race, family history, and obesity, may increase the risk of the disease. In 2020, a total of 7,178 Taiwanese people were newly diagnosed with prostate cancer, accounting for 5.88% of all cancer cases, and the incidence rate ranked fifth among men. In that year, the total number of deaths from prostate cancer was 1,730, accounting for 3.45% of all cancer deaths; the death rate ranked 6th among men, accounting for 94.34% of deaths from cancers of the male reproductive organs. A search of the domestic and foreign literature on OMOP (Observational Medical Outcomes Partnership) database analysis shows that, although nearly a hundred papers have been published, studies of nursing-related health problems and nursing measures built on the OMOP common data model databases of medical institutions are extremely rare. The OMOP common data model analysis platform is a system developed by the FDA in 2007 that uses a common data model (CDM) to analyze and monitor healthcare data. It is important to build up relevant nursing information from the OMOP-CDM database to assist our daily practice. Therefore, we chose prostate cancer patients, a common group in our care, and used the OMOP-CDM database to explore their common associated health problems. With the assistance of OMOP-CDM database analysis, we can expect earlier diagnosis and prevention of comorbidities in prostate cancer patients to improve patient care.
Keywords: OMOP, nursing diagnosis, health problem, prostate cancer
Procedia PDF Downloads 69
23748 Investigation of Learning Challenges in Building Measurement Unit
Authors: Argaw T. Gurmu, Muhammad N. Mahmood
Abstract:
The objective of this research is to identify architecture and construction management students' learning challenges in the building measurement unit. The research used survey data collected from students who completed the building measurement unit. NVivo qualitative data analysis software was used to identify relevant themes. The analysis of the qualitative data revealed major learning difficulties such as the inadequacy of practice questions for the examination, inability to work as a team, lack of detailed understanding of the prerequisite units, insufficient time allocated for tutorials, and incompatibility of lecture and tutorial schedules. The output of this research can be used as a basis for improving teaching and learning activities in construction measurement units.
Keywords: building measurement, construction management, learning challenges, survey evaluation
Procedia PDF Downloads 139
23747 Using Data-Driven Model on Online Customer Journey
Authors: Ing-Jen Hung, Tzu-Chien Wang
Abstract:
Nowadays, customers can easily interact with firms through miscellaneous online ads on different channels. In other words, customers now have innumerable options and limitless time to carry out their commercial activities with firms, individualizing their own online customer journeys. This convenience emphasizes the importance of online advertisement allocation across channels, and a profound understanding of customer behavior can yield considerable benefit by optimizing fund allocation over diverse ad channels. To achieve this objective, many firms use numerical methods to create data-driven advertisement policies. In our research, we exploit online customer click data to discover the correlations between channels and their sequential relations. We use an LSTM to handle the sequential property of our data and compare its accuracy with non-sequential methods such as a CART decision tree and logistic regression. Besides this, we classify our customers into several groups by their behavioral characteristics to perceive the differences between groups as customer portraits. As a result, we discover a distinct customer journey under each customer portrait. Our article provides insights for marketing research and can help firms formulate online advertising criteria.
Keywords: LSTM, customer journey, marketing, channel ads
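A minimal sketch of an LSTM classifier over click-channel sequences using TensorFlow/Keras (the data, labels, and hyperparameters below are synthetic stand-ins; the paper's feature set is not public):

```python
import numpy as np
import tensorflow as tf

n_channels, seq_len = 8, 12          # e.g. 8 ad channels, 12 touchpoints per journey
rng = np.random.default_rng(0)
X = rng.integers(1, n_channels + 1, size=(1000, seq_len))   # 0 reserved for padding
y = rng.integers(0, 2, size=1000)                           # toy conversion label

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(n_channels + 1, 16, mask_zero=True),  # channel IDs -> vectors
    tf.keras.layers.LSTM(32),                                       # sequence summary
    tf.keras.layers.Dense(1, activation="sigmoid"),                 # conversion probability
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=2, batch_size=64, verbose=0)
print(model.predict(X[:3], verbose=0).round(3))
```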
Procedia PDF Downloads 121
23746 Study for Utilization of Industrial Solid Waste, Generated by the Discharge of Casting Sand Agglomeration with Clay, Blast Furnace Slag and Sugar Cane Bagasse Ash in Concrete Composition
Authors: Mario Sergio de Andrade Zago, Javier Mazariegos Pablos, Eduvaldo Paulo Sichieri
Abstract:
This research project studied the technical feasibility of recycling the industrial solid waste generated by the discharge of casting sand agglomerated with clay, blast furnace slag, and sugar cane bagasse ash. The proposed methodology initially establishes a process of solid waste encapsulation using the solidification/stabilization technique on Portland cement matrices, in which the residues act as small and large aggregates in the concrete composition; it then presents the possibility of using this concrete to manufacture concrete pieces (blocks) for paving. The results achieved the stated objective with great success for the manufacture of concrete pieces (blocks) for paving urban roads wherever special vehicle traffic or other demands produce accentuated abrasion effects (surpassing the 50 MPa required by the regulation). This proves the technical practicability of using the waste from casting sand agglomerated with clay and blast furnace slag studied here, unlocking usage possibilities in construction.
Keywords: industrial solid waste, solidification/stabilization, Portland cement, reuse, sugar cane bagasse ash, concrete
Procedia PDF Downloads 302
23745 Valorization of Banana Peels for Mercury Removal in Environmental Realist Conditions
Authors: E. Fabre, C. Vale, E. Pereira, C. M. Silva
Abstract:
Introduction: Mercury is one of the most troublesome toxic metals responsible for the contamination of aquatic systems, due to its accumulation and biomagnification along the food chain. The 2030 Agenda for Sustainable Development of the United Nations promotes improving water quality by reducing water pollution and encourages enhanced wastewater treatment, recycling, and safe water reuse globally. Sorption processes are widely used in wastewater treatment due to their many advantages, such as high efficiency and low operational costs. In these processes, the target contaminant is removed from solution by a solid sorbent; the more selective and low-cost the biosorbent, the more attractive the process becomes. Agricultural wastes are especially attractive for sorption: they are widely available, have no commercial value, and require little or no processing. In this work, banana peels were tested for mercury removal from low-concentration solutions. In order to investigate the applicability of this solid, six water matrices were used, increasing in complexity from natural waters to a real wastewater. Kinetics and equilibrium studies were also performed using the best-known models to evaluate the viability of the process. In line with the concept of circular economy, this study adds value to this by-product and contributes to liquid waste management. Experimental: The solutions were prepared with an Hg(II) initial concentration of 50 µg L-1 in natural waters, at 22 ± 1 °C, pH 6, magnetically stirred at 650 rpm, with a biosorbent mass of 0.5 g L-1. NaCl was added to obtain the salt solutions; seawater was collected from the Portuguese coast, and the real wastewater was kindly provided by ISQ - Instituto de Soldadura e Qualidade (Welding and Quality Institute) and diluted to the same concentration of 50 µg L-1. Banana peels were freeze-dried, milled, and sieved, and the particles < 1 mm were used. Results: Banana peels removed more than 90% of the Hg(II) from all the synthetic solutions studied; in these cases, the increase in the complexity of the water type promoted higher mercury removal. In salt waters, the biosorbent showed removals of 96%, 95%, and 98% for 3, 15, and 30 g L-1 of NaCl, respectively, and the residual concentration of Hg(II) in solution reached the level of the drinking water regulation (1 µg L-1). For the real matrices, the lower Hg(II) elimination (93% for seawater and 81% for the real wastewater) can be explained by the competition between the Hg(II) ions and the other elements present in these solutions for the sorption sites. Regarding the equilibrium study, the experimental data are best described by the Freundlich isotherm (R² = 0.991), and the Elovich equation provided the best fit to the kinetic points. Conclusions: The results exhibited the great ability of banana peels to remove mercury. The environmentally realistic conditions studied in this work highlight their potential use as biosorbents in water remediation processes.
Keywords: banana peels, mercury removal, sorption, water treatment
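The Freundlich fit reported above can be reproduced in outline as follows (the isotherm is q = K_F * C_e^(1/n); the data points below are invented, not the paper's measurements):

```python
import numpy as np
from scipy.optimize import curve_fit

def freundlich(c_eq, k_f, n):
    """Freundlich isotherm: q = K_F * C_e^(1/n)."""
    return k_f * c_eq ** (1.0 / n)

# Illustrative equilibrium data (C_e in ug/L, q in ug/g), not the paper's values.
c_eq = np.array([1.0, 5.0, 10.0, 20.0, 40.0])
q = np.array([2.1, 5.3, 7.9, 11.8, 17.5])

(k_f, n), _ = curve_fit(freundlich, c_eq, q, p0=[1.0, 1.0])
residuals = q - freundlich(c_eq, k_f, n)
r2 = 1 - np.sum(residuals**2) / np.sum((q - q.mean()) ** 2)
print(f"K_F={k_f:.2f}, n={n:.2f}, R^2={r2:.3f}")
```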
Procedia PDF Downloads 155
23744 Energy Efficient Plant Design Approaches: Case Study of the Sample Building of the Energy Efficiency Training Facilities
Authors: Idil Kanter Otcu
Abstract:
Nowadays, due to growing problems of energy supply and the drastic reduction of natural non-renewable resources, new applications in the energy sector and steps toward greater efficiency in energy consumption are required. Since buildings account for a large share of energy consumption, increasing the structural density of buildings causes an increase in energy consumption. This increase means that energy-efficient approaches to building design and the integration of new systems using emerging technologies become necessary in order to curb consumption. As new systems for the productive use of generated energy are developed, buildings that require less energy to operate, with rational use of resources, also need to be developed. One way to reduce the energy requirements of buildings is through landscape planning, design, and application. Requirements such as heating, cooling, and lighting can be met with lower energy consumption through planting design, which can help achieve a more efficient and rational use of resources. In this context, rather than a planting design that considers only the ecological and aesthetic features of plants, these considerations should also extend to spatial organization, whereby the relationship between the site and open spaces in the context of climatic elements and planting design is taken into account. In this way, the planting design can serve an additional purpose. In this study, a landscape design that takes into consideration location, local climate morphology, and solar angle is illustrated on a sample building project.
Keywords: energy efficiency, landscape design, plant design, xeriscape landscape
Procedia PDF Downloads 261
23743 A Secure Proxy Signature Scheme with Fault Tolerance Based on RSA System
Authors: H. El-Kamchouchi, Heba Gaber, Fatma Ahmed, Dalia H. El-Kamchouchi
Abstract:
Due to the rapid growth of modern communication systems, fault tolerance and data security are two important issues in secure transactions. During the transmission of data between the sender and receiver, errors may occur frequently; the sender must then re-transmit the data to the receiver in order to correct these errors, which weakens the system. To improve the scalability of the scheme, we present a secure proxy signature scheme with fault tolerance built over an efficient and secure authenticated key agreement protocol based on the RSA cryptosystem. Authenticated key agreement protocols play an important role in building secure communications between two parties.
Keywords: proxy signature, fault tolerance, RSA, key agreement protocol
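For orientation, the underlying RSA sign/verify primitive that such a scheme builds on looks as follows (textbook form with a toy key and no padding, for illustration only; the paper's proxy delegation and fault-tolerance layers are not shown):

```python
import hashlib

# Textbook RSA signature: tiny toy key, no padding scheme.
# Real deployments need large keys and padding such as RSA-PSS.
p, q, e = 61, 53, 17
n = p * q                           # public modulus (3233)
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+)

def sign(msg: bytes) -> int:
    h = int.from_bytes(hashlib.sha256(msg).digest(), "big") % n
    return pow(h, d, n)

def verify(msg: bytes, sig: int) -> bool:
    h = int.from_bytes(hashlib.sha256(msg).digest(), "big") % n
    return pow(sig, e, n) == h

s = sign(b"delegate to proxy")
assert verify(b"delegate to proxy", s)
assert not verify(b"tampered", s)
```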
Procedia PDF Downloads 286
23742 Estimating the Receiver Operating Characteristic Curve from Clustered Data and Case-Control Studies
Authors: Yalda Zarnegarnia, Shari Messinger
Abstract:
Receiver operating characteristic (ROC) curves have been widely used in medical research to illustrate the performance of a biomarker in correctly distinguishing diseased and non-diseased groups. Correlated biomarker data arise in study designs that include subjects who share genetic or environmental factors. Information about correlation might help to identify family members at increased risk of disease development and may lead to initiating treatment to slow or stop the progression to disease. Approaches appropriate to a case-control design matched by family identification must be able to accommodate the correlation inherent in the design when estimating the biomarker's ability to differentiate between cases and controls, as well as to handle estimation from a matched case-control design. This talk will review methods developed for ROC curve estimation in settings with correlated data from case-control designs and will discuss the limitations of current methods for analyzing correlated familial paired data. An alternative approach using conditional ROC curves will be demonstrated to provide appropriate ROC curves for correlated paired data. The proposed approach will use the information about the correlation among biomarker values, producing conditional ROC curves that evaluate the ability of a biomarker to discriminate between diseased and non-diseased subjects in a familial paired design.
Keywords: biomarker, correlation, familial paired design, ROC curve
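For reference, an empirical (unconditional) ROC curve from biomarker scores can be computed as below; this ignores the familial correlation that the conditional approach addresses, and the data are synthetic:

```python
import numpy as np

def roc_points(scores, labels):
    """Empirical ROC curve: (FPR, TPR) at every threshold, scores descending."""
    order = np.argsort(-scores)
    labels = labels[order]
    tpr = np.cumsum(labels) / labels.sum()           # true positive rate
    fpr = np.cumsum(1 - labels) / (1 - labels).sum() # false positive rate
    return np.concatenate([[0.0], fpr]), np.concatenate([[0.0], tpr])

rng = np.random.default_rng(2)
scores = np.concatenate([rng.normal(1, 1, 100), rng.normal(0, 1, 100)])
labels = np.concatenate([np.ones(100), np.zeros(100)]).astype(int)
fpr, tpr = roc_points(scores, labels)
auc = np.trapz(tpr, fpr)   # area under the empirical curve, ~0.76 here
print(round(auc, 3))
```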
Procedia PDF Downloads 240
23741 An Evaluation of Neural Network Efficacies for Image Recognition on Edge-AI Computer Vision Platform
Abstract:
Image recognition, one of the most critical technologies in computer vision, helps machine-like robotics understand a scene and, if deployed appropriately, will trigger a revolution in remote sensing and industrial automation. With the development of AI technologies, many prevailing and sophisticated neural networks have been developed for image recognition. However, the computer vision platforms, i.e., the hardware supporting neural networks for image recognition, are as crucial as the neural network technologies themselves and need to be addressed more thoroughly as research subjects, because different platforms are decisive in leveraging the performance of different neural networks for recognition. In this paper, three different computer vision platforms, namely a Jetson Nano (with 4 GB), a standalone laptop (with an RTX 3000-series GPU, using CUDA), and Google Colab (web-based, using a GPU), are explored, and four prominent neural network architectures (AlexNet, VGG(16/19), GoogleNet, and ResNet(18/34/50)) are investigated. For each pairing of computer vision platform and neural network, performance is evaluated in terms of recognition accuracy and time efficiency. In a case study using public ImageNet data, our findings provide a nuanced perspective on optimizing image recognition tasks across Edge-AI platforms, offering guidance on selecting appropriate neural network structures to maximize performance under hardware constraints.
Keywords: AlexNet, VGG, GoogleNet, ResNet, Jetson Nano, CUDA, COCO-NET, CIFAR-10, ImageNet Large Scale Visual Recognition Challenge (ILSVRC), Google Colab
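A sketch of the kind of per-platform timing comparison described (PyTorch/torchvision with untrained weights, so timing only; the model list and iteration count are arbitrary):

```python
import time
import torch
from torchvision import models

device = "cuda" if torch.cuda.is_available() else "cpu"
nets = {"alexnet": models.alexnet, "vgg16": models.vgg16,
        "googlenet": models.googlenet, "resnet18": models.resnet18}

x = torch.randn(1, 3, 224, 224, device=device)
for name, ctor in nets.items():
    net = ctor(weights=None).to(device).eval()   # untrained weights: timing only
    with torch.no_grad():
        net(x)                                   # warm-up pass
        if device == "cuda":
            torch.cuda.synchronize()             # flush queued GPU work
        t0 = time.perf_counter()
        for _ in range(20):
            net(x)
        if device == "cuda":
            torch.cuda.synchronize()
    print(f"{name}: {(time.perf_counter() - t0) / 20 * 1000:.1f} ms/inference")
```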
Procedia PDF Downloads 90
23740 Code-Switching among Local UCSI Stem and N-Stem Undergraduates during Knowledge Sharing
Authors: Adeela Abu Bakar, Minder Kaur, Parthaman Singh
Abstract:
In the Malaysian education system, formal English language learning takes place in a content-based classroom (CBC). Until recently, few studies in Malaysia have researched the effects of code-switching (CS) behaviour on students' knowledge sharing (KS) with their peers. The aim of this study is to investigate the frequency of, reasons for, and effect that CS, from English to Bahasa Melayu, has among local STEM and N-STEM undergraduates on KS in a content-based classroom. The study employs a mixed-method research design with a questionnaire and interviews as the instruments. The data were collected through the distribution of questionnaires and interviews with the undergraduates. The quantitative data were analysed using SPSS as simple frequencies and percentages, whereas the qualitative data were organized into themes, followed by analysis. The findings show that N-STEM undergraduates code-switch more than STEM undergraduates. In addition, both STEM and N-STEM undergraduates agree that CS acts as a catalyst for KS in a content-based classroom; however, they also acknowledge that excessive use of CS can hinder KS. The findings of the study can provide STEM and N-STEM undergraduates, education policymakers, language teachers, university educators, and students with significant insights into the role of CS in KS in a content-based classroom. Recommendations for future studies include increasing the number of participants and including observation in the data collection.
Keywords: code-switching, content-based classroom, content and language integrated learning, knowledge sharing, STEM and N-STEM undergraduates
Procedia PDF Downloads 135
23739 The Effect of Language and Literature Integration on the Teaching of English Vocabulary and Grammar in Secondary Schools in Zamfara State, Nigeria
Authors: Umar Bello
Abstract:
Literature has become an invaluable subject that adds great value to the teaching of the English language and to the development of many other ideas. Literature produces an exhilarating impulse that imprints a lasting picture on the mind of a learner. Many researchers have devised various approaches to language teaching, but these have remained unconvincing because they have produced only modest results. Devising a method that eliminates monotony and boredom for learners is a good factor in enhancing students' motivation to learn. In this sense, literature and language become unavoidable components that aid intellectual development. This study examines the indispensability of literature as a means of teaching English to secondary school classes. The researcher developed many instructive activities that are believed to help students improve their grammar and vocabulary, and used a quasi-experimental approach with an experimental group and a control group to find out how literature enhances the students' grammar as well as their vocabulary. The findings, based on simple percentages, revealed that the experimental group performed better than the control group. The results make it clear that literature leads learners to pay more attention and develop more interest in their studies. By fostering perspicacious linguistic development, literature therefore remains an essential tool for the language teaching classroom, enhancing learners' grammatical and vocabulary usage.
Keywords: teaching vocabulary, integration, poetry, classroom
Procedia PDF Downloads 104
23738 Fuzzy Multi-Component DEA with Shared and Undesirable Fuzzy Resources
Authors: Jolly Puri, Shiv Prasad Yadav
Abstract:
Multi-component data envelopment analysis (MC-DEA) is a popular technique for measuring the aggregate performance of decision making units (DMUs) along with their components. However, conventional MC-DEA is limited to crisp input and output data, which may not always be available in exact form; in real-life problems, data may be imprecise or fuzzy. Therefore, in this paper, we propose (i) a fuzzy MC-DEA (FMC-DEA) model in which shared and undesirable fuzzy resources are incorporated, (ii) a transformation of the proposed FMC-DEA model into a pair of crisp models using the α-cut approach, (iii) definitions of the fuzzy aggregate performance of a DMU and the fuzzy efficiencies of components as fuzzy numbers, and (iv) a numerical example to validate the proposed approach.
Keywords: multi-component DEA, fuzzy multi-component DEA, fuzzy resources, decision making units (DMUs)
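The α-cut step can be illustrated for triangular fuzzy data (the fuzzy number below is invented): each cut yields an interval, and the fuzzy model is then solved as a pair of crisp programs at the interval endpoints.

```python
def alpha_cut(tfn, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (a, b, c) at level alpha:
    lo = a + alpha*(b - a), hi = c - alpha*(c - b)."""
    a, b, c = tfn
    return (a + alpha * (b - a), c - alpha * (c - b))

# A fuzzy input of a DMU, e.g. "about 10": the lower efficiency bound uses the
# worst case of the interval, the upper bound the best case.
print(alpha_cut((8.0, 10.0, 13.0), alpha=0.5))   # -> (9.0, 11.5)
```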
Procedia PDF Downloads 407
23737 A Computational Cost-Effective Clustering Algorithm in Multidimensional Space Using the Manhattan Metric: Application to the Global Terrorism Database
Authors: Semeh Ben Salem, Sami Naouali, Moetez Sallami
Abstract:
The increasing amount of collected data has limited the performance of current analysis algorithms. Thus, developing new cost-effective algorithms in terms of complexity, scalability, and accuracy has raised significant interest. In this paper, a modified, effective k-means-based algorithm is developed and tested experimentally. The new algorithm aims to reduce the computational load without significantly affecting the quality of the clusterings. The algorithm uses the city block (Manhattan) distance and a new stop criterion to guarantee convergence. Experiments conducted on a real data set show its high performance compared with the original k-means version.
Keywords: pattern recognition, Global Terrorism Database, Manhattan distance, k-means clustering, terrorism data analysis
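A sketch of a k-means variant using the city block metric (with the per-coordinate median as the L1-optimal centre and a centre-shift threshold standing in for the paper's stop criterion; the data are toy):

```python
import numpy as np

def kmeans_manhattan(X, k, tol=1e-4, max_iter=100, seed=0):
    """k-means variant: city-block (Manhattan) assignment, per-coordinate
    median update (the L1-optimal centre); stops when the total centre
    shift falls below `tol` (a stand-in stop criterion)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    labels = np.zeros(len(X), dtype=int)
    for _ in range(max_iter):
        d = np.abs(X[:, None, :] - centers[None, :, :]).sum(axis=2)  # L1 distances
        labels = d.argmin(axis=1)
        new = np.array([np.median(X[labels == j], axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])          # keep empty clusters
        if np.abs(new - centers).sum() < tol:
            return new, labels
        centers = new
    return centers, labels

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(m, 0.3, (50, 2)) for m in (0.0, 3.0, 6.0)])
centers, labels = kmeans_manhattan(X, k=3)
print(np.round(centers, 2))
```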
Procedia PDF Downloads 386
23736 AniMoveMineR: Animal Behavior Exploratory Analysis Using Association Rules Mining
Authors: Suelane Garcia Fontes, Silvio Luiz Stanzani, Pedro L. Pizzigatti Corrêa, Ronaldo G. Morato
Abstract:
Environmental changes and major natural disasters are increasingly prevalent in the world due to the damage that humanity has caused to nature, and this damage directly affects the lives of animals. Thus, the study of animal behavior and animals' interactions with the environment can provide knowledge that guides researchers and public agencies in preservation and conservation actions. Exploratory analysis of animal movement can determine patterns of animal behavior, and technological advances in animal tracking have consequently expanded behavioral studies. There is much research on animal movement and behavior, but we note the absence of a proposal that combines resources, allows exploratory analysis of animal movement, and provides statistical measures of individual animal behavior and its interaction with the environment. The contribution of this paper is the framework AniMoveMineR, a unified solution that aggregates trajectory analysis and data mining techniques to explore animal movement data and provide a first step toward answering questions about individual animal behavior and interactions with other animals over time and space. We evaluated the framework using data from monitored jaguars in Miranda, in the Brazilian Pantanal, in order to verify whether AniMoveMineR allows the level of interaction between these jaguars to be identified. The results were positive and provided indications about the individual behavior of the jaguars and about which jaguars have the highest or lowest correlation.
Keywords: data mining, data science, trajectory, animal behavior
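A sketch of the association-rule step on discretised movement events, using the mlxtend library (the item columns, data, and thresholds below are invented):

```python
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

# Each row: one time window; columns: discretised movement/interaction events.
df = pd.DataFrame(
    [[1, 1, 0, 1], [1, 1, 0, 0], [0, 1, 1, 1], [1, 1, 0, 1], [0, 0, 1, 1]],
    columns=["jaguarA_present", "jaguarB_present", "near_river", "night"],
).astype(bool)

frequent = apriori(df, min_support=0.4, use_colnames=True)   # frequent itemsets
rules = association_rules(frequent, metric="confidence", min_threshold=0.7)
print(rules[["antecedents", "consequents", "support", "confidence", "lift"]])
```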
Procedia PDF Downloads 144
23735 Thiosulfate Leaching of the Auriferous Ore from Castromil Deposit: A Case Study
Authors: Rui Sousa, Aurora Futuro, António Fiúza
Abstract:
The exploitation of gold ore deposits is highly dependent on efficient mineral processing methods, although current perspectives based on life-cycle assessment introduce difficulties that were unforeseen until very recently. Cyanidation is the most widely applied gold processing method, but the potential environmental problems arising from the use of cyanide as a leaching reagent have led to a demand for alternative methods. Ammoniacal thiosulfate leaching is one of the most important alternatives to cyanidation. This article presents experimental studies carried out to assess the feasibility of thiosulfate as a leaching agent for the ore from the unexploited Portuguese gold mine of Castromil. It became clear that the process depends on the concentrations of ammonia, thiosulfate, and copper. Based on this, leaching tests were performed to determine the best reagent prescription and the effects of different combinations of these concentrations: higher thiosulfate concentrations cause a decrease in gold dissolution; lower ammonia concentrations require higher thiosulfate concentrations, and higher ammonia concentrations require lower thiosulfate concentrations; and the addition of copper increases the gold dissolution ratio. Subsequently, alternative operating conditions were tested, such as variations in temperature and in the solid/liquid ratio, as well as the application of a pre-treatment before the leaching stage. Finally, thiosulfate leaching was compared to cyanidation. Thiosulfate leaching proved to be an important alternative, although a pre-treatment is required to increase the yield of gold dissolution.
Keywords: gold, leaching, pre-treatment, thiosulfate
Procedia PDF Downloads 311
23734 A Study on Using Network Coding for Packet Transmissions in Wireless Sensor Networks
Authors: Rei-Heng Cheng, Wen-Pinn Fang
Abstract:
A wireless sensor network (WSN) is composed of a large number of sensors and one or a few base stations, where the sensors are responsible for detecting specific event information, which is sent back to the base station(s). However, saving energy to extend the network lifetime is a problem that cannot be ignored in wireless sensor networks. Since the sensor network is used to monitor a region or specific events, reliably sending the information back to the base station is surely important. The network coding technique is often used to enhance the reliability of network transmission: when a node needs to send out M data packets, it encodes these data with redundant data and sends out a total of M + R packets. If the receiver can get any M packets out of these M + R packets, it can decode and recover the original M data packets. Transmitting redundant packets, however, certainly results in excess energy consumption. This paper explores the relationship between the quality of wireless transmission and the number of redundant packets. Ideally, each sensor can overhear the nearby transmissions, learn the wireless transmission quality around it, and dynamically determine the number of redundant packets used in network coding.
Keywords: energy consumption, network coding, transmission reliability, wireless sensor networks
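Assuming independent packet losses, the reliability/redundancy trade-off described here is binomial: decoding succeeds when at least M of the M + R packets arrive. A sketch for choosing R given an observed loss rate:

```python
from math import comb

def decode_probability(m, r, p_loss):
    """Probability that at least m of m + r coded packets arrive when each
    packet is lost independently with probability p_loss."""
    n = m + r
    return sum(comb(n, k) * (1 - p_loss) ** k * p_loss ** (n - k)
               for k in range(m, n + 1))

def redundancy_needed(m, p_loss, target=0.99):
    """Smallest r giving at least `target` decoding probability."""
    r = 0
    while decode_probability(m, r, p_loss) < target:
        r += 1
    return r

print(redundancy_needed(m=10, p_loss=0.1, target=0.99))
```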
Procedia PDF Downloads 391
23733 Pattern the Location and Area of Earth-Dumping Stations from Vehicle GPS Data in Taiwan
Authors: Chun-Yuan Chen, Ming-Chang Li, Xiu-Hui Wen, Yi-Ching Tu
Abstract:
This study explores GPS (Global Positioning System) data applied to tracing construction vehicles such as trucks or cranes, in order to help pattern the earth-dumping stations of traffic construction in Taiwan. Traffic construction in this research is defined as the engineering of high-speed railways, expressways, and similar works extending over kilometers in length. Auditing the locations of earth-dumping stations and checking their compliance with regulations are among the important tasks of the Taiwan EPA; earth-dumping stations are known as one source of particulate matter air pollution during the construction process. Because GPS data can be analyzed quickly and used conveniently, this study tried to find dumping stations by modeling vehicle tracks from GPS data collected during the construction work cycle. The GPS data were obtained from 13 vehicles involved in an expressway construction project in central Taiwan. The GPS footprints were exported to Keyhole Markup Language (KML) files so that the tracks of the trucks could be patterned by computer applications; the data were collected over about eight months, from February to October 2017. The GPS footprints identified dumping stations and outlined the earthwork areas, and the results were passed to the Taiwan EPA for on-site inspection. The Taiwan EPA then issued advisory comments to the agency in charge of the construction to prevent air pollution. Compared with the common method of environmental inspection by manual collection, the GPS-with-KML patterning and modeling method consumes less time. On the other hand, monitoring the GPS data from construction vehicles can be useful for administrators in developing and implementing environmental management strategies.
Keywords: automatic management, earth-dumping station, environmental management, Global Positioning System (GPS), particulate matter, traffic construction
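One simple way to pattern candidate dumping stations from such GPS tracks is dwell detection, flagging long stays within a small radius; the thresholds and coordinates below are invented:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 6371000 * 2 * asin(sqrt(a))

def dwell_points(track, radius_m=100, min_dwell_s=600):
    """Flag spans where a truck stays within `radius_m` of a point for at
    least `min_dwell_s` seconds: candidate loading/dumping locations."""
    stops, i = [], 0
    while i < len(track):
        j = i
        while (j + 1 < len(track) and
               haversine_m(track[i][1], track[i][2],
                           track[j + 1][1], track[j + 1][2]) <= radius_m):
            j += 1
        if track[j][0] - track[i][0] >= min_dwell_s:
            stops.append(track[i])
        i = j + 1
    return stops

# track rows: (unix_time_s, lat, lon)
track = [(0, 24.15, 120.68), (300, 24.1501, 120.6801), (900, 24.1502, 120.6799),
         (1200, 24.20, 120.70)]
print(dwell_points(track))   # flags the ~15-minute stay at the first location
```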
Procedia PDF Downloads 164
23732 Designing and Implementing a Tourist-Guide Web Service Based on Volunteer Geographic Information Using Open-Source Technologies
Authors: Javad Sadidi, Ehsan Babaei, Hani Rezayan
Abstract:
The advent of Web 2.0 makes it possible to scale down the costs of data collection and mapping, specifically when the process is done by volunteers: every volunteer can be thought of as a free and ubiquitous sensor collecting spatial, descriptive, and multimedia data for tourist services. The lack of large-scale information, such as real-time climate and weather conditions, population density, and other related data, can be considered one of the important challenges for tourists in developing countries when making the best decision about the time and place of travel. The current research aims to design and implement a spatiotemporal web map service using volunteer-submitted data. The service acts as a tourist guide in which tourists can search places of interest based on their requested travel time. The service was designed on a three-tier architecture comprising data, logical processing, and presentation tiers. For the implementation, open-source software, client- and server-side technologies (such as OpenLayers2, AJAX, and PHP), GeoServer as the map server, and the Web Feature Service (WFS) standard were used. The result is two distinct browser-based services: one for submitting spatial, descriptive, and multimedia volunteer data, and another for tourists and local officials, where a local official confirms the veracity of the volunteer-submitted information. In the tourist interface, a spatiotemporal search engine enables tourists to find a tourist place based on province, city, and location at a specific time of interest. Implementing the tourist-guide service in this way means that current tourists participate in a free data collection and sharing process for future tourists; it enables real-time data sharing and access for all, avoids the blind selection of travel destinations and, significantly, decreases the cost of providing such services.
Keywords: VGI, tourism, spatiotemporal, browser-based, web mapping
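A sketch of the kind of WFS GetFeature request the tourist client would issue against GeoServer (the endpoint, layer name, and bounding box below are hypothetical; the parameters follow the standard OGC WFS 1.1.0 key-value encoding):

```python
import requests

WFS_URL = "http://example.org/geoserver/wfs"   # hypothetical GeoServer endpoint
params = {
    "service": "WFS",
    "version": "1.1.0",
    "request": "GetFeature",
    "typeName": "tourism:places",              # hypothetical layer of volunteer data
    "outputFormat": "application/json",
    "bbox": "50.0,35.5,51.5,36.5",             # spatial window of interest
}
features = requests.get(WFS_URL, params=params, timeout=30).json()["features"]
for f in features[:5]:
    print(f["properties"].get("name"), f["geometry"]["coordinates"])
```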
Procedia PDF Downloads 98
23731 Effect of Bank Specific and Macro Economic Factors on Credit Risk of Islamic Banks in Pakistan
Authors: Mati Ullah, Shams Ur Rahman
Abstract:
The purpose of this research is to investigate the effect of macroeconomic and bank-specific factors on credit risk in Islamic banking in Pakistan. The future of financial institutions largely depends on how well they manage risks, and credit risk is an important type of risk affecting the banking sector. The study uses quarterly data for a period of 6 years, from 1 July 2014 to 30 June 2020. The data set consists of secondary data extracted from the websites of the State Bank and the World Bank and from the financial statements of the banks concerned. An ordinary least squares (OLS) model was used for the analysis. The results support the hypothesis that macroeconomic and bank-specific factors have a significant effect on credit risk. Among the macroeconomic variables, inflation and the exchange rate have significant positive effects on credit risk, whereas gross domestic product has a significant negative relationship with credit risk, and the corporate rate has no significant relation to it. Among the bank-specific variables, size, management efficiency, net profit share income, and capital adequacy are shown to influence credit risk positively and significantly, while the loan-to-deposit ratio has a negative, insignificant relationship with credit risk. The contribution of this article is that conclusions similar to prior work are established regarding the influence of banking factors on credit risk.
Keywords: credit risk, Islamic banks, macroeconomic variables, bank-specific variables
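A sketch of the OLS estimation step with statsmodels (the data below are a synthetic stand-in; the study's actual variables and measurements are not reproduced):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Toy quarterly series standing in for the study's variables.
rng = np.random.default_rng(3)
df = pd.DataFrame({
    "inflation": rng.normal(8, 2, 24),
    "exchange_rate": rng.normal(150, 10, 24),
    "gdp_growth": rng.normal(4, 1, 24),
    "bank_size": rng.normal(12, 0.5, 24),
})
df["credit_risk"] = (0.3 * df.inflation + 0.02 * df.exchange_rate
                     - 0.5 * df.gdp_growth + rng.normal(0, 1, 24))

X = sm.add_constant(df[["inflation", "exchange_rate", "gdp_growth", "bank_size"]])
model = sm.OLS(df["credit_risk"], X).fit()
print(model.summary().tables[1])   # coefficients, t-statistics, p-values
```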
Procedia PDF Downloads 19