Search results for: data mining technique
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9811

9001 A Block Cipher for Resource-Constrained IoT Devices

Authors: Muhammad Rana, Quazi Mamun, Rafiqul Islam

Abstract:

In the Internet of Things (IoT), many devices are connected and accumulate a vast amount of data. These Internet-driven raw data need to be transferred securely to the end-users via dependable networks. Consequently, the challenges of IoT security in various IoT domains are paramount. Cryptography is applied to secure networks for authentication, confidentiality, data integrity and access control. However, due to the resource-constrained nature of IoT devices, conventional ciphers may not be suitable for all IoT networks. This paper presents a robust and effective lightweight cipher designed to secure the IoT environment while meeting the resource constraints of IoT devices. We propose a symmetric, block-cipher-based lightweight cryptographic algorithm. The proposed algorithm increases the complexity of the block cipher while maintaining the lowest possible computational requirements. It efficiently constructs the key register updating technique, reduces the number of encryption rounds, and adds a layer between the encryption and decryption processes.

Keywords: Internet of Things, IoT, cryptography, block cipher, S-box, key management, IoT security.

9000 A Problem in Microstretch Thermoelastic Diffusive Medium

Authors: Devinder Singh, Arbind Kumar, Rajneesh Kumar

Abstract:

The general solution of the equations for a homogeneous isotropic microstretch thermoelastic medium with mass diffusion for two-dimensional problems is obtained due to normal and tangential forces. The integral transform technique is used to obtain the components of displacement, microrotation, stress, temperature change and mass concentration. A particular case of interest is deduced from the present investigation.

Keywords: Normal and tangential forces, microstretch, thermoelastic, integral transform technique.

8999 LIFirr as an Indicator of Microbial Activity in Paraffinic Oil

Authors: M. P. Casiraghi, C. M. Quintella, P. Almeida

Abstract:

Paraffinic oils were submitted to microbial action. The microorganisms consisted of the bacteria Pseudomonas sp. and Bacillus licheniformis. The alterations in interfacial tension were determined using a tensometer and applying the hanging drop technique at room temperature (299 K). The alteration in the constitution of the paraffins was evaluated by means of gas chromatography. The microbial activity was observed to reduce interfacial tension by 54 to 78%, as well as consuming the paraffins C19 to C29 and producing paraffins C36 to C44. The LIFirr technique made it possible to determine the microbial action quickly.

Keywords: Paraffins, biosurfactants, LIFirr.

8998 Strict Stability of Fuzzy Differential Equations with Impulse Effect

Authors: Sanjay K. Srivastava, Bhanu Gupta

Abstract:

In this paper, some results on strict stability have been extended to fuzzy differential equations with impulse effect using Lyapunov functions and the Razumikhin technique.

Keywords: Fuzzy differential equations, Impulsive differential equations, Strict stability, Lyapunov function, Razumikhin technique.

8997 A New Vector Quantization Front-End Process for Discrete HMM Speech Recognition System

Authors: M. Debyeche, J. P. Haton, A. Houacine

Abstract:

The paper presents a complete discrete statistical framework based on a novel vector quantization (VQ) front-end process. This new VQ approach performs an optimal distribution of VQ codebook components over HMM states. This technique, which we call distributed vector quantization (DVQ) of hidden Markov models, succeeds in unifying the acoustic micro-structure and the phonetic macro-structure when the HMM parameters are estimated. The DVQ technique is implemented through two variants. The first variant uses the K-means algorithm (K-means-DVQ) to optimize the VQ, while the second variant exploits the classification behavior of neural networks (NN-DVQ) for the same purpose. The proposed variants are compared with the HMM-based baseline system through experiments on the recognition of specific Arabic consonants. The results show that the distributed vector quantization technique increases the performance of the discrete HMM system.

Keywords: Hidden Markov Model, Vector Quantization, Neural Network, Speech Recognition, Arabic Language
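
To make the codebook-construction step concrete, here is a minimal Python sketch of building a VQ codebook with K-means and quantizing acoustic feature vectors into discrete symbols for a discrete HMM; the feature dimension and codebook size are arbitrary assumptions, and this is not the authors' full DVQ-HMM implementation.

```python
# Minimal sketch: K-means vector quantization of acoustic feature vectors.
# Illustrates only the codebook step, not the full DVQ-HMM of the paper;
# the feature dimension (13) and codebook size (64) are arbitrary assumptions.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
features = rng.normal(size=(5000, 13))       # stand-in for MFCC-like frames

kmeans = KMeans(n_clusters=64, n_init=10, random_state=0).fit(features)
codebook = kmeans.cluster_centers_           # VQ codebook (64 codewords)

new_frames = rng.normal(size=(10, 13))
symbols = kmeans.predict(new_frames)         # discrete symbols fed to a discrete HMM
print(symbols)
```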

8996 Lifelong Education for Teachers: A Tool for Achieving Effective Teaching and Learning in Secondary Schools in Benue State, Nigeria

Authors: P. I. Adzongo, O. A. Aloga

Abstract:

The purpose of the study was to examine lifelong education for teachers as a tool for achieving effective teaching and learning. Lifelong education enhances social inclusion, personal development, citizenship, employability, teaching and learning, the community and the nation. It is imperative that teachers update their knowledge regularly to perform optimally, since they occupy a major position in instilling desirable qualities in students; the challenges of lifelong education are also discussed. A descriptive survey design was adopted for the study. A simple random sampling technique was used to select 80 teachers as a sample from a population of 105 senior secondary school teachers in Makurdi Local Government Area of Benue State. A 20-item self-designed questionnaire, subjected to expert validation and reliability testing, was used to collect data. A reliability coefficient of 0.87 was established using Cronbach's alpha. Mean scores and standard deviations were used to answer the two research questions, while chi-square was used to analyse the data for the two null hypotheses, which state that lifelong education for teachers is not a significant tool for achieving effective teaching and that lifelong education for teachers does not significantly impact effective learning. The findings of the study revealed that lifelong education for teachers can be used as a tool for achieving effective teaching and learning, and the study recommended, among others, that government, organizations and individuals should collaborate to put lifelong education programmes for teachers on the priority list. The paper concluded that the strategic position of lifelong education for teachers towards enhanced teaching, learning and the production of quality manpower in society makes it imperative for all hands to be on deck to support the programme financially and otherwise.

Keywords: Lifelong Education, Tool, Effective Teaching and Learning.
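
As a small illustration of the reliability step reported above, the sketch below computes Cronbach's alpha from a respondents-by-items score matrix; the scores are synthetic stand-ins, not the study's questionnaire data.

```python
# Minimal sketch: Cronbach's alpha for a k-item questionnaire,
#   alpha = k/(k-1) * (1 - sum of item variances / variance of total score).
# The scores below are synthetic stand-ins, not the study's data.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: respondents x items matrix of item scores."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

rng = np.random.default_rng(1)
base = rng.integers(1, 6, size=(80, 1))                       # shared trait -> correlated items
items = np.clip(base + rng.integers(-1, 2, size=(80, 20)), 1, 5)
print(round(cronbach_alpha(items.astype(float)), 2))
```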

8995 Designing Ontology-Based Knowledge Integration for Preprocessing of Medical Data in Enhancing a Machine Learning System for Coding Assignment of a Multi-Label Medical Text

Authors: Phanu Waraporn

Abstract:

This paper discusses the design of knowledge integration for clinical information extracted from distributed medical ontologies in order to improve a machine learning-based multi-label coding assignment system. The proposed approach is implemented using a decision tree machine learning technique on university hospital data for patients with Coronary Heart Disease (CHD). The preliminary results show that the use of medical ontologies improves the overall system performance.

Keywords: Medical Ontology, Knowledge Integration, Machine Learning, Medical Coding, Text Assignment.
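
A minimal sketch of the decision-tree stage, assuming synthetic multi-label data in place of the ontology-enriched clinical features; it trains one tree per label and is a generic illustration, not the authors' pipeline.

```python
# Minimal sketch: decision-tree-based multi-label coding assignment.
# Synthetic features stand in for ontology-enriched clinical text features.
import numpy as np
from sklearn.multioutput import MultiOutputClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.datasets import make_multilabel_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

X, Y = make_multilabel_classification(n_samples=500, n_features=40,
                                       n_classes=6, random_state=0)
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.3, random_state=0)

clf = MultiOutputClassifier(DecisionTreeClassifier(max_depth=8, random_state=0))
clf.fit(X_tr, Y_tr)
print("micro-F1:", round(f1_score(Y_te, clf.predict(X_te), average="micro"), 3))
```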

8994 Effects of Livestream Affordances on Consumer Purchase Willingness: Explicit IT Affordances Perspective

Authors: Isaac O. Asante, Yushi Jiang, Hailin Tao

Abstract:

Livestreaming marketing, a new element of electronic commerce, has become an optional marketing channel following the COVID-19 pandemic, and many sellers are leveraging the features offered by livestreaming to increase sales. This study was conducted to measure real-time observable interactions between consumers and sellers. Based on affordance theory, this study conceptualized constructs representing the interactive features and examined how they drive consumers’ purchase willingness during livestreaming sessions, using 1,238 observations from Amazon Live obtained through manual observation of transaction records. Within a structural equation modeling framework, ordinary least squares regression suggests that live viewers, new followers, live chats, and likes positively affect purchase willingness. The Sobel and Monte Carlo tests show that new followers, live chats, and likes significantly mediate the relationship between live viewers and purchase willingness. The study presents a way of measuring interactions in livestreaming commerce and proposes a way to manually gather data on consumer behaviors on livestreaming platforms when the application programming interface (API) of such platforms does not support data mining algorithms.

Keywords: Livestreaming marketing, live chats, live viewers, likes, new followers, purchase willingness.
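
To illustrate the mediation analysis, here is a minimal OLS-based Sobel test on synthetic data, with live viewers as the predictor, live chats as the mediator, and purchase willingness as the outcome; the data and variable names are illustrative assumptions, not the study's Amazon Live records.

```python
# Minimal sketch: OLS mediation with a Sobel test on synthetic data.
# X = live viewers, M = live chats (mediator), Y = purchase willingness.
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(0)
n = 1238
X = rng.normal(size=n)
M = 0.6 * X + rng.normal(scale=0.8, size=n)                      # mediator driven by X
Y = 0.5 * M + 0.2 * X + rng.normal(scale=1.0, size=n)

m_on_x = sm.OLS(M, sm.add_constant(X)).fit()                          # path a
y_on_mx = sm.OLS(Y, sm.add_constant(np.column_stack([M, X]))).fit()   # path b (controls X)

a, sa = m_on_x.params[1], m_on_x.bse[1]
b, sb = y_on_mx.params[1], y_on_mx.bse[1]
sobel_z = a * b / np.sqrt(b**2 * sa**2 + a**2 * sb**2)
p_value = 2 * (1 - stats.norm.cdf(abs(sobel_z)))
print(f"Sobel z = {sobel_z:.2f}, p = {p_value:.4f}")
```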

8993 Analyzing Factors Impacting COVID-19 Vaccination Rates

Authors: Dongseok Cho, Mitchell Driedger, Sera Han, Noman Khan, Mohammed Elmorsy, Mohamad El-Hajj

Abstract:

Since the approval of the COVID-19 vaccine in late 2020, vaccination rates have varied around the globe. Access to a vaccine supply, mandated vaccination policy, and vaccine hesitancy contribute to these rates. This study used COVID-19 vaccination data from Our World in Data and the Multilateral Leaders Task Force on COVID-19 to create two COVID-19 vaccination indices. The first index is the Vaccine Utilization Index (VUI), which measures how effectively each country has utilized its vaccine supply to doubly vaccinate its population. The second index is the Vaccination Acceleration Index (VAI), which evaluates how efficiently each country vaccinated its population within its first 150 days. Pearson correlations were computed between these indices and country indicators obtained from the World Bank. The results of these correlations show that countries with stronger health indicators, such as lower mortality rates, lower age-dependency ratios, and higher rates of immunization against other diseases, display higher VUI and VAI scores than countries with weaker indicators. VAI scores are also positively correlated with Governance and Economic indicators, such as regulatory quality, control of corruption, and GDP per capita. As represented by the VUI, proper utilization of the COVID-19 vaccine supply is observed in countries that display excellence in health practices. A country’s motivation to accelerate its vaccination rate within the first 150 days of vaccinating, as represented by the VAI, was largely a product of the governing body’s effectiveness and economic status, as well as overall excellence in health practices.

Keywords: Data mining, Pearson Correlation, COVID-19, vaccination rates, hesitancy.
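
A minimal sketch of the correlation step: Pearson correlations between a vaccination index and country indicators, using a tiny made-up table rather than the Our World in Data or World Bank values.

```python
# Minimal sketch: Pearson correlation between a vaccination index and
# country indicators. The small table below is made-up illustrative data.
import pandas as pd
from scipy.stats import pearsonr

df = pd.DataFrame({
    "VUI":            [0.91, 0.74, 0.55, 0.88, 0.40, 0.67],
    "mortality_rate": [5.1, 7.8, 9.5, 4.9, 11.2, 8.0],
    "gdp_per_capita": [52000, 31000, 9000, 47000, 4000, 18000],
})

for indicator in ["mortality_rate", "gdp_per_capita"]:
    r, p = pearsonr(df["VUI"], df[indicator])
    print(f"VUI vs {indicator}: r = {r:.2f}, p = {p:.3f}")
```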

8992 Coverage Probability Analysis of WiMAX Network under Additive White Gaussian Noise and Predicted Empirical Path Loss Model

Authors: Chaudhuri Manoj Kumar Swain, Susmita Das

Abstract:

This paper explores a detailed procedure for predicting a path loss (PL) model and its application to estimating the coverage probability in a WiMAX network. For this, a hybrid approach is followed in predicting an empirical PL model of a 2.65 GHz WiMAX network deployed in a suburban environment. Data collection, statistical analysis, and regression analysis are the phases of operation incorporated in this approach, and the importance of each phase is discussed. The procedure of collecting data such as the received signal strength indicator (RSSI) through an experimental setup is demonstrated. From the collected data set, empirical PL and RSSI models are predicted with the regression technique. Furthermore, with the aid of the predicted PL model, essential parameters such as the PL exponent and the coverage probability of the network are evaluated. This research work may assist significantly in the process of deployment and optimisation of any cellular network.

Keywords: WiMAX, RSSI, path loss, coverage probability, regression analysis.
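
As a sketch of the regression phase, the code below fits the log-distance path loss model PL(d) = PL(d0) + 10·n·log10(d/d0) to path loss samples to recover the PL exponent; the distances and losses are synthetic placeholders, not the measured WiMAX data.

```python
# Minimal sketch: fitting a log-distance path loss model
#   PL(d) = PL(d0) + 10 * n * log10(d / d0)
# to path loss samples by linear regression. The values are synthetic placeholders.
import numpy as np

d0 = 100.0                                   # reference distance (m), assumed
distance = np.array([100, 200, 400, 800, 1200, 1600, 2000], dtype=float)
path_loss = np.array([80.0, 89.5, 98.2, 108.9, 114.0, 118.7, 121.5])  # dB

x = 10.0 * np.log10(distance / d0)
n, pl_d0 = np.polyfit(x, path_loss, 1)       # slope = path loss exponent n
print(f"path loss exponent n = {n:.2f}, PL(d0) = {pl_d0:.1f} dB")
```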

8991 Does Practice Reflect Theory? An Exploratory Study of a Successful Knowledge Management System

Authors: Janet L. Kourik, Peter E. Maher

Abstract:

To investigate the correspondence of theory and practice, a successfully implemented Knowledge Management System (KMS) is explored through the lens of Alavi and Leidner's proposed KMS framework for the analysis of an information system in knowledge management (Framework-AISKM). The applied KMS was designed to manage curricular knowledge in a distributed university environment. The motivation for the KMS is discussed along with the types of knowledge necessary in an academic setting. Elements of the KMS involved in all phases of capturing and disseminating knowledge are described. As the KMS matures, the resulting data stores form the precursor to and the potential for knowledge mining. The findings from this exploratory study indicate substantial correspondence between the successful KMS and the theory-based framework, providing provisional confirmation for the framework while suggesting factors that contributed to the system's success. Avenues for future work are described.

Keywords: Applied KMS, education, knowledge management (KM), KM framework, knowledge management system (KMS).

8990 Analytical Studies on Volume Determination of Leg Ulcer using Structured Light and Laser Triangulation Data Acquisition Techniques

Authors: M. Abdul-Rani, K. K. Chong, A. F. M. Hani, Y. B. Yap, A. Jamil

Abstract:

Imaging is defined as the process of obtaining geometric images, either two-dimensional or three-dimensional, by scanning or digitizing existing objects or products. In this research, imaging is applied to retrieve 3D information of the human skin surface in a medical application. This research focuses on analyzing and determining the volume of leg ulcers using imaging devices. Volume determination is one of the important criteria in the clinical assessment of leg ulcers. The volume and size of the leg ulcer wound indicate whether the wound is responding to treatment, i.e. healing or worsening. Different imaging techniques are expected to give different results (and accuracies) in generating data and images. A midpoint projection algorithm was used to reconstruct the cavity into a solid model and compute the volume. Misinterpretation of the results can affect treatment efficacy. The objective of this paper is to compare the accuracy of two 3D data acquisition methods, laser triangulation and structured light. Using models with known volume, it was shown that the structured-light-based 3D technique produces better accuracy than the laser triangulation data acquisition method for leg ulcer volume determination.

Keywords: Imaging, Laser Triangulation, Structured Light, Volume Determination.

8989 An Exploration of Brand Storytelling in a Video Sharing Social Network

Authors: Charmaine du Plessis

Abstract:

The brand storytelling themes and emotional appeals of three major global brands were analysed by means of visual rhetoric in a digital environment focusing on the ethos communication technique. A well-known framework of five basic brand personality dimensions was used to delineate the analysis. Brand storytelling as a branding technique is becoming increasingly popular, especially since all brands can tell a story to connect and engage with consumers on an emotional level. Social media have changed the way in which brand stories are shared with online consumers, while social video networking sites in particular create an opportunity to share brand stories with a much greater target audience through electronic word of mouth (eWOM). The findings not only confirm three dimensions in the traditional brand personality framework, but can also serve as a heuristic tool for other researchers analyzing brand storytelling in a social video sharing network environment.

Keywords: Communication technique, visual rhetoric, social video sharing network, brand storytelling.

8988 Modeling Stress-Induced Regulatory Cascades with Artificial Neural Networks

Authors: Maria E. Manioudaki, Panayiota Poirazi

Abstract:

Yeast cells live in a constantly changing environment that requires the continuous adaptation of their genomic program in order to sustain their homeostasis, survive and proliferate. Due to the advancement of high throughput technologies, there is currently a large amount of data such as gene expression, gene deletion and protein-protein interactions for S. cerevisiae under various environmental conditions. Mining these datasets requires efficient computational methods capable of integrating different types of data, identifying inter-relations between different components and inferring functional groups or 'modules' that shape intracellular processes. This study uses computational methods to delineate some of the mechanisms used by yeast cells to respond to environmental changes. The GRAM algorithm is first used to integrate gene expression data and ChIP-chip data in order to find modules of co-expressed and co-regulated genes as well as the transcription factors (TFs) that regulate these modules. Since transcription factors are themselves transcriptionally regulated, a three-layer regulatory cascade consisting of the TF-regulators, the TFs and the regulated modules is subsequently considered. This three-layer cascade is then modeled quantitatively using artificial neural networks (ANNs), where the input layer corresponds to the expression of the up-stream transcription factors (TF-regulators) and the output layer corresponds to the expression of genes within each module. This work shows that (a) the expression of at least 33 genes over time and for different stress conditions is well predicted by the expression of the top layer transcription factors, including cases in which the effect of up-stream regulators is shifted in time and (b) identifies at least 6 novel regulatory interactions that were not previously associated with stress-induced changes in gene expression. These findings suggest that the combination of gene expression and protein-DNA interaction data with artificial neural networks can successfully model biological pathways and capture quantitative dependencies between distant regulators and downstream genes.

Keywords: gene modules, artificial neural networks, yeast, stress
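
A minimal sketch of the modeling idea, assuming random expression matrices: a small feed-forward ANN maps upstream TF-regulator expression to the expression of genes in one module, as the three-layer cascade described above suggests.

```python
# Minimal sketch: a feed-forward ANN mapping TF-regulator expression (inputs)
# to the expression of genes in a module (outputs).
# The expression matrices are random stand-ins, not the yeast datasets used above.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_conditions, n_regulators, n_module_genes = 120, 8, 15
tf_expr = rng.normal(size=(n_conditions, n_regulators))
W = rng.normal(size=(n_regulators, n_module_genes))
gene_expr = np.tanh(tf_expr @ W) + 0.1 * rng.normal(size=(n_conditions, n_module_genes))

X_tr, X_te, y_tr, y_te = train_test_split(tf_expr, gene_expr, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
ann.fit(X_tr, y_tr)
print("R^2 on held-out conditions:", round(ann.score(X_te, y_te), 2))
```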

8987 Design of Digital IIR Filters with the Advantages of Model Order Reduction Technique

Authors: K. Ramesh, A. Nirmalkumar, G. Gurusamy

Abstract:

In this paper, a new model order reduction phenomenon is introduced at the design stage of a linear-phase digital IIR filter. The complexity of a system can be reduced by adopting the model order reduction method in its design. A mixed method of model order reduction is proposed for the linear IIR filter. The proposed method employs the advantages of the factor division technique to derive the reduced-order denominator polynomial, and the reduced-order numerator is obtained based on the resultant denominator polynomial. The order reduction technique is used to reduce the delay units at the design stage of the IIR filter. The validity of the proposed method is illustrated with a design example in the frequency domain, and stability is also examined with the help of a Nyquist plot.

Keywords: Error index (J), Factor division method, IIR filter, Nyquist plot, Order reduction.

8986 Transmission Performance of Millimeter Wave Multiband OFDM UWB Wireless Signal over Fiber System

Authors: M. Mohamed, X. Zhang, K. Wu, M. Elfituri, A. Legnain

Abstract:

The performance of millimeter-wave (mm-wave) multiband orthogonal frequency division multiplexing (MB-OFDM) ultra-wideband (UWB) signal generation using a frequency quadrupling technique and transmission over fiber is experimentally investigated. The frequency quadrupling is achieved by using only one Mach-Zehnder modulator (MZM) that is biased at the maximum transmission (MATB) point. At the output, a frequency-quadrupled signal is obtained and then sent to a second MZM. This MZM is used for MB-OFDM UWB signal modulation. In this work, we demonstrate 30-GHz mm-wave wireless transmission that carries three-band OFDM UWB signals, and the error vector magnitude (EVM) is used to analyze the transmission quality. It is found that our proposed technique leads to an improvement of 3.5 dB in EVM at 40% of local oscillator (LO) modulation in comparison with the technique using two cascaded MZMs biased at the minimum transmission (MITB) point.

Keywords: Optical communication, Frequency up-conversion, Mach-Zehnder modulator, millimeter wave generation, radio over fiber

8985 Development of Prediction Models of Day-Ahead Hourly Building Electricity Consumption and Peak Power Demand Using the Machine Learning Method

Authors: Dalin Si, Azizan Aziz, Bertrand Lasternas

Abstract:

To encourage building owners to purchase electricity at the wholesale market and reduce building peak demand, this study aims to develop models that predict day-ahead hourly electricity consumption and demand using an artificial neural network (ANN) and a support vector machine (SVM). All prediction models are built in Python with the scikit-learn and PyBrain tools. The input data for both consumption and demand prediction are the time stamp, outdoor dry bulb temperature, relative humidity, air handling unit (AHU) supply air temperature and solar radiation. Solar radiation, which is unavailable a day ahead, is predicted first, and this estimate is then used as an input to predict consumption and demand. Models to predict consumption and demand are trained with both SVM and ANN, separately for cooling or heating and for weekdays or weekends. The results show that the ANN is the better option for both consumption and demand prediction. It achieves a coefficient of variation of the root mean square error (CVRMSE) of 15.50% to 20.03% for consumption prediction and 22.89% to 32.42% for demand prediction. To conclude, the presented models have the potential to help building owners purchase electricity at the wholesale market, but they are not robust when used in demand response control.

Keywords: Building energy prediction, data mining, demand response, electricity market.
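
A minimal sketch of the prediction and scoring setup, with synthetic weather and consumption data standing in for the building data: SVM and ANN regressors are trained and compared by CVRMSE.

```python
# Minimal sketch: day-ahead hourly consumption prediction with SVM and ANN,
# scored with CVRMSE = RMSE / mean(observed) * 100%.
# The weather/consumption data here are synthetic stand-ins, not the study's data.
import numpy as np
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

def cvrmse(y_true, y_pred):
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return 100.0 * rmse / np.mean(y_true)

rng = np.random.default_rng(0)
hour = rng.integers(0, 24, 2000)
temp = rng.normal(20, 8, 2000)
solar = rng.uniform(0, 800, 2000)
X = np.column_stack([hour, temp, solar])
y = 50 + 3 * np.sin(hour / 24 * 2 * np.pi) + 1.5 * temp + 0.02 * solar + rng.normal(0, 2, 2000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
models = [("SVM", make_pipeline(StandardScaler(), SVR(C=10.0))),
          ("ANN", make_pipeline(StandardScaler(),
                                MLPRegressor(hidden_layer_sizes=(20,),
                                             max_iter=3000, random_state=0)))]
for name, model in models:
    model.fit(X_tr, y_tr)
    print(name, "CVRMSE = %.2f%%" % cvrmse(y_te, model.predict(X_te)))
```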

8984 Holistic Face Recognition using Multivariate Approximation, Genetic Algorithms and AdaBoost Classifier: Preliminary Results

Authors: C. Villegas-Quezada, J. Climent

Abstract:

Several works on facial recognition have dealt with methods that identify isolated characteristics of the face or with templates that encompass several regions of it. In this paper, a new technique is introduced that approaches the problem holistically, dispensing with the need to identify geometrical characteristics or regions of the face. The characterization of a face is achieved by randomly sampling selected attributes of the pixels of its image. From this information we construct a data set corresponding to the values of low frequencies, gradient, entropy and several other pixel characteristics of the image, generating a set of p variables. The multivariate data set is approximated with polynomials that minimize the data fitness error in the minimax sense (L∞ norm). A genetic algorithm (GA) is used to circumvent the problem of dimensionality inherent in higher-degree polynomial approximations. The GA yields the degree and the coefficients of the polynomials approximating the image of a face. The system is trained by finding, through a resampling process, a family of characteristic polynomials in several variables (pixel characteristics) for each face (say Fi) in the database. A face (say F) is recognized by finding its characteristic polynomials and applying an AdaBoost classifier to compare F's polynomials with each of the Fi's polynomials. The winner is the polynomial family closest to F's, corresponding to the target face in the database.

Keywords: AdaBoost Classifier, Holistic Face Recognition, Minimax Multivariate Approximation, Genetic Algorithm.

8983 Analysis of Lead Time Delays in Supply Chain: A Case Study

Authors: Abdel-Aziz M. Mohamed, Nermeen Coutry

Abstract:

Lead time is a critical measure of a supply chain's performance. It impacts both customer satisfaction and the total cost of inventory. This paper presents the results of a study on the analysis of the customer order lead time for a multinational company. In the study, the lead time was divided into three stages: order entry, order fulfillment, and order delivery. A sample of 2,425 order lines was extracted from the company's records for this study. The sample data contain information regarding customer orders from the time of order entry until order delivery, as well as the lead time of each stage for different orders. Summary statistics on the lead time data reveal that about 30% of the orders were delivered later than the scheduled due date. The multiple linear regression analysis revealed that component type, logistics parameter, order size and customer type have significant impacts on lead time. Analysis of the lead time stages indicates that stage 2 consumed over 50% of the lead time. A Pareto analysis was performed to study the reasons for customer order delay in each stage. Recommendations were given to resolve the problem.

Keywords: Lead time reduction, customer satisfaction, service quality, statistical analysis.
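
To illustrate the regression step, the sketch below fits a multiple linear regression of lead time on order attributes using a synthetic data frame; the column names are illustrative assumptions, not the company's actual fields.

```python
# Minimal sketch: multiple linear regression of order lead time on order
# attributes. The data frame is a synthetic stand-in with assumed column names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "order_size":    rng.integers(1, 200, n),
    "customer_type": rng.choice(["OEM", "distributor", "retail"], n),
    "component":     rng.choice(["A", "B", "C"], n),
})
base = {"OEM": 5, "distributor": 8, "retail": 10}
df["lead_time_days"] = (df["customer_type"].map(base)
                        + 0.03 * df["order_size"]
                        + rng.normal(0, 2, n))

model = smf.ols("lead_time_days ~ order_size + C(customer_type) + C(component)",
                data=df).fit()
print(model.summary().tables[1])   # coefficients and p-values
```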

8982 Using Pattern Search Methods for Minimizing Clustering Problems

Authors: Parvaneh Shabanzadeh, Malik Hj Abu Hassan, Leong Wah June, Maryam Mohagheghtabar

Abstract:

Clustering is an interesting data mining topic that can be applied in many fields. Recently, the problem of cluster analysis has been formulated as a problem of nonsmooth, nonconvex optimization, and an algorithm for solving the cluster analysis problem based on nonsmooth optimization techniques has been developed. This optimization problem has a number of characteristics that make it challenging: it has many local minima, the optimization variables can be either continuous or categorical, and there are no exact analytical derivatives. In this study we show how to apply a particular class of optimization methods known as pattern search methods to address these challenges. These methods do not explicitly use derivatives, an important feature that has not been addressed in previous studies. Results of numerical experiments are presented which demonstrate the effectiveness of the proposed method.

Keywords: Clustering functions, Non-smooth Optimization, Nonconvex Optimization, Pattern Search Method.
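
A minimal sketch of the idea, not the authors' algorithm: a derivative-free compass (pattern) search minimizes the nonsmooth clustering objective (sum of squared distances to the nearest center) over the cluster centers.

```python
# Minimal sketch: a derivative-free compass (pattern) search minimizing the
# nonsmooth clustering objective sum_i min_k ||x_i - c_k||^2 over cluster centers.
import numpy as np

def clustering_objective(centers, X, k):
    C = centers.reshape(k, X.shape[1])
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)   # n x k squared distances
    return d2.min(axis=1).sum()

def pattern_search(f, x0, step=1.0, tol=1e-4, max_iter=10000):
    x, fx = x0.copy(), f(x0)
    for _ in range(max_iter):
        improved = False
        for i in range(x.size):                  # poll the 2n compass directions
            for sign in (+1.0, -1.0):
                trial = x.copy()
                trial[i] += sign * step
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5                          # shrink the mesh on failure
            if step < tol:
                break
    return x, fx

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
k = 2
x0 = X[rng.choice(len(X), k)].ravel()            # random initial centers
centers, value = pattern_search(lambda c: clustering_objective(c, X, k), x0)
print(centers.reshape(k, 2), round(value, 3))
```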

8981 Multilayer Neural Network and Fuzzy Logic Based Software Quality Prediction

Authors: Sadaf Sahar, Usman Qamar, Sadaf Ayaz

Abstract:

In the software development lifecycle, quality prediction techniques hold prime importance in order to minimize future design errors and expensive maintenance. There are many techniques proposed by various researchers, but with the increasing complexity of the software lifecycle model, it is crucial to develop a flexible system which can cater for the factors which in turn have an impact on the quality of the end product. These factors include properties of the software development process and the product along with its operating conditions. In this paper, a neural network (perceptron) based software quality prediction technique is proposed. Using this technique, stakeholders can predict the quality of the resulting software during the early phases of the lifecycle, saving time and resources on future elimination of design errors and costly maintenance. This technique can be brought into practical use through successful training.

Keywords: Software quality, fuzzy logic, perceptron, prediction.

8980 Fabrication of Wearable Antennas through Thermal Deposition

Authors: Jeff Letcher, Dennis Tierney, Haider Raad

Abstract:

Antennas are devices for transmitting and/or receiving signals, which makes them a necessary component of any wireless system. In this paper, a thermal deposition technique is utilized as a method to fabricate antenna structures on substrates. Thin-film deposition is achieved by evaporating a source material (metals in our case) in a vacuum, which allows vapor particles to travel directly to the target substrate, which is covered with a mask that outlines the desired structure. The material then condenses back to the solid state. This method is compared with screen printing, chemical etching, and inkjet printing to indicate its advantages and disadvantages. The fabricated antenna undergoes testing of frequency ranges, conductivity, and a series of flexing to indicate the effectiveness of the thermal deposition technique. A single-band antenna operating at 2.45 GHz, intended for wearable and flexible applications, was successfully fabricated through this method and tested. It is concluded that thermal deposition presents a feasible technique for producing such antennas.

Keywords: Thermal deposition, wearable antennas, Bluetooth technology, flexible electronics.

8979 Experimental Evaluation of Methane Adsorption on Granular Activated Carbon (GAC) and Determination of Model Isotherm

Authors: M. Delavar, A.A. Ghoreyshi, M. Jahanshahi, M. Irannejad

Abstract:

This study investigates the capacity of granular activated carbon (GAC) for the storage of methane through equilibrium adsorption. An experimental apparatus consisting of a dual adsorption vessel was set up for the measurement of the equilibrium adsorption of methane on GAC using the volumetric (pressure decay) technique. Experimental isotherms of methane adsorption were determined by measuring the equilibrium uptake of methane at different pressures (0-50 bar) and temperatures (285.15-328.15 K). The experimental data were fitted to the Freundlich and Langmuir equations to determine the model isotherm. The results show that the experimental data are equally well fitted by both model isotherms. Using the experimental data obtained at different temperatures, the isosteric heat of methane adsorption was also calculated with the Clausius-Clapeyron equation from the Sips isotherm model. The results show that decreasing temperature or increasing methane uptake by GAC decreases the isosteric heat of methane adsorption.

Keywords: Methane adsorption, activated carbon, model isotherm, isosteric heat.
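
A minimal sketch of the isotherm-fitting step: the Langmuir form q = q_m·K·P/(1 + K·P) and the Freundlich form q = K_F·P^(1/n) are fitted with nonlinear least squares to illustrative uptake data, not the paper's measurements.

```python
# Minimal sketch: fitting Langmuir and Freundlich isotherms,
#   q = q_m * K * P / (1 + K * P)   and   q = K_F * P**(1/n),
# to equilibrium uptake data with nonlinear least squares.
# The pressure/uptake points are illustrative, not the paper's measurements.
import numpy as np
from scipy.optimize import curve_fit

P = np.array([1, 5, 10, 20, 30, 40, 50], dtype=float)   # pressure (bar)
q = np.array([0.9, 3.2, 5.0, 6.9, 7.8, 8.3, 8.6])       # uptake (arbitrary units)

def langmuir(P, qm, K):
    return qm * K * P / (1.0 + K * P)

def freundlich(P, KF, n):
    return KF * P ** (1.0 / n)

(qm, K), _ = curve_fit(langmuir, P, q, p0=[10.0, 0.1])
(KF, n), _ = curve_fit(freundlich, P, q, p0=[1.0, 2.0])
print(f"Langmuir:   q_m = {qm:.2f}, K = {K:.3f}")
print(f"Freundlich: K_F = {KF:.2f}, n = {n:.2f}")
```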

8978 Government (Big) Data Ecosystem: Definition, Classification of Actors, and Their Roles

Authors: Syed Iftikhar Hussain Shah, Vasilis Peristeras, Ioannis Magnisalis

Abstract:

Organizations, including governments, generate (big) data that are high in volume, velocity and veracity, and come from a variety of sources. Public administrations are using (big) data, implementing base registries, and enforcing data sharing within the entire government to deliver (big) data related integrated services, provide insights to users, and support good governance. Government (big) data ecosystem actors represent distinct entities that provide data, consume data, manipulate data to offer paid services, and extend data services such as data storage and hosting services to other actors. In this research work, we perform a systematic literature review. The key objectives of this paper are to propose a robust definition of the government (big) data ecosystem and a classification of government (big) data ecosystem actors and their roles. We present a graphical view of the actors, their roles, and their relationships in the government (big) data ecosystem, and we discuss our research findings. We found few published research articles about the government (big) data ecosystem, including its definition and the classification of actors and their roles. Therefore, we borrowed ideas for the government (big) data ecosystem from numerous areas in the literature, including scientific research data, humanitarian data, open government data, and industry data.

Keywords: Big data, big data ecosystem, classification of big data actors, big data actors roles, definition of government (big) data ecosystem, data-driven government, eGovernment, gaps in data ecosystems, government (big) data, public administration, systematic literature review.

8977 An Improvement of Multi-Label Image Classification Method Based on Histogram of Oriented Gradient

Authors: Ziad Abdallah, Mohamad Oueidat, Ali El-Zaart

Abstract:

Image Multi-label Classification (IMC) assigns a label or a set of labels to an image. The large demand for image annotation and archiving on the web has attracted researchers to develop many algorithms for this application domain. The existing techniques for IMC have two drawbacks: the description of the elementary characteristics of the image and the correlation between labels are not taken into account. In this paper, we present an algorithm (MIML-HOGLPP) which simultaneously handles these limitations. The algorithm uses the histogram of oriented gradients as the feature descriptor. It applies the Label Priority Power-set as a multi-label transformation to solve the problem of label correlation. The experiments show that the results of MIML-HOGLPP are better in terms of several evaluation metrics compared with the two existing techniques.

Keywords: Data mining, information retrieval system, multi-label, problem transformation, histogram of gradients.
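
A minimal sketch of the two ingredients, HOG features and a label power-set transformation, on synthetic images; the classifier choice is an assumption, and this is not the full MIML-HOGLPP algorithm.

```python
# Minimal sketch: HOG features plus a label power-set transformation for
# multi-label image classification. Images and labels are synthetic stand-ins,
# and the classifier choice is an assumption, not the paper's exact setup.
import numpy as np
from skimage.feature import hog
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
images = rng.random((60, 32, 32))                    # fake grayscale images
Y = rng.integers(0, 2, size=(60, 3))                 # 3 binary labels per image

X = np.array([hog(im, pixels_per_cell=(8, 8), cells_per_block=(2, 2)) for im in images])

# Label power-set: each distinct label combination becomes one class.
combos = {tuple(row) for row in Y}
combo_to_class = {c: i for i, c in enumerate(sorted(combos))}
class_to_combo = {i: c for c, i in combo_to_class.items()}
y_lp = np.array([combo_to_class[tuple(row)] for row in Y])

clf = LogisticRegression(max_iter=2000).fit(X, y_lp)
pred_combo = class_to_combo[int(clf.predict(X[:1])[0])]   # map back to a label set
print(pred_combo)
```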

8976 Development of a Technology Assessment Model by Patents and Customers' Review Data

Authors: Kisik Song, Sungjoo Lee

Abstract:

Recent years have seen an increasing number of patent disputes due to excessive competition in the global market and a reduced technology life-cycle; this has increased the risk of investment in technology development. While many global companies have started developing methodologies to identify promising technologies and assess them for investment decisions, the existing methodologies still have some limitations. Post hoc assessments of new technologies are not being performed, especially to determine whether the suggested technologies turned out to be promising. For example, in existing quantitative patent analysis, a patent’s citation information has served as an important metric for quality assessment, but this analysis cannot be applied to recently registered patents because such information accumulates over time. Therefore, we propose a new technology assessment model that can replace citation information and positively affect technological development, based on post hoc analysis of the patents for promising technologies. Additionally, we collect customer reviews on a target technology to extract keywords that reflect the customers’ needs, and we determine how many of these keywords are covered by the new technology. Finally, we construct a portfolio based on a technology assessment from patent information and a customer-based marketability assessment based on review data, and we use them to visualize the characteristics of the new technologies.

Keywords: Technology assessment, patents, citation information, opinion mining.

8975 Efficiency Evaluation of E-Commerce Websites

Authors: A. K. Abd El-Aleem, W. F. Abd El-wahed, N. A. Ismail, F. A. Torkey

Abstract:

This study suggests a model with a new set of evaluation criteria to measure the efficiency of real-world e-commerce websites. The evaluation criteria include website design, usability and performance, and the Data Envelopment Analysis (DEA) technique is used to measure website efficiency. An efficient website is defined as a site that generates the most outputs using the smallest amount of inputs. Inputs refer to measurements representing the amount of effort required to build, maintain and operate the site. Output is the amount of traffic the site generates, measured as the average number of daily hits and the average number of daily unique visitors.

Keywords: Data Envelopment Analysis, E-commerce, Efficiency.
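
A minimal sketch of the DEA step, assuming three hypothetical websites with made-up input/output values: the input-oriented CCR multiplier model is solved as a linear program for each DMU.

```python
# Minimal sketch: the CCR (input-oriented, multiplier form) DEA model solved as
# a linear program with scipy. The websites and their input/output values are
# made-up illustrative numbers, not measurements from the study.
import numpy as np
from scipy.optimize import linprog

# rows = DMUs (websites); inputs: build effort, maintenance effort
X = np.array([[5.0, 3.0], [8.0, 6.0], [4.0, 5.0]])                # inputs  (m = 2)
# outputs: average daily hits, average daily unique visitors
Y = np.array([[900.0, 300.0], [1000.0, 350.0], [700.0, 280.0]])   # outputs (s = 2)
n, m = X.shape
s = Y.shape[1]

for o in range(n):
    c = np.concatenate([-Y[o], np.zeros(m)])                      # maximize u . y_o
    A_ub = np.hstack([Y, -X])                                     # u . y_j - v . x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)     # v . x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m), method="highs")
    print(f"DMU {o}: efficiency = {-res.fun:.3f}")
```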

8974 Enhancing Temporal Extrapolation of Wind Speed Using a Hybrid Technique: A Case Study in West Coast of Denmark

Authors: B. Elshafei, X. Mao

Abstract:

The demand for renewable energy is increasing significantly, and major investments are being directed to the wind power generation industry as a leading source of clean energy. The wind energy sector is entirely dependent on and driven by the prediction of wind speed, which is by nature highly stochastic and random. This study employs deep multi-fidelity Gaussian process regression to predict wind speeds over medium-term time horizons. Data from the RUNE experiment on the west coast of Denmark were provided by the Technical University of Denmark and represent the wind speed across the study area for the period between December 2015 and March 2016. The study investigates the effect of pre-processing the data by denoising the signal using the empirical wavelet transform (EWT) and of including the vector components of wind speed to increase the number of input data layers for data fusion using deep multi-fidelity Gaussian process regression (GPR). The outcomes were compared using the root mean square error (RMSE), and the results demonstrated a significant increase in prediction accuracy: using the vector components of the wind speed as additional predictors yields more accurate predictions than strategies that ignore them, reflecting the importance of including all sub-data and pre-processing the signals for wind speed forecasting models.

Keywords: Data fusion, Gaussian process regression, signal denoise, temporal extrapolation.
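
A minimal sketch of the regression idea, with a single-fidelity Gaussian process standing in for the deep multi-fidelity GPR and a synthetic wind series in place of the RUNE data.

```python
# Minimal sketch: Gaussian process regression for short-horizon wind speed
# prediction. A single-fidelity GP with an RBF kernel stands in for the deep
# multi-fidelity GPR of the study; the wind series is synthetic, not RUNE data.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
t = np.linspace(0, 48, 200)[:, None]                        # hours
wind = 8 + 2 * np.sin(2 * np.pi * t.ravel() / 24) + rng.normal(0, 0.4, 200)

train = t.ravel() <= 36                                     # fit on the first 36 h
kernel = 1.0 * RBF(length_scale=5.0) + WhiteKernel(noise_level=0.2)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t[train], wind[train])

mean, std = gpr.predict(t[~train], return_std=True)        # extrapolate the last 12 h
rmse = np.sqrt(np.mean((mean - wind[~train]) ** 2))
print(f"RMSE on the held-out horizon: {rmse:.2f} m/s")
```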

8973 A New Fast Skin Color Detection Technique

Authors: Tarek M. Mahmoud

Abstract:

Skin color can provide a useful and robust cue for human-related image analysis, such as face detection, pornographic image filtering, hand detection and tracking, and people retrieval in databases and on the Internet. The major problem with such skin color detection algorithms is that they are time consuming and hence cannot be applied to real-time systems. To overcome this problem, we introduce a new fast technique for skin detection which can be applied in a real-time system. In this technique, instead of testing each image pixel to label it as skin or non-skin (as in classic techniques), we skip a set of pixels. The rationale for the skipping process is the high probability that neighbors of skin-color pixels are also skin pixels, especially in adult images, and vice versa. The proposed method can rapidly detect skin and non-skin color pixels, which in turn dramatically reduces the CPU time required for the protection process. Since many fast detection techniques are based on image resizing, we apply our proposed pixel skipping technique together with image resizing to obtain better results. The performance evaluation of the proposed skipping and hybrid techniques in terms of the measured CPU time is presented. Experimental results demonstrate that the proposed methods achieve better results than the relevant classic method.

Keywords: Adult images filtering, image resizing, skin color detection, YCbCr color space.
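
A minimal sketch of the pixel-skipping idea in the YCbCr space: every k-th pixel is classified by fixed chrominance thresholds (common illustrative values, not necessarily the paper's), and each decision is propagated to its skipped neighbours.

```python
# Minimal sketch: YCbCr-threshold skin detection with pixel skipping.
# Every k-th pixel is classified; each decision is then propagated to its
# skip x skip neighbourhood. Thresholds are common illustrative values,
# not necessarily those of the paper.
import numpy as np

def rgb_to_ycbcr(img):
    img = img.astype(np.float32)
    y  =  0.299 * img[..., 0] + 0.587 * img[..., 1] + 0.114 * img[..., 2]
    cb = 128 - 0.168736 * img[..., 0] - 0.331264 * img[..., 1] + 0.5 * img[..., 2]
    cr = 128 + 0.5 * img[..., 0] - 0.418688 * img[..., 1] - 0.081312 * img[..., 2]
    return y, cb, cr

def detect_skin(img, skip=4):
    _, cb, cr = rgb_to_ycbcr(img)
    cb_s, cr_s = cb[::skip, ::skip], cr[::skip, ::skip]      # sample every k-th pixel
    sub = (cb_s >= 77) & (cb_s <= 127) & (cr_s >= 133) & (cr_s <= 173)
    # Skipped pixels are assumed to share their sampled neighbour's label.
    mask = np.repeat(np.repeat(sub, skip, axis=0), skip, axis=1)
    return mask[:img.shape[0], :img.shape[1]]

img = np.random.randint(0, 256, (120, 160, 3), dtype=np.uint8)
print("skin pixel ratio:", detect_skin(img).mean())
```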

8972 Non-parametric Linear Technique for Measuring the Efficiency of Winter Road Maintenance in the Arctic Area

Authors: Mahshid Hatamzad, Geanette Polanco

Abstract:

Improving the performance of Winter Road Maintenance (WRM) can increase traffic safety and reduce costs as well as environmental impacts. This study evaluates the efficiency of a WRM technique, namely salting, in the Arctic area by using Data Envelopment Analysis (DEA), a non-parametric linear method for measuring the efficiency of decision-making units (DMUs) that handles multiple inputs and multiple outputs simultaneously when their associated weights are not known. Here, roads are considered as the DMUs for which the efficiency must be determined. The three input variables considered are traffic flow, road area and WRM cost. The two output variables are the level of road safety and the environmental impact resulting from WRM, which is also considered an uncontrollable factor in the second scenario. The results rank the DMUs from the most efficient WRM to the least efficient one, and this information provides decision makers with technical support and suggested improvements for inefficient WRM, in order to achieve cost-effective WRM and safe road transportation during wintertime in Arctic areas.

Keywords: DEA, environmental impacts, risk and safety, WRM.
