Search results for: search.
459 A Fuzzy-Rough Feature Selection Based on Binary Shuffled Frog Leaping Algorithm
Authors: Javad Rahimipour Anaraki, Saeed Samet, Mahdi Eftekhari, Chang Wook Ahn
Abstract:
Feature selection and attribute reduction are crucial problems and widely used techniques in machine learning, data mining, and pattern recognition for overcoming the well-known Curse of Dimensionality. This paper presents a feature selection method that efficiently carries out attribute reduction, thereby selecting the most informative features of a dataset. It consists of two components: 1) a measure for feature subset evaluation, and 2) a search strategy. For the evaluation measure, we have employed the fuzzy-rough dependency degree (FRDD) of the lower approximation-based fuzzy-rough feature selection (L-FRFS) due to its effectiveness in feature selection. As for the search strategy, a modified binary shuffled frog leaping algorithm (B-SFLA) is proposed. The proposed feature selection method is obtained by hybridizing the B-SFLA with the FRDD. Nine classifiers have been employed to compare the proposed approach with several existing methods over twenty-two datasets, including nine high-dimensional and large ones, from the UCI repository. The experimental results demonstrate that the B-SFLA approach significantly outperforms other metaheuristic methods in terms of the number of selected features and the classification accuracy.
Keywords: Binary shuffled frog leaping algorithm, feature selection, fuzzy-rough set, minimal reduct.
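As a rough illustration of how a binary shuffled frog leaping search can be wrapped around a subset-evaluation measure such as a dependency degree, the sketch below uses a toy fitness function in place of the FRDD; the memeplex layout, leap rule, and all parameter values are assumptions for illustration, not the authors' implementation.

```python
import random

def b_sfla_feature_selection(fitness, n_features, n_frogs=20, n_memeplexes=4,
                             n_iters=30, n_local=5, seed=0):
    """Binary shuffled frog leaping search over feature subsets (sketch).

    `fitness` maps a tuple of 0/1 flags (selected features) to a score to
    maximise, e.g. a fuzzy-rough dependency degree penalised by subset size.
    """
    rng = random.Random(seed)
    new_frog = lambda: tuple(rng.randint(0, 1) for _ in range(n_features))
    frogs = [new_frog() for _ in range(n_frogs)]

    def leap(worst, leader):
        # Copy each leader bit with probability 0.5, otherwise keep the old bit.
        return tuple(b if rng.random() < 0.5 else w for w, b in zip(worst, leader))

    for _ in range(n_iters):
        frogs.sort(key=fitness, reverse=True)           # best first
        global_best = frogs[0]
        # Deal frogs into memeplexes like cards: 0,1,2,3,0,1,2,3,...
        memeplexes = [frogs[m::n_memeplexes] for m in range(n_memeplexes)]
        for mem in memeplexes:
            for _ in range(n_local):
                mem.sort(key=fitness, reverse=True)
                worst = mem[-1]
                for leader in (mem[0], global_best):    # local leader, then global
                    cand = leap(worst, leader)
                    if fitness(cand) > fitness(worst):
                        worst = cand
                        break
                else:
                    worst = new_frog()                  # censor: random restart
                mem[-1] = worst
        frogs = [f for mem in memeplexes for f in mem]  # shuffle memeplexes back
    return max(frogs, key=fitness)

# Toy evaluation measure: reward hitting a hidden "informative" subset, penalise size.
target = {1, 4, 7}
def toy_fitness(mask):
    chosen = {i for i, b in enumerate(mask) if b}
    return len(chosen & target) - 0.1 * len(chosen)

print(b_sfla_feature_selection(toy_fitness, n_features=10))
```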
458 In Search of Innovation: Exploring the Dynamics of Innovation
Authors: Michal Lysek, Mike Danilovic, Jasmine Lihua Liu
Abstract:
HMS Industrial Networks AB has been recognized as one of the most innovative companies in the industrial communication industry worldwide. The creation of their Anybus innovation during the 1990s contributed considerably to the company’s success. From inception, HMS’ employees were innovating for the purpose of creating new business (the creation phase). After the Anybus innovation, they began the process of internationalization (the commercialization phase), which in turn led them to concentrate on cost reduction, product quality, delivery precision, operational efficiency, and increasing growth (the growth phase). As a result of this transformation, performing new radical innovations has become more complicated. The purpose of our research was to explore the dynamics of innovation at HMS from the aspect of key actors, activities, and events over the three phases, in order to understand what led to the creation of their Anybus innovation, and why it has become increasingly challenging for HMS to create new radical innovations for the future. Our research methodology was based on a longitudinal, retrospective study from the inception of HMS in 1988 to 2014, a single case study inspired by the grounded theory approach. We conducted 47 interviews and collected 1,024 historical documents for our research. Our analysis revealed that HMS’ success in creating the Anybus, and in developing a successful business around the innovation, was based on three main capabilities: cultivating customer relations on different managerial and organizational levels, inspiring business relations, and balancing complementary human assets for the purpose of business creation. The success of HMS has turned the management’s attention away from the past activities of key actors, their behavior, and how they influenced and stimulated the creation of radical innovations. Nowadays, they are rhetorically focusing on creativity and innovation, while their real actions put emphasis on growth, cost reduction, product quality, delivery precision, operational efficiency, and moneymaking. In the process of becoming an international company, HMS gradually refocused. In so doing they became profitable and successful, but they also forgot what made them innovative in the first place. Fortunately, HMS’ management has come to realize that this is the case, and they are now in search of recapturing innovation once again. Our analysis indicates that HMS’ management is facing several barriers to innovation related to path dependency and other lock-in phenomena. HMS’ management has been captured, trapped in their mindset and actions, by the success of the past. But now their future has to be secured, and they have come to realize that moneymaking is not everything. In recent years, HMS’ management has begun to search for innovation once more, in order to recapture their past capabilities for creating radical innovations. They need to unlock their managerial perceptions of customer needs and their counter-innovation-driven activities and events in order to utilize the full potential of their employees and capture the innovation opportunity for the future.
Keywords: Barriers to innovation, dynamics of innovation, in search of excellence and innovation, radical innovation.
457 Information Retrieval: A Comparative Study of Textual Indexing Using an Oriented Object Database (db4o) and the Inverted File
Authors: Mohammed Erritali
Abstract:
The growth in the volume of text data, such as the books and articles that libraries have accumulated for centuries, has made it necessary to establish effective mechanisms to locate them. Early techniques such as abstracting, indexing, and the use of classification categories marked the birth of a new field of research called "Information Retrieval". Information Retrieval (IR) can be defined as the task of defining models and systems whose purpose is to facilitate access to a set of documents in electronic form (a corpus) and to allow a user to find the documents relevant to him or her, that is to say, the content that matches the user's information needs. Most information retrieval models use a specific data structure to index a corpus, called the "inverted file" or "reverse index". This inverted file collects information on all terms of the corpus documents, specifying the identifiers of the documents that contain the term in question, the frequency of the term in each document of the corpus, the positions of the occurrences of the word, and so on. In this paper we use an object-oriented database (db4o) instead of the inverted file; that is to say, instead of searching for a term in the inverted file, we search for it in the db4o database. The purpose of this work is to carry out a comparative study to see whether object-oriented databases can compete with the inverted index in terms of access speed and resource consumption when using a large volume of data.
Keywords: Information Retrieval, indexing, object-oriented database (db4o), inverted file.
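For reference, a minimal sketch of the inverted-file structure discussed in the abstract, mapping each term to the documents that contain it, the per-document frequency, and the positions of its occurrences; the tokenisation and the toy corpus are deliberately simplistic.

```python
from collections import defaultdict

def build_inverted_index(corpus):
    """corpus: dict doc_id -> text. Returns term -> doc_id -> list of positions."""
    index = defaultdict(lambda: defaultdict(list))
    for doc_id, text in corpus.items():
        for pos, term in enumerate(text.lower().split()):
            index[term][doc_id].append(pos)
    return index

def search(index, term):
    postings = index.get(term.lower(), {})
    # Term frequency in each document is the length of its position list.
    return {doc_id: len(positions) for doc_id, positions in postings.items()}

corpus = {1: "information retrieval with an inverted file",
          2: "the inverted file is an index structure",
          3: "object databases store objects"}
idx = build_inverted_index(corpus)
print(search(idx, "inverted"))   # {1: 1, 2: 1}
print(idx["inverted"][2])        # positions of "inverted" in document 2 -> [1]
```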
456 A Distributed Cognition Framework to Compare E-Commerce Websites Using Data Envelopment Analysis
Authors: C. lo Storto
Abstract:
This paper presents an approach based on the adoption of a distributed cognition framework and a non-parametric multicriteria evaluation methodology (DEA) designed specifically to compare e-commerce websites from the consumer/user viewpoint. In particular, the framework considers a website's relative efficiency as a measure of its quality and usability. A website is modelled as a black box capable of providing the consumer/user with a set of functionalities. When the consumer/user interacts with the website to perform a task, he/she is involved in a cognitive activity, sustaining a cognitive cost to search, interpret, and process information, and experiencing a sense of satisfaction. The degree of ambiguity and uncertainty he/she perceives and the required search time determine the size of the effort, and hence the cognitive cost, that he/she has to sustain to perform the task. By contrast, performing the task and achieving the result induce a sense of gratification, satisfaction, and usefulness. In total, nine variables are measured, classified into three website macro-dimensions (user experience, site navigability, and structure). The framework is applied to compare 40 websites of businesses performing electronic commerce in the information technology market. A questionnaire to collect subjective judgements for the websites in the sample was purposely designed and administered to 85 university students enrolled in computer science and information systems engineering undergraduate courses.
Keywords: Website, e-commerce, DEA, distributed cognition, evaluation, comparison.
455 An Activity Based Trajectory Search Approach
Authors: Mohamed Mahmoud Hasan, Hoda M. O. Mokhtar
Abstract:
With the enormous growth in mobile application use and the spread of positioning and location-aware technologies that we are seeing today, new procedures and methodologies for location-based services are required. Location recommendation is one of the most demanded location-aware applications, especially given the wide availability of location-aware social network applications such as Facebook check-ins and Foursquare. In this paper, we present a new methodology for location recommendation. The proposed approach combines conventional spatial attributes with other essential components, including shortest distance and user interests. We also introduce the notion of an "activity trajectory", a trajectory that fulfills the set of activities that the user is interested in doing. The approach uses the associated distance value to select the trajectory(ies) with the minimum cost (distance), and a spatial area to prune unneeded directions. The proposed algorithm uses the activity trajectory concept to recommend the N most similar trajectory(ies) that match the user's required activity pattern with the least travelling distance. To improve the performance of the proposed approach, parallel processing is applied through a MapReduce-based approach. Experiments based on real datasets were set up and run to assess the proposed approach. The presented tests show how the proposed approach beats other strategies, giving better precision and run time.
Keywords: Location-based recommendation, map-reduce, recommendation system, trajectory search.
454 The Importance of Zenithal Lighting Systems for Natural Light Gains and for Local Energy Generation in Brazil
Authors: Ana Paula Esteves, Diego S. Caetano, Louise L. B. Lomardo
Abstract:
This paper presents an approach to the advantages of using adequate roofing in the zenithal lighting typology in various areas of architectural production, while at the same time drawing attention to the design concerns inherent in this choice of roofing in Brazil. Understanding that sustainability needs to cover several aspects, a roofing system such as a zenithal lighting system can contribute to the provision of better-quality natural light for the interior of the building, which is related to good health and welfare; it can also contribute to sustainability and environmental needs when it allows the generation of energy through semi-transparent or opaque photovoltaic solutions and saves on artificial lighting. When the energy balance in the building is positive, that is, when the building generates more energy than it consumes, it may fit the Net Zero Energy Building concept. Zenithal lighting systems could be an important ally in Brazil: once the burden of heat gains is resolved, they can participate in the set of pro-efficiency actions in search of "zero energy buildings". The paper presents a comparison of three buildings that have used this feature in search of better environmental performance, both in lighting comfort and in sustainability as a whole. Two of these buildings are examples in Europe: the Notley Green School in the UK and the Isofóton factory in Spain. The third building, which follows these shed-roof principles, is located in Brazil: the Ipel factory in São Paulo.
Keywords: Natural lighting, net zero energy building, sheds, semi-transparent photovoltaics.
453 Exercise and Cognitive Function: Time Course of the Effects
Authors: Simon B. Cooper, Stephan Bandelow, Maria L. Nute, John G. Morris, Mary E. Nevill
Abstract:
Previous research has indicated a variable effect of exercise on adolescents’ cognitive function. However, comparisons between studies are difficult to make due to differences in: the mode, intensity, and duration of exercise employed; the components of cognitive function measured (and the tests used to assess them); and the timing of the cognitive function tests in relation to the exercise. Therefore, the aim of the present study was to assess the time course (10 and 60 min post-exercise) of the effects of 15 min of intermittent exercise on cognitive function in adolescents. Forty-five adolescents were recruited to participate in the study and completed two main trials (exercise and resting) in a counterbalanced crossover design. Participants completed 15 min of intermittent exercise (in cycles of 1 min exercise, 30 s rest). A battery of computer-based cognitive function tests (Stroop test, Sternberg paradigm, and visual search test) was completed 30 min pre- and 10 and 60 min post-exercise (to assess attention, working memory, and perception, respectively). The findings of the present study indicate that on the baseline level of the Stroop test, response times 10 min following exercise were slower than at any other time point on either trial (trial by session time interaction, p = 0.0308). However, this slowing of responses also tended to produce enhanced accuracy 10 min post-exercise on the baseline level of the Stroop test (trial by session time interaction, p = 0.0780). Similarly, on the complex level of the visual search test there was a slowing of response times 10 min post-exercise (trial by session time interaction, p = 0.0199). However, this was not coupled with an improvement in accuracy (trial by session time interaction, p = 0.2349). The mid-morning bout of exercise did not affect response times or accuracy across the morning on the Sternberg paradigm. In conclusion, the findings of the present study suggest an equivocal effect of exercise on adolescents' cognitive function. The mid-morning bout of exercise appears to cause a speed-accuracy trade-off immediately following exercise on the Stroop test (participants become slower but more accurate), whilst slowing response times on the visual search test and having no effect on performance on the Sternberg paradigm. Furthermore, this work highlights the importance of the timing of the cognitive function tests relative to the exercise and of the components of cognitive function examined in future studies.
Keywords: Adolescents, cognitive function, exercise.
452 Dynamic Bayesian Networks Modeling for Inferring Genetic Regulatory Networks by Search Strategy: Comparison between Greedy Hill Climbing and MCMC Methods
Authors: Huihai Wu, Xiaohui Liu
Abstract:
Using Dynamic Bayesian Networks (DBN) to model genetic regulatory networks from gene expression data is one of the major paradigms for inferring the interactions among genes. Averaging over a collection of models for predicting the network is preferable to relying on a single high-scoring model. In this paper, two kinds of model search approaches are compared: Greedy hill-climbing Search with Restarts (GSR) and Markov Chain Monte Carlo (MCMC) methods. GSR is preferred in many papers, but there has been no comparative study of which approach is better for DBN models. Different types of experiments have been carried out to benchmark these approaches. Our experimental results demonstrate that, on average, the MCMC methods outperform GSR in the accuracy of the predicted network, while having comparable time efficiency. By proposing different variations of MCMC and employing a simulated annealing strategy, the MCMC methods become more efficient and stable. Apart from the comparison between these approaches, another objective of this study is to investigate the feasibility of using DBN modeling approaches for inferring gene networks from a few snapshots of high-dimensional gene profiles. Through synthetic data experiments as well as systematic data experiments, the results reveal how the performance of these approaches is influenced as the target gene network varies in network size, data size, and system complexity.
Keywords: Genetic regulatory network, Dynamic Bayesian network, GSR, MCMC.
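A schematic sketch of the greedy hill-climbing with restarts idea (GSR): starting from a random edge set, repeatedly apply the single-edge change that improves a network score, restarting from a new random structure when no further improvement is possible. The scoring function below is a toy stand-in; a real DBN implementation would score candidate structures against the expression data (for example with a BDe or BIC score).

```python
import itertools, random

def hill_climb_with_restarts(nodes, score, n_restarts=5, seed=0):
    """Greedy search over directed edge sets (a stand-in for DBN structures).

    `score` maps a frozenset of (parent, child) edges to a number to maximise.
    """
    rng = random.Random(seed)
    all_edges = [(a, b) for a, b in itertools.permutations(nodes, 2)]
    best_overall, best_overall_score = frozenset(), score(frozenset())

    for _ in range(n_restarts):
        current = frozenset(e for e in all_edges if rng.random() < 0.2)  # random start
        current_score = score(current)
        improved = True
        while improved:
            improved = False
            # Neighbourhood: add or delete a single edge.
            for edge in all_edges:
                cand = current - {edge} if edge in current else current | {edge}
                cand_score = score(cand)
                if cand_score > current_score:
                    current, current_score, improved = cand, cand_score, True
        if current_score > best_overall_score:
            best_overall, best_overall_score = current, current_score
    return best_overall, best_overall_score

# Toy score: reward edges of a hidden "true" network, penalise extra edges.
true_net = {("g1", "g2"), ("g2", "g3")}
toy_score = lambda edges: 2 * len(edges & true_net) - len(edges - true_net)

print(hill_climb_with_restarts(["g1", "g2", "g3"], toy_score))
```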
451 Realistic Simulation Methodology in Brazil’s New Medical Education Curriculum: Potentialities
Authors: Cleto J. Sauer Jr
Abstract:
Introduction: Brazil’s new national curriculum guidelines (NCG) for medical education were published in 2014, presenting active learning methodologies as a cornerstone. Simulation was initially applied to the training of aviation pilots and is currently applied in the health sciences. The high-fidelity simulator replicates human body anatomy in detail, also reproducing physiological functions, and its use is increasing in medical schools. Realistic Simulation (RS) has pedagogical aspects that are aligned with the teaching concepts of Brazil’s NCG. The main objective of this study is to carry out a narrative review of the aspects of RS that are aligned with the teaching concepts of Brazil’s new NCG. Methodology: A narrative review was conducted, with searches in three databases (PubMed, Embase and BVS) for studies published between 2010 and 2020. Results: After the systematized search, 49 studies were selected and divided into four thematic groups. RS is aligned with the new Brazilian medical curriculum as it is an active learning methodology, providing greater patient safety, uniform teaching, and enhancement of students' emotional skills. RS is based on reflective learning, a teaching concept developed for adult education. Conclusion: RS is a methodology aligned with the NCG teaching concepts and has the potential to assist in the implementation of the new Brazilian medical school curriculum. It is an immersive and interactive methodology, which provides reflective learning in a safe environment for students and patients.
Keywords: Curriculum, high-fidelity simulator, medical education, realistic simulation.
450 An Adaptive Memetic Algorithm With Dynamic Population Management for Designing HIV Multidrug Therapies
Authors: Hassan Zarei, Ali Vahidian Kamyad, Sohrab Effati
Abstract:
In this paper, a mathematical model of human immunodeficiency virus (HIV) is utilized and an optimization problem is proposed, with the final goal of implementing an optimal 900-day structured treatment interruption (STI) protocol. Two types of drugs commonly used in highly active antiretroviral therapy (HAART), reverse transcriptase inhibitors (RTI) and protease inhibitors (PI), are considered. In order to solve the proposed optimization problem, an adaptive memetic algorithm with population management (AMAPM) is proposed. The AMAPM uses a distance measure to control the diversity of the population in genotype space, thus preventing stagnation and premature convergence. Moreover, the AMAPM uses a diversity parameter in phenotype space to dynamically set the population size and the number of crossovers during the search process. Three crossover operators diversify the population simultaneously, and the progress of each crossover operator is used to set the number of crossovers of that type per generation. In order to escape local optima and to introduce new search directions toward the global optimum, two local searchers assist the evolutionary process. In contrast to traditional memetic algorithms, the activation of these local searchers is not random and depends on the diversity parameters in both genotype space and phenotype space. The capability of the AMAPM in finding optimal solutions is demonstrated by comparison with three popular metaheuristics.
Keywords: HIV therapy design, memetic algorithms, adaptive algorithms, nonlinear integer programming.
449 A Propagator Method like Algorithm for Estimation of Multiple Real-Valued Sinusoidal Signal Frequencies
Authors: Sambit Prasad Kar, P. Palanisamy
Abstract:
In this paper, a novel method for estimating the frequencies of multiple one-dimensional real-valued sinusoidal signals in the presence of additive Gaussian noise is presented. A computationally simple frequency estimation method with efficient statistical performance is attractive in many array signal processing applications. The prime focus of this paper is to combine a subspace-based technique with a simple peak search approach. This paper presents a variant of the Propagator Method (PM) in which a collaborative approach of SUMWE and the Propagator Method is applied in order to estimate multiple real-valued sine wave frequencies. A new data model is proposed in which the dimension of the signal subspace is equal to the number of frequencies present in the observation, whereas the signal subspace dimension is twice the number of frequencies in the conventional MUSIC method for estimating the frequencies of real-valued sinusoidal signals. The statistical analysis of the proposed method is studied, and an explicit expression for the asymptotic (large-sample) mean-squared error (MSE), or variance of the estimation error, is derived. The performance of the method is demonstrated, and the theoretical analysis is substantiated through numerical examples. The proposed method achieves high estimation accuracy and frequency resolution at lower SNR, which is verified by simulation in comparison with the conventional MUSIC, ESPRIT, and Propagator methods.
Keywords: Frequency estimation, peak search, subspace-based method without eigen decomposition, quadratic convex function.
448 Hybrid Adaptive Modeling to Enhance Robustness of Real-Time Optimization
Authors: Hussain Syed Asad, Richard Kwok Kit Yuen, Gongsheng Huang
Abstract:
Real-time optimization has been considered an effective approach for improving the energy-efficient operation of heating, ventilation, and air-conditioning (HVAC) systems. In model-based real-time optimization, model mismatches cannot be avoided. When model mismatches are significant, the performance of the real-time optimization will be impaired and hence the expected energy saving will be reduced. In this paper, the model mismatches of a chiller plant under real-time optimization are considered. In the real-time optimization of a chiller plant, a simplified semi-physical or grey-box model of the chiller is typically used, which has to be identified from the available operation data. To overcome the model mismatches associated with the chiller model, a hybrid Genetic Algorithms (HGAs) method is used for online real-time training of the chiller model. The HGAs method combines the Genetic Algorithms (GAs) method (for global search) with a traditional optimization method (which is faster and more efficient for local search) to avoid the conventional hit-and-trial process of GAs. The identification of the model parameters is formulated as an optimization problem, and the objective function is the least-square error between the output of the model and the actual output of the chiller plant. A case study is used to illustrate the implementation of the proposed method. It is shown that the proposed approach is able to provide reliability in decision making, enhance the robustness of the real-time optimization strategy, and improve energy performance.
Keywords: Energy performance, hybrid adaptive modeling, hybrid genetic algorithms, real-time optimization, heating, ventilation, and air-conditioning.
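A hedged sketch of the hybrid idea described above: a small genetic algorithm explores the parameter space globally, and the best individual of each generation is polished with a cheap local coordinate search. The quadratic "chiller model", the data, and all settings are invented placeholders, not the authors' grey-box model.

```python
import random

# Placeholder "grey-box model": predicted power as a quadratic in the load ratio.
def model(params, load):
    a, b, c = params
    return a + b * load + c * load ** 2

def least_square_error(params, data):
    return sum((model(params, x) - y) ** 2 for x, y in data)

def local_polish(params, data, step=0.05, iters=50):
    """Cheap coordinate-descent refinement standing in for a traditional local method."""
    best = list(params)
    for _ in range(iters):
        for i in range(len(best)):
            for delta in (step, -step):
                cand = best[:]
                cand[i] += delta
                if least_square_error(cand, data) < least_square_error(best, data):
                    best = cand
    return best

def hybrid_ga(data, pop_size=30, gens=40, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-2, 2) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda p: least_square_error(p, data))
        pop[0] = local_polish(pop[0], data)           # hybrid step: refine the elite
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = rng.sample(parents, 2)
            children.append([(a + b) / 2 + rng.gauss(0, 0.1) for a, b in zip(p1, p2)])
        pop = parents + children
    return min(pop, key=lambda p: least_square_error(p, data))

# Synthetic operating data generated from known coefficients (0.5, 1.2, 0.8) plus noise.
rng = random.Random(0)
data = [(x / 10, 0.5 + 1.2 * (x / 10) + 0.8 * (x / 10) ** 2 + rng.gauss(0, 0.01))
        for x in range(1, 11)]
print(hybrid_ga(data))   # typically lands near [0.5, 1.2, 0.8]
```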
447 Spatial Query Localization Method in Limited Reference Point Environment
Authors: Victor Krebss
Abstract:
The task of object localization is one of the major challenges in creating intelligent transportation systems. Unfortunately, in densely built-up urban areas, localization based on GPS alone produces large errors or simply becomes impossible. New opportunities for localization arise from the rapidly emerging concept of the wireless ad-hoc network. Such a network allows the potential distances between objects to be estimated from measured received signal levels, and a graph of distances to be constructed in which the nodes are the objects to be localized and the edges are estimates of the distances between pairs of nodes. Given the known coordinates of individual nodes (anchors), it is possible to determine the location of all (or part) of the remaining nodes of the graph. Moreover, a road map available in digital format can provide localization routines with valuable additional information to narrow the node location search. However, despite an abundance of well-known algorithms for solving the localization problem and significant research efforts, there are still many issues that are currently addressed only partially. In this paper, we propose a localization approach based on mapping the distance graph onto digital road map data. In effect, the problem is reduced to embedding the distance graph into the graph representing the area's geolocation data. This makes it possible to localize objects, in some cases even if only one reference point is available. We propose a simple embedding algorithm and a sample implementation as spatial queries over sensor network data stored in a spatial database, which allows spatial indexing, optimized spatial search routines, and geometry functions to be employed effectively.
Keywords: Intelligent Transportation System, Sensor Network, Localization, Spatial Query, GIS, Graph Embedding.
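To make the distance-graph idea concrete, the sketch below estimates the position of a single node from noisy distance estimates to a few known reference points (anchors) by gradient descent on the squared range residuals; the full method in the abstract additionally embeds the whole distance graph into road-map data, which is not reproduced here.

```python
import math

def localize(anchors, measured, start=(0.0, 0.0), lr=0.1, iters=500):
    """Estimate (x, y) of one node from noisy distances to known anchors.

    anchors: list of (x, y); measured: list of measured distances to each anchor.
    Minimises sum_i (||p - anchor_i|| - d_i)^2 by gradient descent.
    """
    x, y = start
    for _ in range(iters):
        gx = gy = 0.0
        for (ax, ay), d in zip(anchors, measured):
            dist = math.hypot(x - ax, y - ay) or 1e-9
            err = dist - d
            gx += 2 * err * (x - ax) / dist
            gy += 2 * err * (y - ay) / dist
        x, y = x - lr * gx, y - lr * gy
    return x, y

anchors = [(0, 0), (10, 0), (0, 10)]
true_pos = (4.0, 6.0)
measured = [math.hypot(true_pos[0] - ax, true_pos[1] - ay) + noise
            for (ax, ay), noise in zip(anchors, (0.1, -0.05, 0.08))]
print(localize(anchors, measured))   # close to (4, 6)
```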
446 Preliminary Roadway Alignment Design: A Spatial-Data Optimization Approach
Authors: Y. Abdelrazig, R. Moses
Abstract:
Roadway planning and design is a very complex process involving five key phases before a project is completed: planning, project development, final design, right-of-way, and construction. The planning phase for a new roadway transportation project is a very critical phase, as it greatly affects all later phases of the project. A location study is usually performed during the preliminary planning phase of a new roadway project. The objective of the location study is to develop alignment alternatives that are cost-efficient considering land acquisition and construction costs. This paper describes a methodology to develop optimal preliminary roadway alignments utilizing spatial data. Four optimization criteria are taken into consideration: roadway length, land cost, land slope, and environmental impacts. The basic concept of the methodology is to convert the proposed project area into a grid, which represents the search space for an optimal alignment. The aforementioned optimization criteria are represented in each of the grid's cells. A spatial-data optimization technique is utilized to find the optimal alignment in the search space based on the four optimization criteria. Two case studies for new roadway projects in Duval County in the State of Florida are presented to illustrate the methodology. The optimized alignments are compared to the alignments proposed by the Florida Department of Transportation (FDOT). The comparison is based on the right-of-way costs of the alignments. For both case studies, the right-of-way costs for the developed optimal alignments were found to be significantly lower than those of the FDOT alignments.
Keywords: Optimization, planning, roadway alignment, FDOT.
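A simplified sketch of the grid-search idea: each cell carries a composite cost standing in for a weighted combination of land cost, slope, and environmental impact, and a shortest-path search over the grid returns a low-cost preliminary alignment between two endpoints. The grid values and weights are invented for illustration and are not FDOT data.

```python
import heapq

def optimal_alignment(cost_grid, start, goal):
    """Dijkstra search over a grid; cell values are composite per-cell costs."""
    rows, cols = len(cost_grid), len(cost_grid[0])
    dist = {start: cost_grid[start[0]][start[1]]}
    prev, heap = {}, [(dist[start], start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost_grid[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    # Reconstruct the alignment by walking predecessors back from the goal.
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    return [start] + path[::-1], dist[goal]

# Composite cost per cell, e.g. w1*land_cost + w2*slope + w3*environmental_impact.
grid = [[1, 1, 5, 5],
        [1, 9, 5, 1],
        [1, 1, 1, 1],
        [5, 5, 1, 1]]
print(optimal_alignment(grid, (0, 0), (3, 3)))
```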
445 Automated Fact-Checking By Incorporating Contextual Knowledge and Multi-Faceted Search
Authors: Wenbo Wang, Yi-fang Brook Wu
Abstract:
The spread of misinformation and disinformation has become a major concern, particularly with the rise of social media as a primary source of information for many people. As a means to address this phenomenon, automated fact-checking has emerged as a safeguard against the spread of misinformation and disinformation. Existing fact-checking approaches aim to determine whether a news claim is true or false, and they have achieved decent veracity prediction accuracy. However, the state-of-the-art methods rely on manually verified external information to assist the checking model in making judgments, which requires significant human resources. This study presents a framework, SAC, which focuses on 1) augmenting the representation of a claim by incorporating additional context using general-purpose, comprehensive, and authoritative data; 2) developing a search function to automatically select relevant, new, and credible references; 3) focusing on the parts of the representations of a claim and its reference that are most relevant to the fact-checking task. The experimental results demonstrate that: 1) augmenting the representations of claims and references through the use of a knowledge base, combined with the multi-head attention technique, contributes to improved fact-checking performance; 2) SAC with automatically selected references outperforms existing fact-checking approaches that use manually selected references. Future directions of this study include 1) exploring the knowledge graph in Wikidata to dynamically augment the representations of claims and references without introducing too much noise, and 2) exploring semantic relations in claims and references to further enhance fact-checking.
Keywords: Fact checking, claim verification, Deep Learning, Natural Language Processing.
444 Mobile App versus Website: A Comparative Eye-Tracking Case Study of Topshop
Authors: Zofija Tupikovskaja-Omovie, David Tyler, Sam Dhanapala, Steve Hayes
Abstract:
The UK is leading in online retail and mobile adoption. However, there is a dearth of information relating to mobile apparel retail, and developing an understanding of consumer browsing and purchase behaviour in the m-retail channel would provide apparel marketers, mobile website, and app developers with the necessary understanding of consumers’ needs. Despite the rapid growth of mobile retail businesses, no published study has examined shopping behaviour on fashion mobile apps and websites. A mixed-method approach helped to understand why fashion consumers prefer websites on smartphones when diverse mobile apps are also available. The following research methods were employed: survey, eye-tracking experiments, observation, and interview with retrospective think-aloud. The mobile gaze-tracking device by SensoMotoric Instruments was used to understand frustrations in navigation and other issues facing consumers in the mobile channel. This method helped to validate and complement other traditional user-testing approaches in order to optimize the user experience and enhance the development of the mobile retail channel. The study involved eight participants: females aged 18 to 35 years old who are existing mobile shoppers. The participants used the Topshop mobile app and website on a smartphone to complete a task according to a specified scenario leading to a purchase. The comparative study was based on: duration and time spent at different stages of the shopping journey, number of steps involved and product pages visited, search approaches used, layout and visual clues, as well as consumer perceptions and expectations. The results of the data analysis show significant differences in consumer behaviour when using a mobile app or a website on a smartphone. Moreover, two types of problems were identified, namely technical issues and human errors. Having a mobile app does not guarantee success in satisfying mobile fashion consumers. The differences in layout and visual clues seem to influence the overall shopping experience on a smartphone. The layout of search results on the website was different from that on the mobile app. Therefore, participants, in most cases, behaved differently on the different platforms. The number of product pages visited on the mobile app was triple the number visited on the website, due to the limited visibility of products in the search results. Although the data on traffic trends held by retailers to date (including retail sector breakdowns for visits and views, and data on device splits and duration) might seem a valuable source of information, they cannot explain why consumers visit many product pages, stay longer on the website or mobile app, or abandon the basket. A comprehensive list of pros and cons was developed by highlighting issues for the website and the mobile app, and recommendations are provided. The findings suggest that fashion retailers need to be aware of consumers’ actual behaviour on the mobile channel and their expectations in order to offer a seamless shopping experience, added to which is the challenge of retaining existing customers and acquiring new ones. There seem to be differences in the way fashion consumers search and shop on mobile, which need to be explored in further studies.
Keywords: Consumer behaviour, eye-tracking technology, fashion retail, mobile app, m-retail, smart phones, Topshop, user experience, website.
443 GPS Navigator for Blind Walking in a Campus
Authors: Rangsipan Marukatat, Pongmanat Manaspaibool, Benjawan Khaiprapay, Pornpimon Plienjai
Abstract:
We developed a GPS-based navigation device for the blind, with audio guidance in the Thai language. The device is composed of simple and inexpensive hardware components, and its user interface is quite simple. It determines optimal routes to various landmarks on our university campus by using a heuristic search for the next waypoints. We tested the device and noted its limitations and possible extensions.
Keywords: Blind, global positioning system (GPS), navigation.
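As an illustration of the heuristic search mentioned above, a small A* routine over a waypoint graph with straight-line distance as the heuristic; the campus landmarks, coordinates, and edge costs are made up for the example.

```python
import heapq, math

def a_star(graph, coords, start, goal):
    """A* over a waypoint graph; straight-line distance is the heuristic."""
    h = lambda n: math.dist(coords[n], coords[goal])
    open_set = [(h(start), 0.0, start, [start])]
    best_g = {start: 0.0}
    while open_set:
        _, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path, g
        for nbr, cost in graph[node]:
            ng = g + cost
            if ng < best_g.get(nbr, float("inf")):
                best_g[nbr] = ng
                heapq.heappush(open_set, (ng + h(nbr), ng, nbr, path + [nbr]))
    return None, float("inf")

# Hypothetical campus landmarks with (x, y) positions in metres.
coords = {"gate": (0, 0), "library": (80, 60), "canteen": (120, 20), "lab": (200, 80)}
graph = {"gate":    [("library", 100), ("canteen", 125)],
         "library": [("gate", 100), ("lab", 122)],
         "canteen": [("gate", 125), ("lab", 100)],
         "lab":     [("library", 122), ("canteen", 100)]}
print(a_star(graph, coords, "gate", "lab"))   # (['gate', 'library', 'lab'], 222)
```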
442 Multi-Scale Gabor Feature Based Eye Localization
Authors: Sanghoon Kim, Sun-Tae Chung, Souhwan Jung, Dusik Oh, Jaemin Kim, Seongwon Cho
Abstract:
Eye localization is necessary for face recognition and related application areas. Most eye localization algorithms reported so far still need to be improved in terms of precision and computational time for successful application. In this paper, we propose an eye localization method based on multi-scale Gabor feature vectors, which is more robust with respect to initial points. Eye localization based on Gabor feature vectors first constructs an Eye Model Bunch for each eye (left or right), which consists of n Gabor jets and the average eye coordinates obtained from n model face images, and then tries to localize the eyes in an incoming face image by utilizing the fact that the true eye coordinates are most likely to be very close to the position where a Gabor jet has the best similarity with a Gabor jet in the Eye Model Bunch. Similar ideas have already been proposed, for example in EBGM (Elastic Bunch Graph Matching). However, the method used in EBGM is known not to be robust with respect to initial values and may need an extensive search range to achieve the required performance, and an extensive search range causes a much greater computational burden. In this paper, we propose a multi-scale approach with only a slightly increased computational burden, in which one first localizes the eyes based on Gabor feature vectors in a coarse face image obtained by downsampling the original face image, and then localizes the eyes based on Gabor feature vectors in the original-resolution face image, using the eye coordinates found in the coarse-scale image as initial points. Several experiments and comparisons with other eye localization methods reported in other papers show the efficiency of our proposed method.
Keywords: Eye Localization, Gabor features, Multi-scale, Gabor wavelets.
441 Application of HSA and GA in Optimal Placement of FACTS Devices Considering Voltage Stability and Losses
Authors: A. Parizad, A. Khazali, M. Kalantar
Abstract:
Voltage collapse is an instability of heavily loaded electric power systems that leads to declining voltages and blackouts. Power systems are predicted to become more heavily loaded in the coming decade, as the demand for electric power rises while economic and environmental concerns limit the construction of new transmission and generation capacity. Heavily loaded power systems are closer to their stability limits, and voltage collapse blackouts will occur if suitable monitoring and control measures are not taken. FACTS devices can be used to control transmission lines. In this paper, the Harmony Search Algorithm (HSA) and the Genetic Algorithm (GA) are applied to determine the optimal locations of FACTS devices in a power system in order to improve power system stability. Three types of FACTS devices (TCPAT, UPFS, and SVC) are introduced. Bus under-voltage is addressed by controlling the reactive power of the shunt compensator, and a combined series-shunt compensator is also used to control transmission power flow and bus voltage simultaneously. Different scenarios are considered. First, TCPAT, UPFS, and SVC are placed individually in transmission lines and the indices are calculated. Then two of the above controller types attempt to improve the parameters randomly. The last scenario attempts to improve the voltage stability index and the losses by implementing the three controller types simultaneously. These scenarios are executed on a typical 34-bus test system and yield improvements in the voltage profile and reductions in power losses; they may also permit an increase in power transfer capacity, maximum loading, and voltage stability margin.
Keywords: FACTS Devices, Voltage Stability Index, optimal location, Heuristic methods, Harmony search, Genetic Algorithm.
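A compact sketch of the harmony search loop for this kind of placement problem: each "harmony" is a candidate assignment of devices to lines, new harmonies are improvised from memory with rates HMCR and PAR, and the worst memory entry is replaced whenever a better candidate appears. The fitness function is a toy placeholder, not a power-flow or voltage-stability calculation.

```python
import random

def harmony_search(fitness, n_devices, n_lines, hms=10, hmcr=0.9, par=0.3,
                   iters=200, seed=0):
    """Minimise `fitness` over placements: a tuple of line indices, one per device."""
    rng = random.Random(seed)
    rand_harmony = lambda: tuple(rng.randrange(n_lines) for _ in range(n_devices))
    memory = [rand_harmony() for _ in range(hms)]
    for _ in range(iters):
        new = []
        for d in range(n_devices):
            if rng.random() < hmcr:                      # take a value from memory
                value = rng.choice(memory)[d]
                if rng.random() < par:                   # pitch adjustment: shift one line
                    value = (value + rng.choice((-1, 1))) % n_lines
            else:                                        # random consideration
                value = rng.randrange(n_lines)
            new.append(value)
        new = tuple(new)
        worst = max(memory, key=fitness)
        if fitness(new) < fitness(worst):
            memory[memory.index(worst)] = new
    return min(memory, key=fitness)

# Toy objective: pretend lines 3, 7 and 12 are the weakest and should host devices.
weak_lines = (3, 7, 12)
toy_fitness = lambda placement: sum(min(abs(p - w) for w in weak_lines) for p in placement)

print(harmony_search(toy_fitness, n_devices=3, n_lines=20))
```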
440 Universal Metadata Definition
Authors: Mohib ur Rehman, Mohammad Haseeb Anwer, Nadeem Iftikhar
Abstract:
The need for standards has always been a priority in all disciplines. Today, standards such as XML and USB are trying to create a universal interface for their respective areas. The information regarding every family of entities in the discipline addressed must have a lot in common, known as metadata. A lot of work has been done in specific domains, such as IEEE LOM and MPEG-7, but these do not address the universality of creating metadata for all entities, where an entity (object) is not restricted to software terms. This paper tries to address the problem of a universal metadata definition, which may lead to an increase in the precision of search.
Keywords: Metadata, Standard, Universal, XML.
439 Representation of Power System for Electromagnetic Transient Calculation
Authors: P. Sowa
Abstract:
A new idea for analyzing power system failures with the use of artificial neural networks is proposed. An analysis of the possibility of simulating the phenomena accompanying system faults and restitution is described. It is shown that a universal model for simulating the phenomena over the whole analyzed range does not exist. The main classical methods for searching for the optimal structure and for parameter identification are described briefly. An example with calculation results is shown.
Keywords: Dynamic equivalents, Network reduction, Neural networks, Power system analysis.
438 Interpolation of Geofield Parameters
Authors: A. Pashayev, C. Ardil, R. Sadiqov
Abstract:
Various methods for the restoration of geofield parameters (algebraic polynomials; filters; rational fractions; interpolation splines; geostatistical methods such as kriging; nearest-point search methods such as inverse distance, minimum curvature, and local polynomial interpolation; and neural networks) have been analyzed, and some possible mistakes arising during geofield surface modeling are presented.
Keywords: interpolation methods, geofield parameters, neural networks.
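As one concrete example of the nearest-point methods listed above, a minimal inverse distance weighting interpolator for scattered geofield samples; the sample points and the power parameter are arbitrary.

```python
def idw(samples, x, y, power=2.0):
    """Inverse distance weighting: samples is a list of (x, y, value) tuples."""
    num = den = 0.0
    for sx, sy, value in samples:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:
            return value                      # exactly on a sample point
        w = 1.0 / d2 ** (power / 2.0)         # weight = 1 / distance^power
        num += w * value
        den += w
    return num / den

samples = [(0, 0, 10.0), (10, 0, 20.0), (0, 10, 14.0), (10, 10, 25.0)]
print(idw(samples, 5, 5))    # weighted average of the four surrounding samples
print(idw(samples, 1, 1))    # dominated by the nearest sample (value 10.0)
```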
437 The Use of Software and Internet Search Engines to Develop the Encoding and Decoding Skills of a Dyslexic Learner: A Case Study
Authors: Rabih Joseph Nabhan
Abstract:
This case study explores the impact of two major computer software programs, Learn to Speak English and Learn English Spelling and Pronunciation, and of some Internet search engines, such as Google, on mending the decoding and spelling deficiency of Simon X, a dyslexic student. The improvement in decoding and spelling may result in better reading comprehension and composition writing. Some computer programs and Internet materials can help regain the missing awareness and consequently restore his self-confidence and self-esteem. In addition, this study provides a systematic plan comprising a set of activities (four computer programs and Internet materials) which address the problem from the lowest to the highest levels of phonemic and phonological awareness. Four methods of data collection (accounts, observations, published tests, and interviews) provide the triangulation needed to collect data validly and reliably before, during, and after the plan. The data collected are analyzed quantitatively and qualitatively; sometimes the analysis is either quantitative or qualitative, and at other times a combination of both. Tables and figures are utilized to provide a clear and uncomplicated illustration of some of the data. The improvement in decoding, spelling, reading comprehension, and composition writing skills is demonstrated through authentic materials produced by the student under study. Such materials include a comparison between two sample passages written by the learner before and after the plan, a genuine computer chat conversation, and the scores of the academic year that followed the execution of the plan. Based on these results, the researcher recommends further studies on other Lebanese dyslexic learners who use the computer to mend their language problems, in order to design a more reliable software program that can address this disability more efficiently and successfully.
Keywords: Analysis, awareness, dyslexic, software.
436 Experimenting the Influence of Input Modality on Involvement Load Hypothesis
Authors: Mohammad Hassanzadeh
Abstract:
As far as incidental vocabulary learning is concerned, the basic contention of the Involvement Load Hypothesis (ILH) is that retention of unfamiliar words is, generally, conditional upon the degree of involvement in processing them. This study examined input modality and incidental vocabulary uptake in a task-induced setting whereby three variously loaded task types (marginal glosses, fill-in task, and sentence writing) were alternately assigned to one group of students at Allameh Tabataba’i University (n=21) during six classroom sessions. While one round of exposure comprised the audiovisual medium (TV talk shows), the second round consisted of textual materials with approximately similar subject matter (reading texts). In both conditions, however, the tasks were equivalent to one another. Taken together, the study pursued the dual objectives of, firstly, establishing a litmus test for the ILH and its proposed values of ‘need’, ‘search’, and ‘evaluation’. Secondly, it sought to bring to light the question of the superiority of exposure to audiovisual input versus written input as far as the incorporation of tasks is concerned. At the end of each treatment session, a vocabulary active-recall test was administered to measure incidental gains. Running a one-way analysis of variance revealed that the audiovisual intervention yielded higher gains than the written version, even when differing tasks were included. Meanwhile, task three (sentence writing) turned out to be the most efficient in tapping learners' active recall of the target vocabulary items. In addition to shedding light on the superiority of audiovisual input over written input when circumstances are held relatively constant, this study, for the most part, supported the underlying tenets of the ILH.
Keywords: Evaluation, incidental vocabulary learning, input mode, involvement load hypothesis, need, search.
435 Storing OWL Ontologies in SQL Relational Databases
Authors: Irina Astrova, Nahum Korda, Ahto Kalja
Abstract:
Relational databases are often used as a basis for the persistent storage of ontologies to facilitate rapid operations such as search and retrieval, and to utilize the benefits of relational database management systems such as transaction management, security, and integrity control. On the other hand, more and more OWL files containing ontologies are appearing. Therefore, this paper proposes to extract ontologies from OWL files and then store them in relational databases. A prerequisite for this storage is the transformation of ontologies into relational databases, which is the purpose of this paper.
Keywords: Ontologies, relational databases, SQL, and OWL.
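A minimal sketch of one common storage layout for the extracted ontology, a single triple table in a relational database; real OWL-to-relational mappings are richer (for example, separate tables for classes, properties, and hierarchies), and the axioms below are illustrative only.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE triple (
                    subject   TEXT NOT NULL,
                    predicate TEXT NOT NULL,
                    object    TEXT NOT NULL)""")
conn.execute("CREATE INDEX idx_sp ON triple (subject, predicate)")

# A few axioms extracted from a (hypothetical) OWL file, stored as triples.
axioms = [
    ("ex:Student",  "rdfs:subClassOf", "ex:Person"),
    ("ex:Person",   "rdf:type",        "owl:Class"),
    ("ex:enrolled", "rdfs:domain",     "ex:Student"),
]
conn.executemany("INSERT INTO triple VALUES (?, ?, ?)", axioms)

# Retrieval example: all superclasses recorded for ex:Student.
rows = conn.execute("SELECT object FROM triple WHERE subject = ? AND predicate = ?",
                    ("ex:Student", "rdfs:subClassOf")).fetchall()
print([r[0] for r in rows])   # ['ex:Person']
```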
434 A Descent-projection Method for Solving Monotone Structured Variational Inequalities
Authors: Min Sun, Zhenyu Liu
Abstract:
In this paper, a new descent-projection method with a new search direction for monotone structured variational inequalities is proposed. The method is simple, requiring only projections and some function evaluations, so its computational load is very small. Under mild conditions on the problem's data, the method is proved to converge globally. Some preliminary computational results are also reported to illustrate the efficiency of the method.
Keywords: variational inequalities, monotone function, global convergence.
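For orientation only, the generic shape of a projection-type iteration for a variational inequality VI(Ω, F), written in LaTeX; the paper's specific search direction and step-size rule are not reproduced here.

```latex
% Find x^* \in \Omega with  \langle F(x^*), x - x^* \rangle \ge 0  for all x \in \Omega.
% A generic descent-projection step: take a trial projection, then move along a
% search direction d and project back onto the feasible set \Omega.
\[
  \tilde{x}^{k} = P_{\Omega}\!\left[x^{k} - \beta_k F(x^{k})\right], \qquad
  x^{k+1} = P_{\Omega}\!\left[x^{k} - \alpha_k \, d\!\left(x^{k}, \tilde{x}^{k}\right)\right],
\]
% where P_\Omega is the Euclidean projection onto \Omega, \beta_k, \alpha_k > 0 are
% step sizes, and d(x^k, \tilde{x}^k) denotes the method's search direction.
```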
433 Study on the Self-Location Estimate by the Evolutional Triangle Similarity Matching Using Artificial Bee Colony Algorithm
Authors: Yuji Kageyama, Shin Nagata, Tatsuya Takino, Izuru Nomura, Hiroyuki Kamata
Abstract:
In a previous study, a technique to estimate self-location by using a lunar image was proposed. In this paper, we consider improving the conventional method with FPGA implementation in mind. Specifically, we introduce the Artificial Bee Colony algorithm to reduce the search time. In addition, we use fixed-point arithmetic to enable high-speed operation on the FPGA.
Keywords: SLIM, Artificial Bee Colony Algorithm, Location Estimate.
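A brief sketch of the Artificial Bee Colony loop in its usual continuous form, minimising a toy objective; the paper applies ABC to triangle similarity matching on lunar images under fixed-point arithmetic, which is not reproduced here.

```python
import random

def abc_minimize(f, bounds, n_food=10, limit=20, iters=100, seed=0):
    """Artificial Bee Colony: employed, onlooker and scout phases (sketch)."""
    rng = random.Random(seed)
    dim = len(bounds)
    rand_source = lambda: [rng.uniform(lo, hi) for lo, hi in bounds]
    foods = [rand_source() for _ in range(n_food)]
    trials = [0] * n_food

    def try_neighbour(i):
        j = rng.randrange(dim)
        k = rng.choice([m for m in range(n_food) if m != i])
        cand = foods[i][:]
        cand[j] += rng.uniform(-1, 1) * (foods[i][j] - foods[k][j])
        cand[j] = min(max(cand[j], bounds[j][0]), bounds[j][1])
        if f(cand) < f(foods[i]):          # greedy selection
            foods[i], trials[i] = cand, 0
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(n_food):                 # employed bees
            try_neighbour(i)
        fits = [1.0 / (1.0 + f(s)) for s in foods]   # assumes f >= 0 (toy objective)
        total = sum(fits)
        for _ in range(n_food):                 # onlooker bees, fitness-proportional
            r, acc, i = rng.uniform(0, total), 0.0, 0
            for i, fit in enumerate(fits):
                acc += fit
                if acc >= r:
                    break
            try_neighbour(i)
        for i in range(n_food):                 # scouts replace exhausted sources
            if trials[i] > limit:
                foods[i], trials[i] = rand_source(), 0
    return min(foods, key=f)

# Toy objective: sphere function, minimum at the origin.
print(abc_minimize(lambda v: sum(x * x for x in v), bounds=[(-5, 5)] * 3))
```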
432 Use of Smartphones in 6th and 7th Grade (Elementary Schools) in Istria: Pilot Study
Authors: Maja Ruzic-Baf, Vedrana Keteles, Andrea Debeljuh
Abstract:
Younger and younger children are now using smartphones, devices which have become ‘a must-have’ and without which the life of children would be almost ‘unthinkable’. Devices are becoming lighter and lighter while offering an array of options and applications, as well as the unavoidable access to the Internet, without which they would be almost unusable. Taking photographs, listening to music, searching for information on the Internet, accessing social networks, and using chat and messaging services are only some of the numerous features offered by ‘smart’ devices. They have replaced the alarm clock, home phone, camera, tablet, and other devices, and their use and possession have become part of the everyday image of young people. Apart from the positive aspects, the use of smartphones also has some downsides. For instance, free time used to be spent in nature, playing, doing sports, or in other activities that enable adequate psychophysiological growth and development in children, whereas greater use of smartphones during classes to check statuses on social networks, message friends, or play online games is one of the possible negative aspects of their application. Considering that the age of the population using smartphones is decreasing and that smartphones are no longer ‘foreign’ to children of pre-school age (smartphones are used at home, or in coffee shops or shopping centers while waiting for their parents, often to play video games inappropriate for their age), particular attention must be paid to a very sensitive group: teenagers, who almost never separate from their ‘pets’. This paper is divided into two sections, a theoretical and an empirical one. The theoretical section gives an overview of the pros and cons of smartphone usage, while the empirical section presents the results of research conducted in three elementary schools regarding the usage of smartphones and, specifically, their usage during classes, during breaks, and to search for information on the Internet and check status updates and ‘likes’ on the Facebook social network.
Keywords: Education, smartphone, social networks, teenagers.
431 Genetic Algorithms in Hot Steel Rolling for Scale Defect Prediction
Authors: Jarno Haapamäki, Juha Röning
Abstract:
Scale defects are common surface defects in hot steel rolling. The modelling of such defects is problematic, and their causes are not straightforward. In this study, we investigated genetic algorithms in the search for a mathematical solution to scale formation. For this research, a high-dimensional data set from the hot steel rolling process was gathered. The synchronisation of the variables, as well as the allocation of the measurements made on the steel strip, was solved before the modelling phase.
Keywords: Genetic algorithms, hot strip rolling, knowledge discovery, modeling.
430 An Experimental Multi-Agent Robot System for Operating in Hazardous Environments
Authors: Y. J. Huang, J. D. Yu, B. W. Hong, C. H. Tai, T. C. Kuo
Abstract:
In this paper, a multi-agent robot system is presented. The system consists of four robots. The developed robots are able to automatically enter and patrol a harmful environment, such as a building infected with a virus or a factory with leaking hazardous gas. Furthermore, every robot is able to perform obstacle avoidance and search for victims. Several operation modes are designed: remote control, obstacle avoidance, automatic searching, and so on.
Keywords: autonomous robot, field programmable gate array, obstacle avoidance, ultrasonic sensor, wireless communication.