Search results for: user intention identification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2248

1438 A New Model for Question Answering Systems

Authors: Mohammad Reza Kangavari, Samira Ghandchi, Manak Golpour

Abstract:

Most Question Answering (QA) systems are composed of three main modules: question processing, document processing and answer processing. The question processing module plays an important role in QA systems; if it does not work properly, it causes problems for the other modules. The answer processing module is likewise an emerging topic in Question Answering, where systems are often required to rank and validate candidate answers. These techniques, which aim at finding short and precise answers, are often based on semantic classification. This paper presents a new model for question answering that improves two main modules: question processing and answer processing. Two components form the basis of question processing. The first is question classification, which specifies the types of the question and the expected answer. The second is reformulation, which converts the user's question into a form the QA system can understand within a specific domain. The answer processing module consists of candidate answer filtering and candidate answer ordering components, and it also has a validation section for interacting with the user, which makes it better suited to finding the exact answer. In this paper we describe the question and answer processing modules and model, implement and evaluate the system. The system was implemented in two versions. Results show that Version No. 1 gave correct answers to 70% of questions (30 correct answers to 50 asked questions) and Version No. 2 gave correct answers to 94% of questions (47 correct answers to 50 asked questions).
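
As a hedged illustration of the question-processing ideas summarized above, the sketch below maps a question to a coarse expected answer type and performs a very rough reformulation. The rules and the fallback type are illustrative assumptions, not the authors' classifier.

```python
# Minimal sketch of question classification and reformulation (illustrative rules only).
import re

QUESTION_TYPES = {
    r"^(who|whom|whose)\b": "PERSON",
    r"^(where)\b": "LOCATION",
    r"^(when|what year|what date)\b": "DATE",
    r"^(how many|how much)\b": "QUANTITY",
    r"^(why)\b": "REASON",
}

def classify_question(question: str) -> str:
    """Return a coarse answer type for a natural-language question."""
    q = question.strip().lower()
    for pattern, answer_type in QUESTION_TYPES.items():
        if re.search(pattern, q):
            return answer_type
    return "DEFINITION"  # assumed fallback for 'what is ...' style questions

def reformulate(question: str) -> str:
    """Very rough reformulation: strip politeness markers and trailing punctuation."""
    q = re.sub(r"^(please|could you|can you)\s+", "", question.strip(), flags=re.I)
    return q.rstrip("?!. ")

if __name__ == "__main__":
    q = "When was the first QA system built?"
    print(classify_question(q), "|", reformulate(q))  # DATE | When was the first QA system built
```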

Keywords: Answer Processing, Classification, Question Answering and Query Reformulation.

1437 Identifying Quality Islamic Content in Community Question Answering Sites

Authors: Rabia Bibi, Muhammad Shahzad Faisal, Khalid Iqbal, Atif Inayat

Abstract:

The Internet is growing rapidly, and new community-based content is added by people every second. With this fast-growing community-based content, if a user needs answers to particular questions, reviews are required from experts or the community; however, it is difficult to get quality answers. The Muslim community all over the world seeks to have its questions and issues discussed in order to get answers. Online web portals of religious schools and community-based question answering (CQA) sites are two big platforms for solving users' issues. In the case of religious schools, there are experts and qualified religious scholars (muftis) who can give expert opinions. However, the quality of community-based content cannot be guaranteed, as an answer may not satisfy the user's question. Users on CQA sites may include spammers or individuals criticizing the questioner instead of providing useful answers. In this paper, we investigate strategies to automatically distinguish the right content. As an experiment, we concentrate on Yahoo! Answers and Quora, popular online QA sites where questions are asked, answered, edited, and organized by a large community of users. We present a classification of the data to categorize both relevant and irrelevant answers. Specifically, we demonstrate that the proposed framework can isolate quality answers from the rest with an accuracy close to that of humans.
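
A minimal sketch of the answer-classification step described above, assuming a TF-IDF representation and a logistic-regression classifier on a tiny hand-labelled set; the paper does not specify its features or model, so everything here is illustrative.

```python
# Illustrative sketch: label community answers as relevant (1) or irrelevant (0).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

answers = [
    "According to the referenced scholars, the ruling is as follows ...",
    "Why would you even ask that? Nonsense question.",
    "The recommended practice, with sources cited, is ...",
    "lol no idea, ask someone else",
]
labels = [1, 0, 1, 0]  # assumed labels: 1 = quality/relevant, 0 = irrelevant

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(answers, labels)

# Score a new, unseen answer.
print(model.predict(["Here is a detailed answer with references to sources ..."]))
```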

Keywords: Community-based question and answering, evaluation and prediction of quality answer, answer classification, Islamic content, answer ranking.

1436 An Identification Method of Geological Boundary Using Elastic Waves

Authors: Masamitsu Chikaraishi, Mutsuto Kawahara

Abstract:

This paper focuses on a technique for identifying the geological boundary of the ground strata in front of a tunnel excavation site using the first-order adjoint method based on optimal control theory. The geological boundary is defined as the boundary between layers of different elastic moduli. In tunnel excavation, it is important to estimate the ground conditions ahead of the cutting face beforehand: excavating into weak strata or fault fracture zones may prolong the construction work and cause human injury. A theory for determining the geological boundary of the ground numerically is investigated, employing excavation blasts and their vibration waves as the observation references. According to optimal control theory, the performance function, described by the square sum of the residuals between computed and observed velocities, is minimized; the boundary layer is determined by minimizing this performance function. The elastic analysis governed by the Navier equation is carried out, assuming the ground to be an elastic body with linear viscous damping. To identify the boundary, the gradient of the performance function with respect to the geological boundary is calculated using the adjoint equation, and the weighted gradient method is applied in the minimization algorithm. To solve the governing and adjoint equations, the Galerkin finite element method and the average acceleration method are employed for the spatial and temporal discretizations, respectively. Based on the method presented in this paper, the boundaries between three strata can be identified. For the numerical studies, the Suemune tunnel excavation site is employed. First, the blasting force is identified in order to improve the accuracy of the analysis; the geological boundary is then identified after the estimation of the blasting force. With this identification procedure, numerical results that closely correspond with the observation data were obtained.
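
As a hedged illustration of the optimal-control formulation summarized above (the notation is assumed, not taken from the paper), the performance function and the weighted-gradient update can be written as:

```latex
% J is the performance function, v the computed velocity, v* the observed
% velocity at measurement points, and E the elastic-modulus field whose spatial
% discontinuity defines the geological boundary (all symbols assumed).
\begin{align}
  J(E) &= \frac{1}{2} \int_{0}^{T} \sum_{k \in \text{obs}}
           \left\| v_k(E, t) - v_k^{*}(t) \right\|^{2} \, dt, \\
  E^{(n+1)} &= E^{(n)} - w^{(n)} \, \nabla_{E} J\!\left(E^{(n)}\right),
\end{align}
where $\nabla_{E} J$ is evaluated with the first-order adjoint equation and
$w^{(n)}$ is the step size of the weighted gradient method.
```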

Keywords: Parameter identification, finite element method, average acceleration method, first order adjoint equation method, weighted gradient method, geological boundary, Navier equation, optimal control theory.

1435 Flood Predicting in Karkheh River Basin Using Stochastic ARIMA Model

Authors: Karim Hamidi Machekposhti, Hossein Sedghi, Abdolrasoul Telvari, Hossein Babazadeh

Abstract:

Floods have huge environmental and economic impacts; therefore, flood prediction receives a lot of attention due to its importance. This study analysed the annual maximum streamflow (discharge) (AMS or AMD) of the Karkheh River in the Karkheh River Basin for flood prediction using an ARIMA model. For this purpose, we use the Box-Jenkins approach, which consists of a four-stage method: model identification, parameter estimation, diagnostic checking and forecasting (prediction). The main tools used for ARIMA modelling were the SAS and SPSS software packages. Model identification was done by visual inspection of the ACF and PACF. SAS computed the model parameters using the ML, CLS and ULS methods. The diagnostic checking tests, the AIC criterion, and the RACF and RPACF graphs were used to verify the selected model. In this study, the best ARIMA model for the Annual Maximum Discharge (AMD) time series was ARIMA(4,1,1), with an AIC value of 88.87; the RACF and RPACF showed the residuals' independence. Forecasting AMD for 10 future years demonstrated the ability of the model to predict floods of the river under study in the Karkheh River Basin. Model accuracy was checked by comparing the predicted and observed series using the coefficient of determination (R2).
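
A minimal sketch of the Box-Jenkins workflow described above, using Python's statsmodels in place of SAS/SPSS; the synthetic series is a placeholder for the Karkheh River AMD data, and only the fitting and forecasting steps are shown.

```python
# Fit an ARIMA(4,1,1) model to a placeholder annual-maximum-discharge series.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
amd = pd.Series(1500 + rng.gamma(2.0, 300.0, size=60))  # placeholder AMD values (m3/s)

# Parameter estimation for the order selected in the study, plus the AIC criterion.
fit = ARIMA(amd, order=(4, 1, 1)).fit()
print("AIC:", round(fit.aic, 2))

# Diagnostic checking of residuals (RACF/RPACF) would follow; then a 10-year forecast.
print(fit.forecast(steps=10))
```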

Keywords: Time series modelling, stochastic processes, ARIMA model, Karkheh River.

1434 Development of a Tilt-Rotor Aircraft Model Using System Identification Technique

Authors: Antonio Vitale, Nicola Genito, Giovanni Cuciniello, Ferdinando Montemari

Abstract:

The introduction of tilt-rotor aircraft into the existing civilian air transportation system will provide beneficial effects due to the tilt-rotor's capability to combine the characteristics of a helicopter and a fixed-wing aircraft into one vehicle. The availability of reliable tilt-rotor simulation models supports the development of such a vehicle. Indeed, simulation models are required to design automatic control systems that increase safety, reduce the pilot's workload and stress, and ensure the optimal aircraft configuration with respect to flight envelope limits, especially during the most critical flight phases such as conversion from helicopter to aircraft mode and vice versa. This article presents a process to build a simplified tilt-rotor simulation model, derived from the analysis of flight data. The model aims to reproduce the complex dynamics of the tilt-rotor during the in-flight conversion phase. It uses a set of scheduled linear transfer functions to relate the autopilot reference inputs to the most relevant rigid-body state variables. The model also computes information about the rotor flapping dynamics, which is useful to evaluate the aircraft control margin in terms of rotor collective and cyclic commands. The rotor flapping model is derived through a mixed theoretical-empirical approach, which includes physical analytical equations (applicable to the helicopter configuration) and parametric corrective functions; the latter are introduced to best fit the actual rotor behavior and balance the differences existing between helicopter and tilt-rotor during flight. Time-domain system identification from flight data is exploited to optimize the model structure and to estimate the model parameters. The presented model-building process was applied to simulated flight data of the ERICA Tilt-Rotor, generated using a high-fidelity simulation model implemented in the FlightLab environment. The validation of the obtained model was very satisfactory, confirming the validity of the proposed approach.
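
A hedged sketch of the core step mentioned above, time-domain identification of one low-order transfer function from input/output records; the first-order model, the synthetic step-response data and the least-squares setup are assumptions, not the authors' model structure.

```python
# Estimate gain and time constant of K/(tau*s + 1) from simulated flight records.
import numpy as np
from scipy import signal
from scipy.optimize import least_squares

t = np.linspace(0, 10, 501)
u = (t > 1).astype(float)                       # step input (e.g. an autopilot reference)
true_sys = signal.TransferFunction([2.0], [1.5, 1.0])
_, y_true, _ = signal.lsim(true_sys, U=u, T=t)
y_meas = y_true + np.random.default_rng(1).normal(0, 0.02, t.size)  # "measured" output

def residuals(p):
    gain, tau = p
    _, y_model, _ = signal.lsim(signal.TransferFunction([gain], [tau, 1.0]), U=u, T=t)
    return y_model - y_meas

est = least_squares(residuals, x0=[1.0, 1.0], bounds=([0.1, 0.1], [10.0, 10.0]))
print("estimated gain, time constant:", est.x)  # should approach [2.0, 1.5]
```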

Keywords: Flapping Dynamics, Flight Dynamics, System Identification, Tilt-Rotor Modeling and Simulation.

1433 Tolerance of Heavy Metals by Gram Positive Soil Bacteria

Authors: I. V. N. Rathnayake, Mallavarapu Megharaj, Nanthi Bolan, Ravi Naidu

Abstract:

With the intention of screening for heavy metal tolerance, a number of bacteria were isolated and characterized from a pristine soil. Two Gram-positive isolates were identified as Paenibacillus sp. and Bacillus thuringiensis. Tolerance of Cd2+, Cu2+ and Zn2+ by these bacteria was studied, and it was found that both bacteria were highly sensitive to Cu2+ compared to the other two metals. Both bacteria showed the same pattern of metal tolerance, in the order Zn2+ > Cd2+ > Cu2+. When the metal tolerance of the two bacteria was compared, Paenibacillus sp. showed the highest sensitivity to Cu2+, whereas B. thuringiensis showed the highest sensitivity to Cd2+ and Zn2+. These findings reveal the potential of Paenibacillus sp. for developing a biosensor to detect Cu2+ in environmental samples.

Keywords: Heavy metals, bacteria, soil, tolerance.

1432 On-Line Geometrical Identification of Reconfigurable Machine Tool using Virtual Machining

Authors: Alexandru Epureanu, Virgil Teodor

Abstract:

One of the main research directions in the CAD/CAM machining area is the reduction of machining time. Feedrate scheduling is one of the advanced techniques that keeps the uncut chip area constant and, consequently, keeps the main cutting force constant. There are two main ways to optimize the feedrate. The first consists of cutting force monitoring, which requires complex equipment for force measurement, after which the feedrate is set according to the cutting force variation. The second is to optimize the feedrate by keeping the material removal rate constant with respect to the cutting conditions. This paper proposes a new approach using an extended database that replaces the system model. The feedrate schedule is determined based on the identification of the reconfigurable machine tool, with the feed value set according to the uncut chip section area, the contact length between tool and blank, and the geometrical roughness. The first stage consists of monitoring the blank and the tool to determine the actual profiles. The next stage is the determination of the programmed tool path that allows the piece target profile to be obtained. The graphic representation environment models the tool and blank regions, and the tool model is then positioned relative to the blank model according to the programmed tool path. For each of these positions the geometrical roughness value, the uncut chip area and the contact length between tool and blank are calculated; each of these parameters is compared with its admissible value and, according to the result, the feed value is established (see the sketch below). This approach has the following advantages: the prediction of cutting force is possible for complex cutting processes; the real cutting profile, which deviates from the theoretical profile, is considered; the blank-tool contact length can be limited; and the programmed tool path can be corrected so that the target profile is obtained. Applying this method yields data sets that allow the feedrate to be scheduled so that the uncut chip area is constant and, as a result, the cutting force is constant, which allows the machine tool to be used more efficiently and the machining time to be reduced.
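
A simplified sketch of the feed-selection rule described above: for each programmed tool position, the simulated uncut chip area, tool-blank contact length and geometrical roughness are compared with admissible values and the feed is reduced until all limits are satisfied. The geometry functions, limits and units are placeholders for the virtual-machining model, not the authors' implementation.

```python
# Illustrative feedrate scheduling loop (all relations and limits are assumed).
A_ADM, L_ADM, RZ_ADM = 0.12, 4.0, 0.8          # admissible area, contact length, roughness

def simulate_cut(position, feed):
    """Placeholder for querying the tool/blank models at one tool-path position."""
    uncut_area = 0.02 * feed                    # grows with feed (illustrative relation)
    contact_length = 2.5 + 0.1 * feed
    roughness = 0.05 * feed ** 2
    return uncut_area, contact_length, roughness

def schedule_feed(position, feed_max=10.0, step=0.25):
    feed = feed_max
    while feed > step:
        a, l, rz = simulate_cut(position, feed)
        if a <= A_ADM and l <= L_ADM and rz <= RZ_ADM:
            return feed                         # all admissible values satisfied
        feed -= step                            # otherwise back off the feed
    return step

tool_path = range(20)                           # programmed tool positions
feeds = [schedule_feed(p) for p in tool_path]
print(feeds[:5])
```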

Keywords: Reconfigurable machine tool, system identification, uncut chip area, cutting conditions scheduling.

1431 Object Negotiation Mechanism for an Intelligent Environment Using Event Agents

Authors: Chiung-Hui Chen

Abstract:

With advancements in science and technology, the concept of the Internet of Things (IoT) has gradually developed. The development of the intelligent environment adds intelligence to objects in the living space by using the IoT. In the smart environment, when multiple users share the living space and different service requirements from different users arise, the context-aware system faces conflicting situations when deciding which services to provide. Therefore, the purpose of establishing a communication and negotiation mechanism among objects in the intelligent environment is to resolve those service conflicts among users. This study proposes a decision-making methodology that uses “Event Agents” as its core. When the sensor system receives information, it evaluates a user's current events and conditions; analyses object, location, time, and environmental information; calculates the priority of the object; and provides the user services based on the event. Moreover, when the event is not single but overlaps with another, conflicts arise. This study adopts the “Multiple Events Correlation Matrix” in order to calculate the degree values of incidents and support values for each object. The matrix uses these values as the basis for making inferences about system service, and to further determine appropriate services when there is a conflict.

Keywords: Internet of things, intelligent object, event agents, negotiation mechanism, degree of similarity.

1430 Information Retrieval: A Comparative Study of Textual Indexing Using an Oriented Object Database (db4o) and the Inverted File

Authors: Mohammed Erritali

Abstract:

The growth in the volume of text data such as books and articles in libraries over the centuries has made it necessary to establish effective mechanisms to locate them. Early techniques such as abstracting, indexing and the use of classification categories marked the birth of a new field of research called "Information Retrieval". Information Retrieval (IR) can be defined as the task of defining models and systems whose purpose is to facilitate access to a set of documents in electronic form (a corpus) and to allow a user to find those relevant to him, that is to say, the content that matches the user's information needs. Most information retrieval models use a specific data structure to index a corpus, called the "inverted file" or "reverse index". This inverted file collects information on all terms in the corpus documents, specifying the identifiers of the documents that contain the term in question, the frequency of the term in each document of the corpus, the positions of the occurrences of the word, and so on. In this paper we use an object-oriented database (db4o) instead of the inverted file; that is to say, instead of searching for a term in the inverted file, we search for it in the db4o database. The purpose of this work is to make a comparative study to see whether object-oriented databases can compete with the inverted index in terms of access speed and resource consumption when using a large volume of data.
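
A minimal sketch of the inverted-file structure the study compares against db4o: for each term it stores the document identifiers, per-document frequency and the positions of occurrences. The two-document corpus is purely illustrative.

```python
# Build a tiny inverted file: term -> {doc_id: {"freq": n, "positions": [...]}}.
from collections import defaultdict

corpus = {
    1: "information retrieval models index a corpus of documents",
    2: "an inverted file maps each term to the documents containing it",
}

inverted_file = defaultdict(dict)
for doc_id, text in corpus.items():
    for pos, term in enumerate(text.lower().split()):
        entry = inverted_file[term].setdefault(doc_id, {"freq": 0, "positions": []})
        entry["freq"] += 1
        entry["positions"].append(pos)

# Looking up a term returns its postings; db4o would store an equivalent object graph.
print(inverted_file["documents"])
```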

Keywords: Information Retrieval, indexation, oriented object database (db4o), inverted file.

1429 Design of a 5-Joint Mechanical Arm with User-Friendly Control Program

Authors: Amon Tunwannarux, Supanunt Tunwannarux

Abstract:

This paper describes the design concepts and implementation of a 5-joint mechanical arm for a rescue robot named CEO Mission II. The multi-joint arm is a five-degree-of-freedom mechanical arm with a four-bar linkage, which can be stretched to 125 cm long. It is controlled by a teleoperator via a user-friendly control and monitoring GUI program. Using the inverse kinematics principle, we developed a method to control the servo angles of all arm joints to reach the desired tip position. By clicking the desired tip position or dragging the tip of the mechanical arm on the computer screen to the target point, the robot computes and moves its multi-joint arm to the pose seen on the GUI screen. The angles of each joint are calculated and sent to all joint servos simultaneously in order to move the mechanical arm to the desired pose at once. The operator can also use a joystick to control the movement of this mechanical arm and the locomotion of the robot. Many sensors are installed at the tip of the mechanical arm for surveillance from a high vantage point and for obtaining the vital signs of victims more easily and quickly in urban search and rescue tasks. It works very effectively and is easy to control. This mechanical arm and its software were developed as part of the CEO Mission II Rescue Robot, which won the First Runner Up award and the Best Technique award at the Thailand Rescue Robot Championship 2006. It is a low-cost, simple, but functional 5-joint mechanical arm built from scratch and controlled via wireless LAN 802.11b/g. This 5-joint mechanical arm hardware concept and its software can also be used as basic mechatronics for many real applications.
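
A hedged sketch of the inverse-kinematics idea described above, reduced to a planar two-link arm (the actual arm has five joints and a four-bar linkage); link lengths and the elbow convention are assumptions for illustration only.

```python
# Two-link planar inverse kinematics: tip position (x, y) -> joint angles for the servos.
import math

L1, L2 = 60.0, 65.0          # assumed link lengths in cm (sum close to the 125 cm reach)

def two_link_ik(x, y, elbow_up=True):
    """Return (shoulder, elbow) angles in degrees for the desired tip position."""
    d2 = x * x + y * y
    c2 = (d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2) * (1 if elbow_up else -1)
    theta1 = math.atan2(y, x) - math.atan2(L2 * math.sin(theta2),
                                           L1 + L2 * math.cos(theta2))
    return math.degrees(theta1), math.degrees(theta2)

print(two_link_ik(80.0, 40.0))   # joint angles that would be sent to the joint servos
```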

Keywords: Multi-joint, mechanical arm, inverse kinematics, rescue robot, GUI control program.

1428 Evolution of Web Development Techniques in Modern Technology

Authors: Abdul Basit Kiani, Maryam Kiani

Abstract:

The art of web development in new technologies is a dynamic journey, shaped by the constant evolution of tools and platforms. With the emergence of JavaScript frameworks and APIs, web developers are empowered to craft web applications that are not only robust but also highly interactive. The aim is to provide an overview of the developments in the field. The integration of artificial intelligence (AI) and machine learning (ML) has opened new horizons in web development. Chatbots, intelligent recommendation systems, and personalization algorithms have become integral components of modern websites. These AI-powered features enhance user engagement, provide personalized experiences, and streamline customer support processes, revolutionizing the way businesses interact with their audiences. Lastly, the emphasis on web security and privacy has been a pivotal area of progress. With the increasing incidents of cyber threats, web developers have implemented robust security measures to safeguard user data and ensure secure transactions. Innovations such as HTTPS protocol, two-factor authentication, and advanced encryption techniques have bolstered the overall security of web applications, fostering trust and confidence among users. Hence, recent progress in web development has propelled the industry forward, enabling developers to craft innovative and immersive digital experiences. From responsive design to AI integration and enhanced security, the landscape of web development continues to evolve, promising a future filled with endless possibilities.

Keywords: Web development, software testing, progressive web apps, web and mobile native application.

1427 Concept for Determining the Focus of Technology Monitoring Activities

Authors: Guenther Schuh, Christina Koenig, Nico Schoen, Markus Wellensiek

Abstract:

Identification and selection of appropriate product and manufacturing technologies are key factors for competitiveness and market success of technology-based companies. Therefore, many companies perform technology intelligence (TI) activities to ensure the identification of evolving technologies at the right time. Technology monitoring is one of the three base activities of TI, besides scanning and scouting. As the technological progress is accelerating, more and more technologies are being developed. Against the background of limited resources it is therefore necessary to focus TI activities. In this paper we propose a concept for defining appropriate search fields for technology monitoring. This limitation of search space leads to more concentrated monitoring activities. The concept will be introduced and demonstrated through an anonymized case study conducted within an industry project at the Fraunhofer Institute for Production Technology IPT. The described concept provides a customized monitoring approach, which is suitable for use in technology-oriented companies. It is shown in this paper that the definition of search fields and search tasks are suitable methods to define topics of interest and thus to align monitoring activities. Current as well as planned product, production and material technologies and existing skills, capabilities and resources form the basis for derivation of relevant search areas. To further improve the concept of technology monitoring the proposed concept should be extended during future research e.g. by the definition of relevant monitoring parameters.

Keywords: Monitoring radar, search field, technology intelligence, technology monitoring.

1426 Utilizing Analytic Hierarchy Process to Analyze Consumers' Purchase Evaluation Factors of Smartphones

Authors: Yi-Chung Hu, Yu-Lin Liao

Abstract:

Due to the fast development of technology, competition among technological products is turbulent; therefore, it is important to understand the market trend and consumers' demands and preferences. As smartphones are prevalent, the main purpose of this paper is to utilize the Analytic Hierarchy Process (AHP) to analyze consumers' purchase evaluation factors for smartphones. Through the AHP expert questionnaire, smartphones' main functions are classified into five aspects, "user interface", "mobile commerce functions", "hardware and software specifications", "entertainment functions" and "appearance and design", in order to analyze the weights. Four evaluation criteria are then evaluated under each aspect to rank the weights. The analysis of the data shows that the factors consumers consider when purchasing are, in sequence, "hardware and software specifications", "user interface", "appearance and design", "mobile commerce functions" and "entertainment functions". The "hardware and software specifications" aspect obtains a weight of 33.18%; it is the most important factor that consumers take into account. In addition, the most important evaluation criteria are, in sequence, the central processing unit, operating system, touch screen, and battery function. The results of the study can be adopted as reference data for mobile phone manufacturers when designing products and marketing strategies to satisfy the voice of the customer.
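
A sketch of the AHP weighting step described above: aspect weights are derived from a pairwise comparison matrix via its principal eigenvector, with a consistency check. The 5x5 comparison values are invented for illustration and do not reproduce the study's survey data.

```python
# Derive AHP aspect weights from a pairwise comparison matrix (illustrative values).
import numpy as np

aspects = ["hardware/software specs", "user interface", "appearance and design",
           "mobile commerce", "entertainment"]
A = np.array([
    [1,   2,   3,   4,   5],
    [1/2, 1,   2,   3,   4],
    [1/3, 1/2, 1,   2,   3],
    [1/4, 1/3, 1/2, 1,   2],
    [1/5, 1/4, 1/3, 1/2, 1],
], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                      # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

ci = (eigvals.real[k] - len(A)) / (len(A) - 1)   # consistency index
cr = ci / 1.12                                   # random index for n = 5 matrices

for name, w in sorted(zip(aspects, weights), key=lambda p: -p[1]):
    print(f"{name}: {w:.2%}")
print("consistency ratio:", round(cr, 3))
```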

Keywords: Analytic Hierarchy Process (AHP), evaluation criteria, purchase evaluation factors, smartphone.

1425 An Activity Based Trajectory Search Approach

Authors: Mohamed Mahmoud Hasan, Hoda M. O. Mokhtar

Abstract:

With the enormous increase in mobile application use and the spread of positioning and location-aware technologies that we are seeing today, new procedures and methodologies for location-based strategies are required. Location recommendation is one of the most demanded location-aware applications, especially with the wide availability of location-aware social network applications, including Facebook check-ins, Foursquare, and others. In this paper, we aim to present a new methodology for location recommendation. The proposed approach combines conventional spatial attributes with other essential factors, including shortest distance and user interests. We also present a new concept, the "activity trajectory", which represents a trajectory that fulfills the set of activities the user is interested in doing. The approach uses the associated distance value to select the trajectory(ies) with minimum cost (distance), and the spatial area to prune unneeded directions. The proposed algorithm uses the activity-trajectory concept to recommend the N most similar trajectory(ies) that match the user's required activity pattern with the least travelling distance. To improve the performance of the proposed approach, parallel processing is applied through the employment of a MapReduce-based approach. Experiments based on real data sets were set up and used to assess the proposed approach. The presented tests show how the proposed approach beats other strategies, giving better precision and run time.
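
A simplified sketch of the activity-trajectory ranking idea described above: keep only trajectories whose points cover the requested activities, then return the N with the smallest total travelling distance. The data layout and distance measure are assumptions, and the MapReduce parallelisation is omitted.

```python
# Rank trajectories that cover the requested activities by total travelling distance.
from math import dist

trajectories = {
    "t1": [((0, 0), "coffee"), ((1, 1), "gym"), ((2, 1), "cinema")],
    "t2": [((0, 0), "coffee"), ((4, 4), "cinema")],
    "t3": [((0, 0), "gym"), ((1, 0), "coffee"), ((1, 2), "cinema")],
}

def total_distance(points):
    return sum(dist(a, b) for a, b in zip(points, points[1:]))

def recommend(required_activities, n=2):
    candidates = []
    for tid, steps in trajectories.items():
        activities = {act for _, act in steps}
        if required_activities <= activities:            # activity match (pruning step)
            candidates.append((total_distance([p for p, _ in steps]), tid))
    return [tid for _, tid in sorted(candidates)[:n]]    # least travelling distance first

print(recommend({"coffee", "cinema"}))
```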

Keywords: Location-based recommendation, map-reduce, recommendation system, trajectory search.

1424 Implementation of an Improved Secure System Detection for E-passport by using EPC RFID Tags

Authors: A. Baith Mohamed, Ayman Abdel-Hamid, Kareem Youssri Mohamed

Abstract:

Current proposals for the E-passport or ID-card are similar to a regular passport, with the addition of a tiny contactless integrated circuit (computer chip) inserted in the back cover, which acts as a secure storage device for the same data visually displayed on the photo page of the passport. In addition, it includes a digital photograph that enables biometric comparison through the use of facial recognition technology at international borders. Moreover, the e-passport has a new interface incorporating additional antifraud and security features. However, its problems are reliability, security and privacy. Privacy is a serious issue, since there is no encryption between the readers and the E-passport, while security issues such as authentication, data protection and control techniques cannot be embedded in one process. In this paper, the design and prototype implementation of an improved E-passport reader are presented. The passport holder is authenticated online by using the GSM network, which is the main interface between the identification center and the e-passport reader. The communication data between the server and the e-passport reader are protected by using AES to encrypt the data while they are transferred through the GSM network. Performance measurements indicate a 19% improvement in encryption cycles versus previously reported results.
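
A hedged sketch of the data-protection step described above: AES encryption of the reader/server payload before it crosses the GSM link. The AES-GCM mode, key handling and payload used here are illustrative assumptions, since the paper does not state its AES mode or message format.

```python
# Encrypt a reader->server payload with AES-GCM using the 'cryptography' package.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)   # assumed pre-shared between reader and server
aesgcm = AESGCM(key)

payload = b"passport-id=XYZ123;holder-status=verified"   # hypothetical payload
nonce = os.urandom(12)
ciphertext = aesgcm.encrypt(nonce, payload, b"epassport-reader-01")

# ...nonce + ciphertext travel over the GSM network to the identification center...
plaintext = aesgcm.decrypt(nonce, ciphertext, b"epassport-reader-01")
assert plaintext == payload
```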

Keywords: RFID "Radio Frequency Identification", EPC "Electronic Product Code", ICAO "International Civil Aviation Organization", IFF "Identify Friend or Foe".

1423 Optimizing Usability Testing with Collaborative Method in an E-Commerce Ecosystem

Authors: Markandeya Kunchi

Abstract:

Usability testing (UT) is one of the vital steps in the user-centred design (UCD) process when designing a product. In an e-commerce ecosystem, UT becomes primary, as new products, features, and services are launched very frequently, and there are losses for the company if an unusable and inefficient product is put on the market and rejected by customers. This paper tries to answer why UT is important in the product life-cycle of an e-commerce ecosystem. Secondary user research was conducted to find out the work patterns, development methods, types of stakeholders, technology constraints, etc. of a typical e-commerce company. Qualitative user interviews were conducted with product managers and designers to find out the structure, project planning, product management method and role of the design team in a mid-level company. The paper addresses the usual apprehensions the company has about inculcating UT within the team, and stresses factors like monetary resources, lack of a usability expert, narrow timelines, and lack of understanding from higher management as some of the primary reasons. Outsourcing UT to vendors is also very prevalent among mid-level e-commerce companies, but it has its own severe repercussions, such as very little team involvement, huge cost, misinterpretation of the findings, elongated timelines, and lack of empathy towards the customer. The shortfalls of not having a UT process in place within the team, or of conducting UT through vendors, are bad user experiences for customers interacting with the product and badly designed products that are neither useful nor utilitarian. As a result, companies see dipping conversion rates in apps and websites, huge bounce rates and increased uninstall rates. Thus, there was a need for a leaner UT system that could solve all these issues for the company. This paper highlights optimizing the UT process with a collaborative method; the degree of optimization and the structure of the collaborative method are the highlights of this paper. The collaborative method of UT is one in which the centralised design team of the company takes charge of conducting and analysing the UT. The UT is usually formative: designers take the findings into account and use them in the ideation process. The success of the collaborative method of UT is due to its ability to sync with the product management method employed by the company or team. The collaborative method focuses on engaging various teams (design, marketing, product, administration, IT, etc.), each with its own defined roles and responsibilities, in conducting a smooth UT with users in-house. The paper finally highlights the positive results of the collaborative UT method after conducting more than 100 in-lab interviews with users across the different lines of business, some of which are improved interaction between stakeholders and the design team, empathy towards users, improved design iteration, better sanity checks of design solutions, optimization of time and money, and effective and efficient design solutions. The future scope of collaborative UT is to make this method leaner by reducing the number of days to complete the entire project, from planning between teams to publishing the UT report.

Keywords: Usability testing, collaborative method, e-commerce, product management method.

1422 Through Biometric Card in Romania: Person Identification by Face, Fingerprint and Voice Recognition

Authors: Hariton N. Costin, Iulian Ciocoiu, Tudor Barbu, Cristian Rotariu

Abstract:

In this paper three different approaches for person verification and identification, i.e. by means of fingerprints, face and voice recognition, are studied. Face recognition uses parts-based representation methods and a manifold learning approach. The assessment criterion is recognition accuracy. The techniques under investigation are: a) Local Non-negative Matrix Factorization (LNMF); b) Independent Components Analysis (ICA); c) NMF with sparse constraints (NMFsc); d) Locality Preserving Projections (Laplacianfaces). Fingerprint detection was approached by classical minutiae (small graphical patterns) matching through image segmentation, using a structural approach and a neural network as the decision block. As to voice/speaker recognition, mel cepstral and delta-delta mel cepstral analyses were used as the main methods, in order to construct a supervised speaker-dependent voice recognition system. The final decision (e.g. "accept/reject" for a verification task) is taken by applying a majority voting technique to the three biometrics. The preliminary results, obtained for medium-sized databases of fingerprints, faces and voice recordings, indicate the feasibility of our study and an overall recognition precision (about 92%) permitting the utilization of our system for a future complex biometric card.
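
A minimal sketch of the decision-fusion step described above: each biometric (face, fingerprint, voice) yields an accept/reject verdict and the final decision is taken by majority vote. The scores and thresholds are illustrative, not the system's actual operating points.

```python
# Majority-vote fusion of three biometric verification decisions (illustrative thresholds).
def verify(face_score, fingerprint_score, voice_score,
           thresholds=(0.80, 0.75, 0.70)):
    votes = [score >= th for score, th
             in zip((face_score, fingerprint_score, voice_score), thresholds)]
    return "accept" if sum(votes) >= 2 else "reject"

print(verify(0.91, 0.62, 0.88))   # two of three modalities accept -> accept
```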

Keywords: Biometry, image processing, pattern recognition, speech analysis.

1421 Context Aware Lightweight Energy Efficient Framework

Authors: D. Sathan, A. Meetoo, R. K. Subramaniam

Abstract:

Context awareness is a capability whereby mobile computing devices can sense their physical environment and adapt their behavior accordingly. The term context-awareness, in ubiquitous computing, was introduced by Schilit in 1994 and has become one of the most exciting concepts in early 21st-century computing, fueled by recent developments in pervasive computing (i.e. mobile and ubiquitous computing). These include computing devices worn by users, embedded devices, smart appliances, sensors surrounding users and a variety of wireless networking technologies. Context-aware applications use context information to adapt interfaces, tailor the set of application-relevant data, increase the precision of information retrieval, discover services, make the user interaction implicit, or build smart environments. For example: A context aware mobile phone will know that the user is currently in a meeting room, and reject any unimportant calls. One of the major challenges in providing users with context-aware services lies in continuously monitoring their contexts based on numerous sensors connected to the context aware system through wireless communication. A number of context aware frameworks based on sensors have been proposed, but many of them have neglected the fact that monitoring with sensors imposes heavy workloads on ubiquitous devices with limited computing power and battery. In this paper, we present CALEEF, a lightweight and energy efficient context aware framework for resource limited ubiquitous devices.

Keywords: Context-Aware, Energy-Efficient, Lightweight, Ubiquitous Devices.

1420 Designing a Robust Controller for a 6 Linkage Robot

Authors: G. Khamooshian

Abstract:

One of the main issues in applying serial and parallel mechanisms is controlling them. The control of such mechanisms has always been a focus of researchers. On the other hand, modeling the behavior of the system is difficult due to its large number of parameters, which leads to complex equations that are difficult to solve and, eventually, difficult to control. In this paper, a six-linkage robot is presented that could be used in different areas, such as medical robotics. Using these robots requires robust control. The system equations are first derived, and then the system transfer function is written. A new controller has been designed for this robot which could be used in other parallel robots and could be very useful. Parallel robots are important in robotics because of their stability, so methods for controlling them are important, and a robust controller makes particular sense for parallel robots.

Keywords: 3-RRS, 6 linkage, parallel robot, control.

1419 A Cost Effective Approach to Develop Mid-size Enterprise Software Adopted the Waterfall Model

Authors: M. N. Hasnine, M. K. H. Chayon, M. M. Rahman

Abstract:

Organizational tendencies towards computer-based information processing have been observed noticeably in third-world countries. Many enterprises are taking major initiatives towards a computerized working environment because of the massive benefits of computer-based information processing. However, designing and developing information resource management software for small and mid-size enterprises under tight budgets and strict deadlines is always challenging for software engineers. Therefore, we introduce an approach to designing mid-size enterprise software in a cost-effective way by using the Waterfall model, one of the SDLC (Software Development Life Cycle) models. To fulfill the research objectives, in this study we developed mid-sized enterprise software named "BSK Management System" that assists enterprise software clients with information resource management and helps perform complex organizational tasks. The Waterfall model phases have been applied to ensure that all functions, user requirements, strategic goals, and objectives are met. In addition, a Rich Picture, Structured English, and a Data Dictionary have been implemented and investigated properly in an engineering manner. Furthermore, an assessment survey with 20 participants was conducted to investigate the usability and performance of the proposed software. The survey results indicated that our system featured simple interfaces, easy operation and maintenance, quick processing, and reliable and accurate transactions.

Keywords: End-user Application Development, Enterprise Software Design, Information Resource Management, Usability.

1418 Web Content Mining: A Solution to Consumer's Product Hunt

Authors: Syed Salman Ahmed, Zahid Halim, Rauf Baig, Shariq Bashir

Abstract:

With the rapid growth in business size, today's businesses are orienting towards electronic technologies. Amazon.com and eBay.com are some of the major stakeholders in this regard. Unfortunately, the enormous size of the hugely unstructured data on the web, even for a single commodity, has become a cause of ambiguity for consumers. Extracting valuable information from such ever-increasing data is an extremely tedious task and is fast becoming critical to the success of businesses. Web content mining can play a major role in solving these issues. It involves using efficient algorithmic techniques to search and retrieve the desired information from seemingly impossible-to-search unstructured data on the Internet. Application of web content mining can be very encouraging in the areas of customer relations modeling, billing records, logistics investigations, product cataloguing and quality management. In this paper we present a review of some very interesting, efficient yet implementable techniques from the field of web content mining and study their impact in the area specific to business user needs, focusing both on the customer and on the producer. The techniques we review include mining by developing a knowledge-base repository of the domain, iterative refinement of user queries for personalized search, using a graph-based approach for the development of a web crawler, and filtering information for personalized search using website captions. These techniques have been analyzed and compared on the basis of their execution time and the relevance of the results they produced for a particular search.

Keywords: Data mining, web mining, search engines, knowledge discovery.

1417 Information Filtering using Index Word Selection based on the Topics

Authors: Takeru YOKOI, Hidekazu YANAGIMOTO, Sigeru OMATU

Abstract:

We propose an information filtering system using index word selection from a document set based on the topics included in the set of documents. This method narrows down the particularly characteristic words in a document set; the topics are obtained by Sparse Non-negative Matrix Factorization. In information filtering, a document is often represented by a vector in which the elements correspond to the weights of the index words, and the dimension of the vector grows as the number of documents increases. It is therefore possible that words useless as index words for information filtering are included. In order to address this problem, the dimension needs to be reduced. Our proposal reduces the dimension by selecting index words based on the topics included in the document set, obtained by applying Sparse Non-negative Matrix Factorization to the document set. The filtering is carried out based on a centroid of the learning document set, which is regarded as the user's interest. The centroid is represented by a document vector whose elements consist of the weights of the selected index words. Using the English test collection MEDLINE, we confirm the effectiveness of our proposal: the proposed selection improves recommendation accuracy over previous methods when an appropriate number of index words is selected. In addition, we discuss the index words selected by our proposal and find that it was able to select index words covering some minor topics included in the document set.
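
A hedged sketch of the pipeline described above: extract topics with NMF, keep the highest-weighted index words per topic, build the user profile as the centroid of the learning documents over those words, and rank incoming documents by similarity to the centroid. The corpus and parameters are illustrative, and scikit-learn's NMF with an L1 penalty (alpha_W, available in recent scikit-learn versions) stands in for the paper's sparse NMF.

```python
# Topic-based index word selection followed by centroid-based filtering (illustrative).
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF
from sklearn.metrics.pairwise import cosine_similarity

docs = ["gene expression in cancer cells", "protein folding and gene regulation",
        "deep learning for image recognition", "neural networks learn representations"]

vec = TfidfVectorizer()
Xd = vec.fit_transform(docs).toarray()

nmf = NMF(n_components=2, alpha_W=0.1, l1_ratio=1.0, random_state=0)  # sparse-ish topics
nmf.fit(Xd)
H = nmf.components_

top = np.unique(np.argsort(H, axis=1)[:, -3:])        # top index words of each topic
print("selected index words:", np.array(vec.get_feature_names_out())[top])

profile = Xd[:2][:, top].mean(axis=0, keepdims=True)  # centroid of the learning docs 0-1
scores = cosine_similarity(Xd[:, top], profile).ravel()
print("filtering scores:", scores.round(2))
```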

Keywords: Information Filtering, Sparse NMF, Index Word Selection, User Profile, Chi-squared Measure

1416 Elegant: An Intuitive Software Tool for Interactive Learning of Power System Analysis

Authors: Eduardo N. Velloso, Fernando M. N. Dantas, Luciano S. Barros

Abstract:

A common complaint from power system analysis students lies in the overly complex tools they need to learn and use just to simulate very basic systems or just to check the answers to power system calculations. The most basic power system studies are power-flow solutions and short-circuit calculations. This paper presents a simple tool with an intuitive interface to perform both these studies and assess its performance in comparison with existent commercial solutions. With this in mind, Elegant is a pure Python software tool for learning power system analysis developed for undergraduate and graduate students. It solves the power-flow problem by iterative numerical methods and calculates bolted short-circuit fault currents by modeling the network in the domain of symmetrical components. Elegant can be used with a user-friendly Graphical User Interface (GUI) and automatically generates human-readable reports of the simulation results. The tool is exemplified using a typical Brazilian regional system with 18 buses. This study performs a comparative experiment with 1 undergraduate and 4 graduate students who attempted the same problem using both Elegant and a commercial tool. It was found that Elegant significantly reduces the time and labor involved in basic power system simulations while still providing some insights into real power system designs.
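
As a hedged illustration of the kind of iterative power-flow solution mentioned above, the sketch below runs a Gauss-Seidel sweep on a made-up 3-bus system with one slack bus and two PQ buses; Elegant's own numerics, bus data and report generation are not reproduced.

```python
# Gauss-Seidel power flow on a tiny 3-bus system (all data in per-unit, invented).
import numpy as np

Y = np.array([[ 10-30j, -5+15j, -5+15j],
              [ -5+15j, 10-30j, -5+15j],
              [ -5+15j, -5+15j, 10-30j]])       # bus admittance matrix
S = np.array([0.0+0.0j, -0.6-0.3j, -0.8-0.4j])  # injected power; negative = load
V = np.array([1.05+0.0j, 1.0+0.0j, 1.0+0.0j])   # bus 0 is the slack bus (fixed)

for _ in range(100):                             # Gauss-Seidel iterations
    for i in (1, 2):                             # update PQ buses only
        sigma = sum(Y[i, j] * V[j] for j in range(3) if j != i)
        V[i] = (np.conj(S[i] / V[i]) - sigma) / Y[i, i]

print("bus voltages (p.u.):", np.round(V, 4))
```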

Keywords: Free- and open-source software, power-flow, power system analysis, Python, short-circuit.

1415 Service Architecture for 3rd Party Operator's Participation

Authors: F. Sarabchi, A. H. Darvishan, H. Yeganeh, H. Ahmadian

Abstract:

Next generation networks, with the idea of converging the service and control layers of existing networks (fixed, mobile and data) and with the intention of providing services on an integrated network, have opened new horizons for telecom operators. On the other hand, economic pressures have caused operators to look for new sources of income, including offering new services, subscribing more users, promoting greater use of network resources, and making it easy for service providers or 3rd party operators to participate in utilizing the networks. With this requirement, an architecture based on next generation objectives for the service layer is necessary. In this paper, a new architecture based on the IMS model describes the participation of 3rd party operators in the creation and implementation of services on an integrated telecom network.

Keywords: Service model, IMS, API, Scripting language, JAIN, Parlay.

1414 Embedded Semi-Fragile Signature Based Scheme for Ownership Identification and Color Image Authentication with Recovery

Authors: M. Hamad Hassan, S.A.M. Gilani

Abstract:

In this paper, a novel scheme is proposed for ownership identification and color image authentication by deploying cryptography and digital watermarking. The color image is first transformed from RGB to the YST color space, exclusively designed for watermarking. Following the color space transformation, each channel is divided into 4×4 non-overlapping blocks with selection of the central 2×2 sub-blocks. Depending upon the channel selected, two to three LSBs of each central 2×2 sub-block are set to zero to hold the ownership, authentication and recovery information. The size and position of the sub-block are important for correct localization, enhanced security and fast computation. As YS ⊥ T, it is suitable to embed the recovery information apart from the ownership and authentication information; therefore each 4×4 block of the T channel, along with the ownership information, is hashed with SHA-160 to compute a content-based hash that is unique and invulnerable to birthday attacks or hash collisions, instead of using MD5, which may raise the condition H(m)=H(m'). For recovery, the intensity mean of each 4×4 block of each channel is computed and encoded in up to eight bits. For watermark embedding, key-based mapping of blocks is performed using a 2D Torus Automorphism. Our scheme is oblivious, generates highly imperceptible images with correct localization of tampering within reasonable time, and has the ability to recover the original work with probability near one.
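
An illustrative sketch of two low-level steps described above: hashing a 4×4 block together with ownership data (SHA-1, i.e. a 160-bit digest), and writing bits into the two LSBs of the central 2×2 sub-block. The RGB-to-YST transform, torus-automorphism block mapping and recovery coding are omitted, and the functions below are not the authors' implementation.

```python
# Block hashing and 2-LSB embedding in the central 2x2 sub-block (illustrative only).
import hashlib
import numpy as np

def block_hash(block_4x4: np.ndarray, owner_id: bytes) -> str:
    """Content-based hash of one 4x4 block plus ownership information."""
    return hashlib.sha1(block_4x4.astype(np.uint8).tobytes() + owner_id).hexdigest()

def embed_bits_in_center(block_4x4: np.ndarray, bits: str) -> np.ndarray:
    """Write watermark bits into the 2 LSBs of the central 2x2 pixels."""
    out = block_4x4.copy()
    coords = [(1, 1), (1, 2), (2, 1), (2, 2)]        # central 2x2 sub-block
    it = iter(bits)
    for (r, c) in coords:
        two = "".join([next(it, "0"), next(it, "0")])
        out[r, c] = (int(out[r, c]) & ~0b11) | int(two, 2)
    return out

block = np.random.default_rng(0).integers(0, 256, (4, 4), dtype=np.uint8)
digest = block_hash(block, b"owner-42")              # hypothetical ownership identifier
print(digest[:8], embed_bits_in_center(block, "10110100")[1:3, 1:3])
```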

Keywords: Hash Collision, LSB, MD5, PSNR, SHA160

1413 Identification of Most Frequently Occurring Lexis in Body-enhancement Medicinal Unsolicited Bulk e-mails

Authors: Jatinderkumar R. Saini, Apurva A. Desai

Abstract:

E-mail has become an important means of electronic communication, but the viability of its usage is marred by Unsolicited Bulk e-mail (UBE) messages. UBE consists of many types, including pornographic, virus-infected and 'cry-for-help' messages, as well as fake and fraudulent offers for jobs, winnings and medicines. UBE poses technical and socio-economic challenges to the usage of e-mail. To meet this challenge and combat this menace, we need to understand UBE. Towards this end, the current paper presents a content-based textual analysis of more than 2700 body-enhancement medicinal UBE messages. Technically, this is an application of text parsing and tokenization to un-structured textual documents, and we approach it using Bag Of Words (BOW) and Vector Space Document Model techniques. We have attempted to identify the most frequently occurring lexis in the UBE documents that advertise various products for body enhancement, and an analysis of the top 100 such lexis is also presented. We exhibit the relationship between the occurrence of a word from the identified lexis-set in a given UBE and the probability that the given UBE is one advertising a fake medicinal product. To the best of our knowledge and survey of the related literature, this is the first formal attempt to identify the most frequently occurring lexis in such UBE by textual analysis. Finally, this is a sincere attempt to raise alertness against, and mitigate the threat of, such luring but fake UBE.
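
A minimal sketch of the BOW-style lexis identification described above: tokenize the UBE corpus, drop stop words, and list the most frequent terms. The two sample messages and the stop-word list are placeholders for the ~2700 body-enhancement UBE documents analysed in the paper.

```python
# Count the most frequently occurring lexis in a tiny placeholder UBE corpus.
import re
from collections import Counter

ube_corpus = [
    "Amazing pills for fast body enhancement, order viagra online today",
    "Cheap enhancement medicine, limited offer, order pills now",
]
stop_words = {"for", "the", "a", "now", "today", "online"}

tokens = []
for msg in ube_corpus:
    tokens += [t for t in re.findall(r"[a-z]+", msg.lower()) if t not in stop_words]

for word, freq in Counter(tokens).most_common(10):
    print(word, freq)
```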

Keywords: Body Enhancement, Lexis, Medicinal, Unsolicited Bulk e-mail (UBE), Vector Space Document Model, Viagra

1412 Politic Iconography: The Sky and Pants of Nicolas-Antoine Taunay (1755-1830)

Authors: Bárbara Dantas

Abstract:

Nicolas-Antoine Taunay had everything needed for a quiet life with his family, his colleagues from the Paris Academy of Art, and his standing as a renowned painter of the French Court, but the conjuncture was quite complicated in those final years of the eighteenth century and first decades of the 19th century. The painter had to adapt to various political and social ruptures: from royalty to the French Revolution, from the empire of Napoleon Bonaparte to the empire of King John VI. We wish to place Taunay in his context through the analysis of a portrait of him made by a colleague in the profession and of a Brazilian landscape painted by his own hand (1816-1821), in which he represented himself. Finally, the intention is to find in these two paintings how Nicolas-Antoine Taunay saw himself and the milieu that surrounded him in the passage he was forced to make between Paris and Rio de Janeiro.

Keywords: Nicolas-Antoine Taunay, politic iconography, French Art, Brazilian Art, 19th century.

1411 A Novel Approach for Coin Identification using Eigenvalues of Covariance Matrix, Hough Transform and Raster Scan Algorithms

Authors: J. Prakash, K. Rajesh

Abstract:

In this paper we present a new method for coin identification. The proposed method adopts a hybrid scheme using the eigenvalues of the covariance matrix, the Circular Hough Transform (CHT) and Bresenham's circle algorithm. The statistical and geometrical properties of the small and large eigenvalues of the covariance matrix of a set of edge pixels over a connected region of support are explored for the purpose of circular object detection. A sparse matrix technique is used to perform the CHT. Since sparse matrices squeeze out zero elements and contain only a small number of non-zero elements, they provide an advantage in matrix storage space and computational time. A neighborhood suppression scheme is used to find the valid Hough peaks. The accurate position of the circumference pixels is identified using a raster scan algorithm, which exploits geometrical symmetry. After finding the circular objects, the proposed method uses the texture on the surface of the coins, called the texton, a unique property of coins that refers to the fundamental micro-structure in generic natural images. This method has been tested on several real-world images, including coin and non-coin images. The performance is also evaluated based on the noise-withstanding capability.
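
A hedged sketch of the circle-detection stage described above, using OpenCV's Circular Hough Transform; the eigenvalue-based pre-detection, Bresenham raster scan and texton matching are not reproduced, and the parameters and file name are illustrative.

```python
# Detect coin candidates with the Circular Hough Transform (parameters are illustrative).
import cv2
import numpy as np

img = cv2.imread("coins.jpg", cv2.IMREAD_GRAYSCALE)   # hypothetical input image
img = cv2.medianBlur(img, 5)

circles = cv2.HoughCircles(img, cv2.HOUGH_GRADIENT, dp=1.2, minDist=40,
                           param1=120, param2=40, minRadius=15, maxRadius=80)

if circles is not None:
    for x, y, r in np.round(circles[0]).astype(int):
        print(f"coin candidate at ({x}, {y}) with radius {r}px")
```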

Keywords: Circular Hough Transform, Coin detection, Covariance matrix, Eigenvalues, Raster scan Algorithm, Texton.

1410 Design of Orientation-Free Handler and Fuzzy Controller for Wire-Driven Heavy Object Lifting System

Authors: Bo-Wei Song, Yun-Jung Lee

Abstract:

This paper presents an intention interface and controller for a wire-driven heavy object lifting system that assists the operator in moving a heavy object. The handler is designed to allow a comfortable working posture for the operator. In addition, as a human-assistive system, the operator is involved in the control loop, where a fuzzy control system is used to account for human control characteristics. The effectiveness and performance of the proposed system are proved by experiments.

Keywords: Fuzzy controller, Handler design, Heavy object lifting system, Human-assistive device, Human-in-the-loop system.

1409 Extraction of Data from Web Pages: A Vision Based Approach

Authors: P. S. Hiremath, Siddu P. Algur

Abstract:

With the explosive growth of information sources available on the World Wide Web, it has become increasingly difficult to identify relevant pieces of information, since web pages are often cluttered with irrelevant content like advertisements, navigation panels, copyright notices, etc., surrounding the main content of the web page. Hence, tools for mining data regions, data records and data items need to be developed in order to provide value-added services. Currently available automatic techniques to mine data regions from web pages are still unsatisfactory because of their poor performance and tag-dependence. In this paper a novel method to extract data items from web pages automatically is proposed. It comprises two steps: (1) identification and extraction of the data regions based on visual clue information; (2) identification of data records and extraction of data items from a data region. For step 1, a novel and more effective method is proposed based on visual clues, which finds the data regions formed by all types of tags using visual clues. For step 2, a more effective method, namely Extraction of Data Items from web Pages (EDIP), is adopted to mine data items. The EDIP technique is a list-based approach in which the list is a linear data structure. The proposed technique is able to mine non-contiguous data records and can correctly identify data regions, irrespective of the type of tag in which they are bound. Our experimental results show that the proposed technique performs better than existing techniques.

Keywords: Web data records, web data regions, web mining.
