Search results for: user dialog interface

2876 Mechanical Behaviours of Ti/GFRP/Ti Laminates with Different Surface Treatments of Titanium Sheets

Authors: Amit Kumar Haldar, Mark Simms, Ian McDevitt, Anthony Comer

Abstract:

Interface properties of fiber metal laminates (FML) affect their integrity and deformation failure modes. In this paper, the mechanical behaviours of Ti/GFRP/Ti laminates were experimentally investigated through low-velocity impact tests. Two different surface treatments of titanium (Ti-6Al-4V) alloy sheets, based on annealing and sandblasting, were prepared to vary the composite interface properties. The deformation failure modes, impact-load-sustaining ability, and energy absorption capacity of the FMLs were analysed. The impact load and modulus were shown to depend on the surface treatment of the titanium alloy sheets, and impact-load performance was enhanced when the titanium surfaces were annealed and sandblasted. The values of strength and energy absorption were also slightly higher when the tests were conducted at a relatively higher loading rate, as a result of rate-sensitive effects on the damage resistance of the FML.

Keywords: fiber metal laminates, metal composite interface, indentation, low velocity impact

Procedia PDF Downloads 176
2875 Virtual Reality and Other Real-Time Visualization Technologies for Architecture Energy Certifications

Authors: Román Rodríguez Echegoyen, Fernando Carlos López Hernández, José Manuel López Ujaque

Abstract:

Interactive management of energy certification ratings has remained on the sidelines of the evolution of virtual reality (VR), despite related advances in architecture in other areas such as BIM and real-time visualization programs. This research studies to what extent VR software can help stakeholders better understand energy efficiency parameters in order to obtain reliable ratings assigned to the parts of a building. To evaluate this hypothesis, the methodology included the construction of a software prototype. Current energy certification systems do not offer an intuitive data-entry process, nor do they provide a simple or visual verification of the technical values included in the certification by manufacturers or other users. The proposed software, by means of real-time visualization and a graphical user interface, improves on current energy certification systems by easing the understanding of how the certification parameters work in a building. Furthermore, because current interfaces are not friendly or intuitive, untrained users usually get a poor idea of the grounds for certification and of how the program works. The proposed software also allows users to add further information, such as financial and CO₂ savings, energy efficiency, and an explanatory analysis of results for the least efficient areas of the building, through a new visual mode. It likewise helps the user evaluate whether an investment to improve the materials of an installation is worth its cost, given the resulting changes in the energy certification parameters. The evaluated prototype (named VEE-IS) shows promising results when it comes to representing the energy rating of the different elements of a building in a more intuitive and simple manner. Users can personalize all the inputs necessary to create a correct certification, such as floor materials, walls, installations, or other important parameters. Working in real time through VR allows for efficiently comparing, analyzing, and improving the rated elements, as well as the parameters entered to calculate the final certification. The prototype also allows the building to be visualized in efficiency mode, in which users can move through the building to analyze thermal bridges or other energy efficiency data. This research also finds that the visual representation of energy efficiency certifications makes it easy for stakeholders to examine improvements progressively, which adds value to the different phases of design and sale.

Keywords: energetic certification, virtual reality, augmented reality, sustainability

Procedia PDF Downloads 168
2874 Uplift Segmentation Approach for Targeting Customers in a Churn Prediction Model

Authors: Shivahari Revathi Venkateswaran

Abstract:

Segmenting customers plays a significant role in churn prediction. It helps the marketing team with proactive and reactive customer retention. For reactive retention, the retention team reaches out to customers who have already shown intent to disconnect by giving them special offers. For proactive retention, the marketing team uses a churn prediction model, which ranks each customer from 1 to 100, where rank 1 indicates the highest propensity to churn/disconnect. The churn prediction model is built using XGBoost. With churn ranks alone, however, the marketing team can only reach out to customers individually; profiling groups of customers and framing different marketing strategies for targeted groups is not possible. For this, customers must be grouped into segments based on their profiles, such as demographics and other non-controllable attributes. This helps the marketing team frame different offer groups for the targeted audience and prevent them from disconnecting (proactive retention). For segmentation, machine learning approaches like k-means clustering do not form unique customer segments in which all customers share the same attributes. This paper presents an alternative approach that finds all combinations of unique segments that can be formed from the user attributes and then identifies the segments that show uplift (a churn rate higher than the baseline churn rate). For this, search algorithms such as fast search and recursive search are used. Further, within each segment, customers can be targeted using their individual churn ranks from the churn prediction model. Finally, a UI (user interface) was developed for the marketing team to interactively search for meaningful segments, target the right audience for future marketing campaigns, and prevent them from disconnecting.
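
A naive but runnable sketch of the segment-search idea follows: it exhaustively enumerates attribute combinations and keeps the segments whose churn rate exceeds the baseline. The column names and the exhaustive enumeration (in place of the paper's fast/recursive search) are assumptions for illustration.

```python
# Illustrative sketch only: exhaustive search over attribute combinations to
# find "uplift" segments (churn rate above the baseline). Column names and the
# DataFrame layout are hypothetical, not taken from the paper.
from itertools import combinations
import pandas as pd

def find_uplift_segments(df: pd.DataFrame, attrs: list, churn_col: str = "churned"):
    baseline = df[churn_col].mean()  # overall (baseline) churn rate
    segments = []
    # every non-empty combination of profiling attributes
    for r in range(1, len(attrs) + 1):
        for combo in combinations(attrs, r):
            rates = df.groupby(list(combo))[churn_col].agg(["mean", "size"])
            for values, row in rates.iterrows():
                if row["mean"] > baseline:  # uplift: churn above baseline
                    segments.append({"attributes": combo, "values": values,
                                     "churn_rate": row["mean"], "size": row["size"]})
    return baseline, segments
```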

Keywords: churn prediction modeling, XGBoost model, uplift segments, proactive marketing, search algorithms, retention, k-means clustering

Procedia PDF Downloads 54
2873 Modeling of Global Solar Radiation on a Horizontal Surface Using Artificial Neural Network: A Case Study

Authors: Laidi Maamar, Hanini Salah

Abstract:

The present work investigates the potential of an artificial neural network (ANN) model to predict horizontal global solar radiation (HGSR). The ANN is developed and optimized using a three-year meteorological database (2011 to 2013) from the meteorological station of Blida (Blida 1 University, Algeria; latitude 36.5°, longitude 2.81°, 163 m above mean sea level). The optimal configuration of the ANN model was determined by minimizing the root mean square error (RMSE) and maximizing the correlation coefficient (R²) between observed data and data predicted with the ANN model. To select the best ANN architecture, several tests were conducted using different combinations of parameters. A two-layer ANN model with six hidden neurons was found to be the optimal topology, with RMSE = 4.036 W/m² and R² = 0.999. A graphical user interface (GUI) was designed, based on the best network structure and training algorithm, to provide a user-friendly application of the model.
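
As a rough illustration of the described topology, the sketch below fits a network with a single hidden layer of six neurons and reports RMSE and R², using scikit-learn; the input features and placeholder data are assumptions, since the abstract does not list the exact meteorological inputs.

```python
# Minimal sketch of a two-layer ANN (six hidden neurons) for HGSR prediction.
# The four input features and random data are placeholders, not the paper's.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

X, y = np.random.rand(1000, 4), np.random.rand(1000)  # hypothetical inputs / HGSR
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

ann = MLPRegressor(hidden_layer_sizes=(6,), activation="tanh",
                   solver="lbfgs", max_iter=2000, random_state=0)
ann.fit(X_tr, y_tr)
pred = ann.predict(X_te)
rmse = np.sqrt(mean_squared_error(y_te, pred))  # criterion minimized in the paper
r2 = r2_score(y_te, pred)                       # criterion maximized in the paper
print(f"RMSE={rmse:.3f}, R^2={r2:.3f}")
```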

Keywords: artificial neural network, global solar radiation, solar energy, prediction, Algeria

Procedia PDF Downloads 478
2872 Multiscale Cohesive Zone Modeling of Composite Microstructure

Authors: Vincent Iacobellis, Kamran Behdinan

Abstract:

A finite element cohesive zone model is used to predict the temperature-dependent material properties of a polyimide matrix composite with a unidirectional carbon fiber arrangement. The cohesive zone parameters were obtained from previous research involving an atomistic-to-continuum multiscale simulation of the fiber-matrix interface using the bridging cell multiscale method. The goals of the research were both to investigate the effect of temperature change on the composite behavior with respect to transverse loading and to validate the use of cohesive parameters obtained from atomistic-to-continuum multiscale modeling for predicting fiber-matrix interfacial cracking. From the multiscale model, cohesive zone parameters (i.e., maximum traction and energy of separation) were obtained by modeling the interface between the coarse-grained polyimide matrix and graphite-based carbon fiber. The cohesive parameters from this simulation were used in a cohesive zone model of the composite microstructure in order to predict the properties of the macroscale composite with respect to changes in temperature ranging from 21 °C to 316 °C. Good agreement was found between the microscale RUC model and experimental results for stress-strain response, stiffness, and material strength at low and high temperatures. The deformation of the composite through localized crack initiation at the fiber-matrix interface also agreed with experimental observations of similar phenomena. Overall, the cohesive zone model was shown to be effective at modeling the composite properties with respect to transverse loading, and the use of cohesive zone parameters obtained from the multiscale simulation was validated.
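
For readers unfamiliar with cohesive zone inputs, the sketch below shows how the two parameters named in the abstract (maximum traction and energy of separation) define a traction-separation law; the bilinear shape is an assumption, as the abstract does not state which law was used.

```python
# Hedged illustration: a bilinear traction-separation law parameterized by
# maximum traction t_max and energy of separation G_c. The bilinear shape is
# an assumption for illustration, not the paper's stated law.
def bilinear_traction(delta: float, t_max: float, G_c: float, delta_0: float) -> float:
    """Traction for opening displacement delta.
    delta_0: opening at peak traction; the final opening delta_f follows from
    G_c = 0.5 * t_max * delta_f (area under the traction-separation curve).
    """
    delta_f = 2.0 * G_c / t_max
    if delta <= delta_0:                # linear elastic branch
        return t_max * delta / delta_0
    if delta < delta_f:                 # linear softening (damage) branch
        return t_max * (delta_f - delta) / (delta_f - delta_0)
    return 0.0                          # fully separated interface
```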

Keywords: cohesive zone model, fiber-matrix interface, microscale damage, multiscale modeling

Procedia PDF Downloads 456
2871 The Analyzer: Clustering Based System for Improving Business Productivity by Analyzing User Profiles to Enhance Human Computer Interaction

Authors: Dona Shaini Abhilasha Nanayakkara, Kurugamage Jude Pravinda Gregory Perera

Abstract:

E-commerce platforms have revolutionized the shopping experience, offering convenient ways for consumers to make purchases. To improve interactions with customers and optimize marketing strategies, it is essential for businesses to understand user behavior, preferences, and needs on these platforms. This paper focuses on enabling businesses to customize interactions with users based on their behavioral patterns, leveraging data-driven analysis and machine learning techniques. Businesses can improve engagement and boost the adoption of e-commerce platforms by aligning behavioral patterns with user goals of usability and satisfaction. We propose TheAnalyzer, a clustering-based system designed to enhance business productivity by analyzing user profiles and improving human-computer interaction. TheAnalyzer seamlessly integrates with business applications, collecting relevant data points from users' natural interactions without additional burdens such as questionnaires or surveys. It defines five key user analytics as features for its dataset, which are easily captured through users' interactions with e-commerce platforms. This research presents a study demonstrating the successful separation of users into specific groups based on the five key analytics considered by TheAnalyzer. With the assistance of domain experts, customized business rules can be attached to each group, enabling TheAnalyzer to influence business applications and provide an enhanced, personalized user experience. The outcomes are evaluated quantitatively and qualitatively, demonstrating that utilizing TheAnalyzer's capabilities can optimize business outcomes, enhance customer satisfaction, and drive sustainable growth. The findings of this research contribute to the advancement of personalized interactions in e-commerce platforms. By leveraging user behavioral patterns and analyzing both new and existing users, businesses can effectively tailor their interactions to improve customer satisfaction and loyalty and ultimately drive sales.
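
A minimal sketch of the kind of pipeline implied by the keywords (standardization, dimensionality reduction, clustering) appears below; the five analytics are hypothetical stand-ins, since the abstract does not enumerate them.

```python
# Sketch of a standardize -> reduce -> cluster pipeline for user profiling.
# The 500x5 matrix of "behavioral analytics" is placeholder data.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

X = np.random.rand(500, 5)                        # 500 users x 5 analytics
X_std = StandardScaler().fit_transform(X)         # data standardization
X_red = PCA(n_components=2).fit_transform(X_std)  # dimensionality reduction
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X_red)
# each label now indexes a user group to which business rules can be attached
```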

Keywords: data clustering, data standardization, dimensionality reduction, human computer interaction, user profiling

Procedia PDF Downloads 57
2870 Integrated Model for Enhancing Data Security Processing Time in Cloud Computing

Authors: Amani A. Saad, Ahmed A. El-Farag, El-Sayed A. Helali

Abstract:

Cloud computing is an important and promising field of the recent decade. It allows sharing resources, services, and information among people across the whole world. Although the advantages of using clouds are great, there are many risks in a cloud, and data security is the most important and critical problem of cloud computing. In this research, a new security model for cloud computing is proposed to ensure a secure communication system, hide information from other users, and save the user's time. In the proposed model, the Blowfish encryption algorithm is used for exchanging information or data, and the SHA-2 cryptographic hash algorithm is used for data integrity. For user authentication, a simple username and password scheme is used, with the password protected by SHA-2 as a one-way hash. The proposed system shows an improvement in the processing time of uploading and downloading files on the cloud in secure form.
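
The sketch below shows the two primitives the model names, Blowfish for confidentiality and SHA-2 for integrity and password hashing, using PyCryptodome and hashlib; the key handling and CBC mode are assumptions, not the paper's exact protocol.

```python
# Minimal sketch of the named primitives; key management and mode choice are
# illustrative assumptions, not the paper's protocol.
import hashlib
from Crypto.Cipher import Blowfish
from Crypto.Util.Padding import pad
from Crypto.Random import get_random_bytes

key = get_random_bytes(16)                    # Blowfish accepts 4-56 byte keys
cipher = Blowfish.new(key, Blowfish.MODE_CBC)
ciphertext = cipher.encrypt(pad(b"file contents", Blowfish.block_size))

digest = hashlib.sha256(b"file contents").hexdigest()   # SHA-2 integrity check
pw_hash = hashlib.sha256(b"user password").hexdigest()  # one-way password hash
```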

Keywords: cloud computing, data security, SaaS, PaaS, IaaS, Blowfish

Procedia PDF Downloads 333
2869 Proactive Disk Defragmentation through User's File-Access Patterns

Authors: Gordon Wong

Abstract:

This paper shows how the task of disk defragmentation can be handled by modern operating systems in a transparent, automated, efficient, and confined way through the user's file-access patterns. Files tend to fragment gradually over time through file creation, deletion, growth, and shrinking, and the problem worsens until a disk becomes so fragmented that file accesses cannot be made reasonably efficient without defragmenting the "entire" disk, an operation performed manually by the user by launching the disk defragmentation utility normally bundled with the operating system. In this paper, we argue that this problem can be solved without having to manually defragment the entire disk. The argument is based on the observation that users tend to access certain files within a particular time interval, in the way observed for programs exhibiting temporal locality of memory references during their execution. Disk defragmentation can then be initiated and acted upon only for those files contained in the current file-access locality detected and identified by the operating system. The paper also discusses how to use this locality-of-file-references approach to quantitatively measure and determine the locality of the user's file-access patterns on which the defragmentation is based.
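
A minimal sketch of how an operating system might track such a file-access locality is given below; the window length, hit threshold, and all names are hypothetical.

```python
# Illustrative sketch (all names hypothetical): a file-access locality is the
# set of files touched repeatedly within a recent time window, analogous to
# temporal locality of memory references.
import time
from collections import Counter, deque

WINDOW = 3600.0      # seconds of access history to keep
MIN_HITS = 5         # accesses needed to count a file as "in the locality"
_history = deque()   # (timestamp, path) pairs, oldest first

def record_access(path: str) -> None:
    now = time.time()
    _history.append((now, path))
    while _history and now - _history[0][0] > WINDOW:
        _history.popleft()               # evict accesses outside the window

def current_locality() -> set:
    hits = Counter(path for _, path in _history)
    # candidate set for targeted (rather than whole-disk) defragmentation
    return {path for path, n in hits.items() if n >= MIN_HITS}
```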

Keywords: operating systems, disk defragmentation, locality of file accesses, system performance

Procedia PDF Downloads 33
2868 PsyVBot: Chatbot for Accurate Depression Diagnosis using Long Short-Term Memory and NLP

Authors: Thaveesha Dheerasekera, Dileeka Sandamali Alwis

Abstract:

The escalating prevalence of mental health issues, such as depression and suicidal ideation, is a matter of significant global concern. A variety of factors, such as life events, social isolation, and preexisting physiological or psychological conditions, can instigate or exacerbate these conditions. Traditional approaches to diagnosing depression take considerable time and require skilled practitioners, which underscores the need for automated systems capable of promptly detecting and diagnosing symptoms of depression. The PsyVBot system employs natural language processing and machine learning methodologies, using the NLTK toolkit for dataset preprocessing and a Long Short-Term Memory (LSTM) model for classification. PsyVBot diagnoses depression from user input with 94% accuracy, making it a useful resource for individuals, particularly students, who encounter challenges to their psychological well-being. The LSTM model comprises three layers: an embedding layer, an LSTM layer, and a dense layer. This layering enables a precise examination of the linguistic patterns associated with depression, so that PsyVBot can assess an individual's level of depression from linguistic and contextual cues. The model is trained on a dataset sourced from the subreddit r/SuicideWatch, whose diverse data supports precise and sensitive identification of symptoms linked with depression. Beyond diagnosis, PsyVBot enhances the user experience with audio output, enabling more engaging and interactive conversations, and offers a confidential, user-friendly interface for conveniently assessing mental health challenges. User confidentiality and ethical principles are treated as paramount: diligent efforts are made to adhere to ethical standards and to safeguard and secure user information, while the chatbot fosters a supportive and compassionate atmosphere that promotes psychological welfare. In brief, PsyVBot is an automated conversational agent that uses an LSTM model to assess the level of depression from user input. The demonstrated 94% accuracy is a promising indication of the potential of natural language processing and machine learning in tackling mental health challenges, and the use of the Reddit dataset with NLTK preprocessing further improves reliability. PsyVBot thus represents a user-centric solution that furnishes an accessible and confidential medium for seeking assistance, offered as one means of tackling the pervasive issue of depression and suicidal ideation.
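
A sketch of the three-layer architecture named in the abstract (embedding, LSTM, dense) is shown below in Keras; the vocabulary size, layer dimensions, and training call are assumptions.

```python
# Sketch of the named three-layer architecture; sizes are illustrative.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

model = Sequential([
    Embedding(input_dim=10000, output_dim=64),  # token embedding layer
    LSTM(64),                                   # sequence modeling layer
    Dense(1, activation="sigmoid"),             # depression / no-depression
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(padded_token_sequences, labels, ...) after NLTK preprocessing
# (input names here are hypothetical placeholders)
```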

Keywords: chatbot, depression diagnosis, LSTM model, natural language processing

Procedia PDF Downloads 44
2867 Knowledge-Driven Decision Support System Based on Knowledge Warehouse and Data Mining by Improving Apriori Algorithm with Fuzzy Logic

Authors: Pejman Hosseinioun, Hasan Shakeri, Ghasem Ghorbanirostam

Abstract:

In recent years, research on knowledge sources, decision support systems, data mining, and the process of knowledge discovery in databases has grown in importance, and each of these aspects affects the others. In this article, we merge an information source and a knowledge source to propose a knowledge-based management system built on the storage and retrieval of knowledge, in order to manage information and improve decision-making and resource use. We use data mining, specifically the Apriori algorithm, in the knowledge discovery process. One of the problems of the Apriori algorithm is that the user must specify the minimum support threshold for the rules. Imagine that a user wants to apply the Apriori algorithm to a database with millions of transactions: the user cannot have the necessary knowledge of all existing transactions in that database and therefore cannot specify a suitable threshold. Our purpose in this article is to improve the Apriori algorithm. To achieve this goal, we use fuzzy logic to put the data into different clusters before applying the Apriori algorithm, and we also suggest the most suitable threshold to the user automatically.
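
The sketch below illustrates the overall idea: cluster the transactions first, then run Apriori per cluster with an automatically suggested support threshold. The percentile heuristic, and the use of crisp k-means in place of the paper's fuzzy clustering, are assumptions for illustration (the Apriori call is mlxtend's).

```python
# Hedged sketch: pre-cluster transactions, then suggest min_support per
# cluster automatically (here, a percentile of single-item frequencies).
# KMeans stands in for the paper's fuzzy clustering.
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from mlxtend.frequent_patterns import apriori

def mine_clusters(onehot: pd.DataFrame, n_clusters: int = 3) -> dict:
    """onehot: transactions x items boolean DataFrame."""
    labels = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=0).fit_predict(onehot)
    results = {}
    for c in range(n_clusters):
        part = onehot[labels == c]
        # suggested threshold: 75th percentile of item frequencies (heuristic)
        min_support = max(float(np.percentile(part.mean(axis=0), 75)), 0.01)
        results[c] = apriori(part, min_support=min_support, use_colnames=True)
    return results
```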

Keywords: decision support system, data mining, knowledge discovery, data discovery, fuzzy logic

Procedia PDF Downloads 311
2866 Component Interface Formalization in Robotic Systems

Authors: Anton Hristozov, Eric Matson, Eric Dietz, Marcus Rogers

Abstract:

Components are heavily used in many software systems, including robotic systems. The growing sophistication and diversity of capabilities in robotic systems present new challenges to their architectures. Their complexity is growing exponentially with the advent of AI, smart sensors, and the complex tasks they have to accomplish. Such complexity requires a more rigorous approach to the creation, use, and interoperability of software components. The issue is exacerbated because robotic systems are becoming more and more reliant on third-party components for certain functions. In order to achieve this kind of interoperability, including dynamic component replacement, we need a way to standardize their interfaces. A formal approach is needed to specify what the interface of a robotic software component should contain. This study analyzes the issue and presents a universal and generic approach to standardizing component interfaces for robotic systems. Our approach is inspired by well-established robotic architectures such as ROS, PX4, and ArduPilot, and it is also applicable to other software systems that share similar characteristics with robotic systems. We consider the use of JSON or domain-specific languages (DSLs) developed with tools such as ANTLR, and the automatic generation of code and configuration files for frameworks such as ROS and PX4. A case study with ROS 2 is presented as a proof of concept for the proposed methodology.
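
As a purely hypothetical illustration of what such a formalized interface might look like in JSON, consider the sketch below; the field names are illustrative and not taken from the paper, though the message and service types are real ROS 2 names.

```python
# Hypothetical component-interface specification expressed as JSON; the
# schema fields are assumptions, intended as input for code/config generation.
import json

interface_spec = {
    "component": "lidar_driver",
    "version": "1.2.0",
    "publishes": [
        {"topic": "/scan", "type": "sensor_msgs/msg/LaserScan", "rate_hz": 10}
    ],
    "subscribes": [],
    "parameters": [
        {"name": "range_max", "type": "float", "default": 30.0, "unit": "m"}
    ],
    "services": [{"name": "reset", "request": "std_srvs/srv/Trigger"}],
}
print(json.dumps(interface_spec, indent=2))
```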

Keywords: CPS, robots, software architecture, interface, ROS, autopilot

Procedia PDF Downloads 69
2865 Player Experience: A Research on Cross-Platform Supported Games

Authors: Salih Akkemik

Abstract:

User experience (UX) has a characteristic perspective based on two fundamentals: the usage process and the product. Digital games can be considered a special kind of interactive system, one whose very specific purpose is to make the player feel good while playing. In this respect, player experience (PX) and user experience are similar: UX focuses on making the user feel good, PX on making the player feel good. The most important difference between the two is the action taken: using versus playing. This study examines the player experience in particular, since PX may differ across platforms. Nowadays, companies release successful, high-revenue games with cross-platform support. Cross-platform is the common term for an application that can run on different operating systems, in other words, one developed to support different operating systems. In terms of digital games, cross-platform support means that a game can be played on a computer, console, or mobile device; more specifically, the game is designed and programmed to be played in the same way on at least two different platforms, such as Windows, macOS, Linux, iOS, Android, Orbis OS, or Xbox OS. Different platforms also accommodate different player groups, profiles, and preferences. This study aims to examine these different player profiles in terms of player experience and to determine the effects of cross-platform support on it.

Keywords: cross-platform, digital games, player experience, user experience

Procedia PDF Downloads 188
2864 Device Control Using Brain Computer Interface

Authors: P. Neeraj, Anurag Sharma, Harsukhpreet Singh

Abstract:

In recent years, brain-computer interface (BCI) schemes based on the steady-state visual evoked potential (SSVEP) have earned much consideration. This study develops an SSVEP-based BCI scheme that can switch a device mock-up between two states, ON and OFF. Two distinct flicker frequencies in the low-frequency range were used to evoke the SSVEPs and were shown on a liquid crystal display (LCD) screen using LabVIEW. Two stimulus colors, yellow and blue, were used to train the system on SSVEPs. The electroencephalogram (EEG) signals were recorded from the occipital region, and features were extracted using the discrete wavelet transform. A multilayer neural network algorithm (NNA) was used to classify the SSVEP signals. Training the network with different algorithms showed, via regression plots, that the Levenberg-Marquardt training algorithm achieved an accuracy of 93.9%, superior to the other training algorithms.
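
A rough sketch of the described pipeline, wavelet features feeding a small neural network classifier, is given below. Levenberg-Marquardt training (MATLAB's trainlm) has no scikit-learn equivalent, so L-BFGS is used here as a stand-in; the wavelet, epoch length, and placeholder data are assumptions.

```python
# Sketch: discrete-wavelet-transform features from EEG epochs, classified by
# a small MLP. L-BFGS substitutes for Levenberg-Marquardt training.
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def dwt_features(epoch: np.ndarray, wavelet: str = "db4", level: int = 5):
    coeffs = pywt.wavedec(epoch, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])  # sub-band energies

epochs = np.random.randn(100, 512)            # placeholder occipital EEG epochs
X = np.array([dwt_features(e) for e in epochs])
y = np.random.randint(0, 2, 100)              # ON / OFF targets (placeholder)
clf = MLPClassifier(hidden_layer_sizes=(10,), solver="lbfgs",
                    max_iter=2000, random_state=0).fit(X, y)
```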

Keywords: brain computer interface, electroencephalography, steady-state visual evoked potential, wavelet transform, neural network

Procedia PDF Downloads 317
2863 Meta Root ID Passwordless Authentication Using ZKP Bitcoin Protocol

Authors: Saransh Sharma, Atharv Dekhne

Abstract:

Passwords stored on central services and hashed are prone to cyberattacks and hacks. Given these weaknesses, there is a need to eliminate character-based authentication protocols, which would ultimately benefit developers and end-users alike. A secure replacement for this conventional but antiquated protocol is passwordless authentication. The meta root.id system creates a public/private key pair, of which only the user can access the private key. After signing with the private key, the user sends the information over an API to the server, which checks its validity against the public key and grants access accordingly.
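
A minimal sketch of such a challenge-response flow is shown below with the pyca/cryptography library: the private key signs a server nonce, and the server verifies it with the registered public key. The curve choice (secp256k1, matching the bitcoin theme) and the flow details are assumptions.

```python
# Sketch of passwordless challenge-response auth: the private key never
# leaves the user; the server holds only the public key.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.exceptions import InvalidSignature

private_key = ec.generate_private_key(ec.SECP256K1())   # stays with the user
public_key = private_key.public_key()                   # registered server-side

challenge = b"server-issued nonce"
signature = private_key.sign(challenge, ec.ECDSA(hashes.SHA256()))

try:  # server side: verify the signature against the registered public key
    public_key.verify(signature, challenge, ec.ECDSA(hashes.SHA256()))
    print("access granted")
except InvalidSignature:
    print("access denied")
```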

Keywords: passwordless, OAuth, bitcoin, ZKP, SIN, BIP

Procedia PDF Downloads 69
2862 Wasting Human and Computer Resources

Authors: Mária Csernoch, Piroska Biró

Abstract:

The legends about "user-friendly" and "easy-to-use" birotical tools (computer-related office tools) have been spreading and misleading end-users. This has led to an extremely high number of incorrect documents, causing serious financial losses in the creating, modifying, and retrieving processes. Our research proved that there are at least two sources of this underachievement. (1) The lack of a definition of correctly edited and formatted documents: consequently, end-users do not know whether their methods and results are correct, and are so unaware of their own ignorance that they cannot realize their lack of knowledge. (2) The end-users' problem-solving methods: we found that in non-traditional programming environments, end-users apply, almost exclusively, surface-approach metacognitive methods to carry out their computer-related activities, which have proved less effective than deep-approach methods. Based on these findings, we have developed deep-approach methods that are based on and adapted from traditional programming languages. In this study, we focus on the most popular type of birotical document, the text-based document. We provide a definition of correctly edited text and, based on this definition, adapt the debugging method known from programming. According to the method, before any real text editing, a thorough debugging of already existing texts and a categorization of errors are carried out. In this way, in advance of real text editing, users learn the requirements of text-based documents and of correctly formatted text. The method has proved much more effective than the previously applied surface-approach methods. Its advantages are that real text handling requires far fewer human and computer resources than clicking aimlessly in the GUI (graphical user interface), and that data retrieval is much more effective than from error-prone documents.

Keywords: deep approach metacognitive methods, error-prone birotical documents, financial losses, human and computer resources

Procedia PDF Downloads 367
2861 Power Allocation in User-Centric Cell-Free Massive Multiple-Input Multiple-Output Systems with Limited Fronthaul Capacity

Authors: Siminfar Samakoush Galougah

Abstract:

In this paper, we study two power allocation problems for an uplink user-centric (UC) cell-free massive multiple-input multiple-output (CF-mMIMO) system, in which each access point (AP) is connected to a central processing unit (CPU) via a fronthaul link with limited capacity. To use the fronthaul capacity efficiently, two strategies for transmitting signals from the APs to the CPU are employed: compress-forward-estimate (CFE) and estimate-compress-forward (ECF). The capacity of these strategies in user-centric CF-mMIMO is derived. We then solve the two power allocation problems, with minimum spectral efficiency (SE) maximization and sum-SE maximization objectives, for the ECF and CFE strategies.

Keywords: cell-free massive MIMO, limited capacity fronthaul, spectral efficiency

Procedia PDF Downloads 35
2860 Interface Designer as Cultural Producer: A Dialectic Materialist Approach to the Role of Visual Designer in the Present Digital Era

Authors: Cagri Baris Kasap

Abstract:

In this study, how interface designers can be viewed as producers of culture in the current era is interrogated from a critical theory perspective. Walter Benjamin was a German Jewish literary critical theorist who, during the 1930s, was engaged in opposing and criticizing the Nazi use of art and media. "The Author as Producer" is an essay that Benjamin read at the Communist Institute for the Study of Fascism in Paris. In this essay, Benjamin relates directly to the dialectics between base and superstructure and argues that authors, normally placed within the superstructure, should consider how writing and publishing are production and directly related to the base. Through it, he discusses what it could mean to see the author as producer of his own text, as a producer of writing understood as an ideological construct that rests on the apparatus of production and distribution. Benjamin concludes that the author must write in ways that relate to the conditions of production; he must do so in order to prepare his readers to become writers, and even make this possible for them by engineering an "improved apparatus", working to turn consumers into producers and collaborators. In today's world, it has become a leading business model of Web 2.0 services run by multinational Internet technology and culture industries like Amazon, Apple, and Google to transform readers, spectators, consumers, or users into collaborators and co-producers through platforms such as Facebook, YouTube, and Amazon's CreateSpace Kindle Direct Publishing print-on-demand, e-book, and publishing platforms. However, the way this transformation happens is tightly controlled and monitored by combinations of software and hardware. In these global market monopolies, it has become increasingly difficult to get insight into how one's writing and collaboration are used, captured, and capitalized as a user of Facebook or Google. In the lens of this study, this criticism could very well be considered by digital producers, or even by the mass of collaborators, in contemporary social networking software. How do software and design incorporate users and their collaboration? Are users truly empowered, put in a position where they can understand the apparatus and how their collaboration is part of it? Or has the apparatus become a means against the producers? When using corporate systems like Google and Facebook, iPhone and Kindle, without any control over the means of production, which is closed off by opaque interfaces and licenses that limit our rights of use and ownership, we are already the collaborators that Benjamin calls for. The iPhone and the Kindle, for example, combine a specific use of technology to distribute the relations between the "authors" and the "prodUsers" in ways that secure their monopolistic business models by limiting the potential of the technology.

Keywords: interface designer, cultural producer, Walter Benjamin, materialist aesthetics, dialectical thinking

Procedia PDF Downloads 122
2859 A Non-Iterative Shape Reconstruction of an Interface from Boundary Measurement

Authors: Mourad Hrizi

Abstract:

In this paper, we study the inverse problem of reconstructing an interior interface D appearing in the elliptic partial differential equation Δu + χ(D)u = 0 from knowledge of boundary measurements. This problem arises from a semiconductor transistor model. We propose a new shape reconstruction procedure based on the Kohn-Vogelius formulation and the topological sensitivity method. The inverse problem is formulated as a topology optimization one, and a topological sensitivity analysis is derived for the corresponding cost function. The unknown subdomain D is reconstructed using a level-set curve of the topological gradient. Finally, we give several examples to show the viability of our proposed method.
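
For orientation, a standard form of the Kohn-Vogelius cost function compares the solutions of two auxiliary problems, one driven by the measured Dirichlet datum and one by the measured Neumann datum; the exact functional used in the paper is not spelled out here, so the following is the usual textbook form and should be read as an assumption:

```latex
J(D) = \int_{\Omega} \left| \nabla u_{D} - \nabla u_{N} \right|^{2} \,\mathrm{d}x ,
```

where u_D and u_N both satisfy Δu + χ(D)u = 0 in Ω, with u_D matching the Dirichlet data and u_N the Neumann data; J(D) vanishes exactly when the trial interface reproduces both boundary measurements.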

Keywords: inverse problem, topological optimization, topological gradient, Kohn-Vogelius formulation

Procedia PDF Downloads 223
2858 Pro-BluCRM: A Proactive Customer Relationship Management System Using Bluetooth

Authors: Mohammad Alawairdhi

Abstract:

Customer relationship management (CRM) started gaining attention as late as the 1990s, and efforts to define the domain's precise specifications have been ongoing since then; there is as yet no single agreed-upon definition. However, a predominant majority perceives CRM as a mechanism for enhancing interaction with customers, thereby strengthening the relationship between a business and its clients. From the perspective of information technology (IT) companies, CRM systems can be viewed as software products or services that automate the marketing, selling, and servicing functions of an organization. In this paper, we propose a Bluetooth-enabled CRM system for small- and medium-scale organizations. In the proposed system, Bluetooth works as an automatic identification token in addition to its common use as a communication channel. The system comprises a server side accompanied by user-interface support for both the client and server sides. The system has been tested in two environments, and users have cited ease of use, convenience, and understandability as major advantages of the proposed solution.

Keywords: customer relationship management, CRM, bluetooth, automatic identification token

Procedia PDF Downloads 327
2857 Identification of EEG Attention Level Using Empirical Mode Decompositions for BCI Applications

Authors: Chia-Ju Peng, Shih-Jui Chen

Abstract:

This paper proposes a method to discriminate electroencephalogram (EEG) signals recorded in different concentration states using empirical mode decomposition (EMD). A brain-computer interface (BCI), also called a brain-machine interface, is a direct communication pathway between the brain and an external device that bypasses the inherent pathways such as the peripheral nervous system and skeletal muscles. Attention level is a common index used as a control signal in BCI systems. EEG signals acquired from people paying attention or relaxing, respectively, are decomposed into a set of intrinsic mode functions (IMFs) by EMD. Fast Fourier transform (FFT) analysis is then applied to each IMF to obtain its frequency spectrum. By observing the power spectrums of the IMFs, the proposed method identifies EEG attention level between different concentration states better than the original EEG signals do. The band power of IMF3 is the most discriminative, especially in the β band, which corresponds to being fully awake and generally alert. The signal processing method and the results of this experiment pave a new way for BCI robotic systems using an attention-level control strategy. The integrated signal processing method reveals appropriate information for discriminating attention from relaxation, contributing to enhanced BCI performance.
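
The sketch below mirrors the described processing chain: EMD into IMFs, then FFT band power per IMF, here summed over the β band. The sampling rate, epoch length, and band edges (13-30 Hz) are assumptions.

```python
# Sketch: EMD -> IMFs -> per-IMF FFT band power in the beta band.
import numpy as np
from PyEMD import EMD  # pip package "EMD-signal"

fs = 256.0                               # sampling rate (assumed)
signal = np.random.randn(int(4 * fs))    # placeholder 4-second EEG epoch

imfs = EMD().emd(signal)                 # intrinsic mode functions
freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
beta = (freqs >= 13) & (freqs <= 30)     # beta band: fully awake, alert

for i, imf in enumerate(imfs, start=1):
    power = np.abs(np.fft.rfft(imf)) ** 2
    print(f"IMF{i} beta-band power: {power[beta].sum():.2f}")
```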

Keywords: biomedical engineering, brain computer interface, electroencephalography, rehabilitation

Procedia PDF Downloads 375
2856 A User Centred Based Approach for Designing Everyday Product: A Case Study of an Alarm Clock

Authors: Obokhai Kess Asikhia

Abstract:

This work explores design concept generation by understanding user needs through observation and interview. The aim is to examine several principles and guidelines for obtaining evidence by observing how users interact with the targeted product and interviewing them to acquire deep insights into their needs. With the help of quality function deployment (QFD), the needs identified while users interacted with the product were ranked using the normalised weighting approach. Furthermore, a low-fidelity prototype of the alarm clock was developed with a view to addressing the identified needs. Finally, the low-fidelity prototype design was evaluated against two designs already existing in the market in a study involving 30 participants. Preliminary results reveal that the majority of participants rated the performance of the new prototype higher than that of the existing alarm clocks used in the study.
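
A tiny sketch of the normalised weighting step is given below: raw importance scores for identified needs are scaled so the weights sum to one. The needs and scores are hypothetical.

```python
# Normalised weighting of identified user needs (hypothetical data):
# each weight is the raw score divided by the total of all scores.
raw_scores = {"easy to set": 9, "readable at night": 8,
              "gentle wake sound": 6, "battery backup": 4}
total = sum(raw_scores.values())
weights = {need: score / total for need, score in raw_scores.items()}
for need, w in sorted(weights.items(), key=lambda kv: -kv[1]):
    print(f"{need}: {w:.2f}")   # ranked, normalised need weights
```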

Keywords: design concept, low fidelity prototype, normalised weighting approach, quality function deployment, user needs

Procedia PDF Downloads 161
2855 Discovering User Behaviour Patterns from Web Log Analysis to Enhance the Accessibility and Usability of Website

Authors: Harpreet Singh

Abstract:

Finding relevant information on the World Wide Web is becoming more challenging day by day. Web usage mining is used for the extraction of relevant and useful knowledge, such as user behaviour patterns, from web access log records. The web access log records all the requests for individual files that users have made from a website. Web usage mining is important for customer relationship management (CRM), as it can ensure customer satisfaction as far as the interaction between the customer and the organization is concerned. It is also helpful for improving website structure or design according to users' requirements, by analyzing the access log file of a website with a log analyzer tool. The focus of this paper is to enhance the accessibility and usability of a guitar-selling website by analyzing its access log with the Deep Log Analyzer tool. The results show that the largest number of users is from the United States and that they use the Opera 9.8 web browser and the Windows XP operating system.
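
For orientation, the sketch below performs the same kind of tallying a log analyzer automates, parsing access-log lines and counting requested pages and user-agent strings; the Apache combined log format and the file name are assumptions.

```python
# Sketch: parse access-log lines (Apache combined format assumed) and tally
# requested paths and user-agent strings.
import re
from collections import Counter

LINE = re.compile(r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
                  r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) '
                  r'\S+ "(?P<referer>[^"]*)" "(?P<agent>[^"]*)"')

paths, agents = Counter(), Counter()
with open("access.log") as fh:        # hypothetical log file
    for line in fh:
        m = LINE.match(line)
        if m:
            paths[m["path"]] += 1
            agents[m["agent"]] += 1
print(paths.most_common(10))          # most requested pages
print(agents.most_common(5))          # dominant browser/OS strings
```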

Keywords: web usage mining, web mining, log file, data mining, deep log analyzer

Procedia PDF Downloads 228
2852 Evaluation of Corrosion by Impedance Spectroscopy of Embedded Steel in an Alternative Concrete Exposed to Chloride Ion

Authors: E. Ruíz, W. Aperador

Abstract:

This article evaluates the protective effect of alternative concretes obtained from fly ash and iron and steel slag, mixed in binary form and placed on ASTM A706 structural steel. The study was conducted comparatively with specimens exposed to natural conditions free of chloride ion. The effect of chloride ion on the specimens was generated in accelerated form under controlled conditions (3.5% NaCl and 25 °C). The impedance data were acquired over a range of 1 mHz to 100 kHz. At high frequencies the response of the exposure-medium/concrete interface is found, and at low frequencies the response of the concrete-steel interface.

Keywords: alternative concrete, corrosion, alkaline activation, impedance spectroscopy

Procedia PDF Downloads 339
2853 Problem of Services Selection in Ubiquitous Systems

Authors: Malika Yaici, Assia Arab, Betitra Yakouben, Samia Zermani

Abstract:

Ubiquitous computing is nowadays a reality through the networking of a growing number of computing devices. It allows providing users with context-aware information and services in a heterogeneous environment, anywhere and anytime. Selecting the best context-aware service among many available services and providers is a tedious problem. In this paper, a service selection method based on the constraint satisfaction problem (CSP) formalism is proposed. The services are treated as variables and domains, and the user context, preferences, and provider characteristics are treated as constraints. The backtrack algorithm is used to solve the problem and find the best service and provider matching the user requirements. Even though this algorithm has exponential complexity, its use guarantees that the service that best matches the user requirements will be found. A comparison of the proposed method with existing solutions concludes the paper.
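
A toy version of the formulation is sketched below: services are variables, candidate providers are domains, and constraints encode user context and preferences; backtracking returns the first consistent assignment. All data here is hypothetical.

```python
# Toy backtracking CSP: variables = services, domains = candidate providers,
# constraints = user context and preferences (all hypothetical).
def backtrack(assignment, variables, domains, constraints):
    if len(assignment) == len(variables):
        return assignment                          # all services selected
    var = next(v for v in variables if v not in assignment)
    for value in domains[var]:
        candidate = {**assignment, var: value}
        if all(c(candidate) for c in constraints):
            result = backtrack(candidate, variables, domains, constraints)
            if result is not None:
                return result
    return None                                    # dead end: backtrack

variables = ["printing", "display"]
domains = {"printing": ["hall_printer", "lab_printer"],
           "display": ["wall_screen", "projector"]}
constraints = [
    lambda a: a.get("printing") != "lab_printer",            # user location
    lambda a: not (a.get("printing") == "hall_printer" and   # provider clash
                   a.get("display") == "projector"),
]
print(backtrack({}, variables, domains, constraints))
```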

Keywords: ubiquitous computing, services selection, constraint satisfaction problem, backtrack algorithm

Procedia PDF Downloads 218
2852 User Authentication Using Graphical Password with Sound Signature

Authors: Devi Srinivas, K. Sindhuja

Abstract:

This paper presents an architecture to improve surveillance applications based on the usage of the service-oriented paradigm, with smartphones as user terminals, allowing dynamic application composition and increasing the flexibility of the system. Building on research results in moving-object detection on video sequences, the movement of people is tracked using video surveillance. The moving object is identified using the image subtraction method: the background image is subtracted from the foreground image, and from this difference the moving object is derived. A threshold value is calculated, and by applying the background subtraction algorithm with this threshold, the moving frame is identified and its movement tracked; hence, the movement of the object is identified accurately. The paper thus describes a low-cost, intelligent, mobile-phone-based wireless video surveillance solution using moving-object recognition technology, which can be useful in various security systems and for environmental surveillance. The fundamental rule of moving-object detection is given, and then a self-adaptive background representation is detailed that updates automatically and in a timely manner to adapt to the slow and slight changes of normal surroundings. When the subtraction of the present captured image and the background reaches a certain threshold, a moving object is deemed to be in the current view, and the mobile phone automatically notifies the central control unit or the user through SMS (Short Message Service). The main advantage of this system is that when an unknown image is captured, the system alerts the user automatically by sending an SMS to the user's mobile.
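
The core frame-differencing step described above can be sketched in a few lines of OpenCV; the threshold value, pixel-count trigger, and camera source are assumptions, and the SMS dispatch is left as a comment.

```python
# Sketch of background subtraction with a fixed threshold; the first frame
# serves as the background, and a large changed-pixel count flags motion.
import cv2

cap = cv2.VideoCapture(0)                       # camera (or a video file path)
_, background = cap.read()
background = cv2.cvtColor(background, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, background)        # |current - background|
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    if cv2.countNonZero(mask) > 5000:           # enough changed pixels
        print("moving object detected")         # here: trigger the SMS alert
cap.release()
```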

Keywords: security, graphical password, persuasive cued click points

Procedia PDF Downloads 519
2851 Non-Uniform Filter Banks-Based Minimum Distance to Riemannian Mean Classification in Motor Imagery Brain-Computer Interface

Authors: Ping Tan, Xiaomeng Su, Yi Shen

Abstract:

Motion intention in a motor imagery brain-computer interface is identified by classifying the event-related desynchronization (ERD) and event-related synchronization (ERS) characteristics of the sensorimotor rhythm (SMR) in EEG signals. When the subject imagines moving different limbs or different parts of the body, the rhythm components and bandwidth change, and this varies from person to person. Finding the effective sensorimotor frequency band of a subject is therefore directly related to the classification accuracy of the brain-computer interface. To solve this problem, this paper proposes a minimum-distance-to-Riemannian-mean classification method based on non-uniform filter banks. During the training phase, the EEG signals are first decomposed into multiple signals of different bandwidths using multiple band-pass filters; the spatial covariance characteristics of each frequency-band signal are then computed as feature vectors. These feature vectors are classified by the MDRM (minimum distance to Riemannian mean) method, and cross-validation is employed to obtain the effective sensorimotor frequency bands. During the test phase, the test signals are filtered by the band-pass filters of the effective sensorimotor frequency bands, and the extracted spatial covariance feature vectors are classified using MDRM. Experiments on the BCI Competition IV 2a dataset show that the proposed method is superior to other classification methods.
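
A hedged sketch of the training-phase pipeline is given below, using SciPy band-pass filters, spatial covariance estimation, and pyriemann's MDM (minimum distance to mean) classifier; the band edges and placeholder data are assumptions, and in practice the per-band scores would come from cross-validation.

```python
# Sketch: filter bank -> spatial covariances -> Riemannian MDM per band.
import numpy as np
from scipy.signal import butter, filtfilt
from pyriemann.estimation import Covariances
from pyriemann.classification import MDM

fs = 250.0
bands = [(4, 8), (8, 13), (13, 30)]           # non-uniform filter bank (assumed)
X = np.random.randn(40, 22, 500)              # trials x channels x samples
y = np.random.randint(0, 2, 40)               # imagined left/right hand labels

for lo, hi in bands:
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="bandpass")
    Xf = filtfilt(b, a, X, axis=-1)           # filter every trial and channel
    covs = Covariances().fit_transform(Xf)    # spatial covariance features
    score = MDM().fit(covs, y).score(covs, y) # cross-validate in practice
    print(f"{lo}-{hi} Hz band accuracy: {score:.2f}")
```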

Keywords: non-uniform filter banks, motor imagery, brain-computer interface, minimum distance to Riemannian mean

Procedia PDF Downloads 93
2850 Low Pricing Strategy of Forest Products in Community Forestry Program: Subsidy to the Forest Users or Loss of Economy?

Authors: Laxuman Thakuri

Abstract:

Community-based forest management is often glorified as one of the best forest management alternatives in developing countries like Nepal. It is also believed that transferring forest management authority to local communities is decisive for taking efficient decisions, maximizing forest benefits, and improving people's livelihoods. The community forestry of Nepal likewise aims to maximize forest benefits, share them among the user households, and improve their livelihood. However, how local communities fix the prices of forest products, and how the pricing made by forest user groups affects equitable benefit-sharing among user households and the livelihood improvement objective, remain largely unexamined by researchers and policy-makers alike. This study examines the local pricing system of forest products in lowland community forestry and its effects on equitable benefit-sharing and livelihood improvement objectives. The study discovered that forest user groups fix the prices of forest products based on three criteria: i) costs incurred in harvesting, ii) office operation costs, and iii) livelihood improvement costs for community development and income-generating activities. Since user households have heterogeneous socio-economic conditions, the forest user groups have applied a low pricing strategy even for high-value forest products, so that the access of socio-economically worse-off households can be increased. However, the results of forest product distribution show that, as a result of the low pricing strategy, the access of socio-economically better-off households has been increasing at a higher rate than that of worse-off households, creating inequality. The low pricing strategy was also found to work against the livelihood improvement objectives. The study suggests revising the forest product pricing system in community forest management and reforming community forestry policy as well.

Keywords: community forestry, forest products pricing, equitable benefit-sharing, livelihood improvement, Nepal

Procedia PDF Downloads 280
2849 Performance Analysis of Heterogeneous Cellular Networks with Multiple Connectivity

Authors: Sungkyung Kim, Jee-Hyeon Na, Dong-Seung Kwon

Abstract:

Future mobile networks beyond the 5th generation will be characterized by gains in capacity of up to one thousand times, connections for at least one hundred billion devices, and a user experience with extremely low latency and response times. To approach these capacity and reliability requirements, advanced technologies have been studied, such as multiple connectivity, small cell enhancement, heterogeneous networking, and advanced interference and mobility management. This paper focuses on multiple connectivity in heterogeneous cellular networks. We investigate coverage and user throughput performance in several deployment scenarios. Using the stochastic geometry approach, the SINR distributions and coverage probabilities are derived for the case of dual connectivity. To compare the user throughput enhancement among the deployment scenarios, we also calculate the spectral efficiency and discuss our results.
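
For intuition, the Monte Carlo sketch below estimates a coverage probability for a single connection to the nearest base station of a Poisson point process; the density, path-loss exponent, noise, and SINR threshold are assumptions, and the dual-connectivity analysis of the paper is not reproduced.

```python
# Monte Carlo coverage estimate: user at the origin, base stations drawn from
# a PPP, Rayleigh fading, coverage = SINR of the nearest BS above a threshold.
import numpy as np

rng = np.random.default_rng(0)
lam, alpha, noise, theta = 1e-5, 4.0, 1e-12, 1.0  # density/m^2, exponent, W, SINR thr.
area, trials, covered = 2000.0, 2000, 0           # square side (m), MC trials

for _ in range(trials):
    n = rng.poisson(lam * area ** 2)
    if n == 0:
        continue
    xy = rng.uniform(-area / 2, area / 2, size=(n, 2))  # BS positions
    d = np.hypot(xy[:, 0], xy[:, 1])
    h = rng.exponential(1.0, n)                          # Rayleigh fading power
    rx = h * d ** (-alpha)                               # received powers
    k = np.argmin(d)                                     # serving (nearest) BS
    sinr = rx[k] / (rx.sum() - rx[k] + noise)
    covered += sinr > theta
print(f"coverage probability ~ {covered / trials:.2f}")
```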

Keywords: heterogeneous networks, multiple connectivity, small cell enhancement, stochastic geometry

Procedia PDF Downloads 303
2848 Automated Testing to Detect Instance Data Loss in Android Applications

Authors: Anusha Konduru, Zhiyong Shan, Preethi Santhanam, Vinod Namboodiri, Rajiv Bagai

Abstract:

Mobile applications are increasing in significant numbers, each addressing the requirements of many users. However, quick development and enhancement cycles result in many underlying defects. Android apps create and handle a large variety of "instance" data that has to persist across runs, such as the current navigation route, workout results, antivirus settings, or game state. Due to the nature of Android, an app can be paused, sent into the background, or killed at any time. If the instance data is not saved and restored between runs, then in addition to data loss, partially saved or corrupted data can crash the app upon resume or restart. It is difficult for the programmer to manually test this issue for all activities, which leads to data loss: the data entered by the user is not saved when an interruption occurs. This degrades the user experience, because the user must reenter the information after every interruption. Automated testing to detect such data loss is therefore important for improving the user experience. This research proposes DroidDL, a data loss detector for Android, which detects instance data loss in a given Android application. We tested 395 applications and found 12 with data loss issues. The approach proved highly accurate and reliable at finding apps with this defect and can be used by Android developers to avoid such errors.
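
An illustrative fragment of this kind of automated check (not DroidDL itself) is sketched below using standard adb commands: enter text, kill the app, relaunch, and test whether the text survived. The package and activity names are placeholders.

```python
# Hypothetical instance-data-loss check driven through adb; package/activity
# names are placeholders, not from the paper.
import subprocess

PKG, ACT = "com.example.notes", ".MainActivity"          # hypothetical app

def adb(*args) -> str:
    return subprocess.run(["adb", "shell", *args],
                          capture_output=True, text=True).stdout

adb("am", "start", "-n", f"{PKG}/{ACT}")                 # launch activity
adb("input", "text", "draft-to-preserve")                # simulate user input
adb("am", "kill", PKG)                                   # background kill
adb("am", "start", "-n", f"{PKG}/{ACT}")                 # relaunch
ui = adb("uiautomator", "dump", "/dev/tty")              # dump view hierarchy
print("data kept" if "draft-to-preserve" in ui else "instance data lost")
```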

Keywords: Android, automated testing, activity, data loss

Procedia PDF Downloads 216
2847 The Design of Information Technology System for Traceability of Thailand’s Tubtimjun Roseapple

Authors: Pimploi Tirastittam, Phutthiwat Waiyawuththanapoom, Sawanath Treesathon

Abstract:

As several countries import agricultural products from Thailand, those countries demand that Thailand establish a traceability system. A traceability system is a very effective tool for reducing risk in the supply chain, as it helps supply chain stakeholders identify defect points, which reduces the cost of operating the supply chain. This research aims to design a traceability system for the Tubtimjun roseapple for export to China, and it is qualitative research. The data were collected from experts in the Tubtimjun roseapple and fruit-exporting industries and used to design the traceability system. The design of the Tubtimjun roseapple traceability system followed supply chain theory, from the upstream to the downstream of the supply chain, to support the process and conditions of exporting; it covered database design, system architecture, user interface design, and the information technology of the traceability system.

Keywords: design, information technology system, traceability, tubtimjun roseapple

Procedia PDF Downloads 147