Search results for: tangible user interface
Use of FWD in Determination of Bonding Condition of Semi-Rigid Asphalt Pavement
Authors: Nonde Lushinga, Jiang Xin, Danstan Chiponde, Lawrence P. Mutale
Abstract:
In this paper, the falling weight deflectometer (FWD) was used to determine the bonding condition of a newly constructed semi-rigid base pavement. Using the Evercalc back-calculation computer programme, it was possible to quickly and accurately determine the structural condition of the pavement system from FWD test data. The bonding condition of the pavement layers was determined from shear stresses and strains (relative horizontal displacements) at the interfaces of the pavement layers, calculated with the BISAR 3.0 pavement computer programme. Thus, by using non-linear layered elastic theory, a pavement structure is analysed in the same way as other civil engineering structures. From non-destructive FWD testing, the bonding condition of the pavement layers was quantified from the soundly based principles of Goodman's constitutive model, producing the shear reaction modulus (Ks), which gives an indication of the bonding state of the pavement layers. Furthermore, the Tack Coat Failure Ratio (TFR), which has long been used in pavement evaluation in the USA, was also applied in order to validate the study. According to research [39], the interface condition between two asphalt layers is determined by the TFR, the ratio of the stiffness of the top asphalt layer to the stiffness of the second asphalt layer (E1/E2) in a slipped pavement. The TFR gives an indication of the strength of the tack coat, which is the main determinant of interlayer slipping. The criterion is that a TFR greater than or equal to 1 indicates full bond, while a TFR of 0 indicates full slip. The calculations gave a TFR value of 1.81, re-affirming that the pavement under study was in a state of full bond because the value was greater than 1.
It was concluded that the FWD can be used to determine the bonding condition of both existing and newly constructed pavements. Keywords: falling weight deflectometer (FWD), back-calculation, semi-rigid base pavement, shear reaction modulus
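The TFR criterion described in the abstract can be sketched as a short, hedged example. The moduli below are illustrative placeholders, not the study's back-calculated values; only the E1/E2 ratio and the bond/slip thresholds come from the text.

```python
def tack_coat_failure_ratio(e_top: float, e_second: float) -> float:
    """Tack Coat Failure Ratio: stiffness of the top asphalt layer
    over the stiffness of the second asphalt layer (E1/E2)."""
    return e_top / e_second

def bond_state(tfr: float) -> str:
    # Criterion from the abstract: TFR >= 1 indicates full bond,
    # TFR == 0 indicates full slip; intermediate values, partial slip.
    if tfr >= 1.0:
        return "full bond"
    if tfr == 0.0:
        return "full slip"
    return "partial slip"

# Illustrative back-calculated layer moduli (MPa); the ratio 1.81
# matches the value reported in the study.
tfr = tack_coat_failure_ratio(5430.0, 3000.0)
print(tfr, bond_state(tfr))  # 1.81 -> full bond
```

A TFR of 1.81 falls on the full-bond side of the criterion, consistent with the study's conclusion.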
Modelling the Tensile Behavior of Plasma Sprayed Freestanding Yttria Stabilized Zirconia Coatings
Authors: Supriya Patibanda, Xiaopeng Gong, Krishna N. Jonnalagadda, Ralph Abrahams
Abstract:
Yttria stabilized zirconia (YSZ) is used as a top coat in thermal barrier coatings for high-temperature turbine/jet engine applications. The mechanical behavior of YSZ depends on microstructural features like crack density and porosity, which are a result of the coating method. However, experimentally ascertaining their individual effects is difficult due to the inherent challenges involved, such as material synthesis and handling. The current work deals with the development of a phenomenological model to replicate the tensile behavior of air plasma sprayed YSZ obtained from experiments. Initially, uniaxial tensile experiments were performed on freestanding YSZ coatings ~300 µm thick with different crack densities and porosities. The coatings exhibited nonlinear behavior and also a huge variation in strength values. With the obtained experimental tensile curve as a base and crack density and porosity as prime variables, a phenomenological model was developed in ABAQUS, with a new user material defined through a VUMAT subroutine. The relation between the tensile stress and the crack density was empirically established. Further, a parametric study was carried out to investigate the effect of the individual features on the non-linearity in these coatings. This work makes it possible to generate new coating designs by varying the key parameters and predicting the mechanical properties through simulation, thereby minimizing experiments. Keywords: crack density, finite element method, plasma sprayed coatings, VUMAT
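A stiffness-degradation law of the general kind the abstract describes can be sketched as follows. The functional form, the modulus, and every coefficient here are assumptions for illustration only, not the authors' calibrated VUMAT model; the sketch only shows the idea of stiffness degrading with crack density and porosity while the stress-strain response stays nonlinear.

```python
import numpy as np

# Hypothetical phenomenological tensile law: effective stiffness
# degrades linearly with crack density (rho_c) and porosity (p),
# and a saturating exponential term gives the nonlinear response.
E0 = 40e3  # assumed intact-material modulus, MPa (illustrative)

def tensile_stress(strain, rho_c, porosity, a=0.5, b=2.0, eps0=2e-3):
    """Stress (MPa) for a given strain; a, b, eps0 are made-up fit
    parameters standing in for an empirically established relation."""
    e_eff = E0 * (1.0 - a * rho_c) * (1.0 - b * porosity)
    return e_eff * eps0 * (1.0 - np.exp(-strain / eps0))

strain = np.linspace(0.0, 0.01, 5)
print(np.round(tensile_stress(strain, rho_c=0.3, porosity=0.1), 2))
```

In a parametric study of this shape, sweeping `rho_c` and `porosity` reproduces the qualitative trend of weaker, more compliant coatings at higher defect levels.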
Load Transfer of Steel Pipe Piles in Warming Permafrost
Authors: S. Amirhossein Tabatabaei, Abdulghader A. Aldaeef, Mohammad T. Rayhani
Abstract:
As permafrost continues to melt in the northern regions due to global warming, a soil-water mixture is left behind with drastically lower strength, a phenomenon that directly impacts the resilience of existing structures and infrastructure systems. The frozen soil-structure interaction, which in ice-poor soils is controlled by both interface shear and ice bonding, changes its nature into a solely frictional state. Adfreeze, the controlling mechanism in frozen soil-structure interaction, diminishes as the ground temperature approaches zero. The main purpose of this paper is to capture the altered behaviour of the frozen interface with respect to rising temperature, especially near melting states. A series of pull-out tests is conducted on model piles inside a cold room to study how the strength parameters are influenced by the phase change in ice-poor soils. Steel model piles, embedded in artificially frozen cohesionless soil, are subjected to sustained pull-out forces and constant rates of displacement to observe the creep behaviour and acquire load-deformation curves, respectively. Temperature, the main variable of interest, is increased from a lower limit of -10°C up to the point of melting. During different stages of the temperature rise, both skin deformations and temperatures are recorded at various depths along the pile shaft. Significant reduction of pull-out capacity and accelerated creep behaviour are found to be the primary consequences of rising temperature. By investigating the different pull-out capacities and deformations measured during the step-wise temperature change, characteristics of the transition from frozen to unfrozen soil-structure interaction are studied. Keywords: adfreeze, frozen soil-structure interface, ice-poor soils, pull-out capacity, warming permafrost
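The step-wise pull-out programme described above lends itself to a simple interpolation sketch: capacities measured at discrete ground temperatures can be interpolated to estimate capacity at intermediate temperatures. The capacities and temperatures below are invented for illustration, not the study's data; they only encode the reported qualitative trend of capacity dropping sharply toward the melting point.

```python
import numpy as np

# Hypothetical pull-out capacities (kN) at step-wise ground
# temperatures (deg C); illustrative numbers only.
temps = np.array([-10.0, -7.0, -4.0, -2.0, -0.5])
capacity = np.array([42.0, 35.0, 26.0, 15.0, 6.0])

def estimate_capacity(t_celsius: float) -> float:
    """Linear interpolation between test points; capacity falls
    steeply as the interface approaches 0 deg C."""
    return float(np.interp(t_celsius, temps, capacity))

print(estimate_capacity(-5.5))
```

Between tabulated points the estimate is piecewise linear, which is a reasonable first approximation when the temperature steps are small.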
Evaluation of Gesture-Based Password: User Behavioral Features Using Machine Learning Algorithms
Authors: Lakshmidevi Sreeramareddy, Komalpreet Kaur, Nane Pothier
Abstract:
Graphical passwords have existed for decades. Their major advantage is that they are easier to remember than an alphanumeric password. However, their disadvantage (especially for recognition-based passwords) is the smaller password space, making them more vulnerable to brute-force attacks. Graphical passwords are also highly susceptible to the shoulder-surfing effect. The gesture-based password method that we developed is a grid-free, template-free method. In this study, we evaluated gesture-based passwords for usability and vulnerability. The results of the study are significant. We developed a gesture-based password application for data collection. Two modes of data collection were used: creation mode and replication mode. In creation mode (Session 1), users were asked to create six different passwords and re-enter each password five times. In replication mode, users saw a password image created by some other user for a fixed duration of time. Three different durations, 5 seconds (Session 2), 10 seconds (Session 3), and 15 seconds (Session 4), were used to mimic the shoulder-surfing attack. After the timer expired, the password image was removed, and users were asked to replicate the password. A total of 74, 57, 50, and 44 users participated in Sessions 1, 2, 3, and 4, respectively. In this study, machine learning algorithms were applied to determine whether the person entering a password is a genuine user or an imposter. Five different machine learning algorithms were deployed to compare performance in user authentication: decision trees, linear discriminant analysis, the naive Bayes classifier, support vector machines (SVMs) with a Gaussian radial basis kernel function, and k-nearest neighbors. Gesture-based password features vary from one entry to the next, making it difficult to distinguish between a creator and an intruder for authentication.
For each password entered by the user, four features were extracted: password score, password length, password speed, and password size. All four features were normalized before being fed to a classifier. Three different classifiers were trained using data from all four sessions. Classifiers A, B, and C were trained and tested using data from the password creation session together with the password replication sessions with timers of 5 seconds, 10 seconds, and 15 seconds, respectively. The classification accuracies for Classifier A using the five ML algorithms are 72.5%, 71.3%, 71.9%, 74.4%, and 72.9%, respectively; for Classifier B, 69.7%, 67.9%, 70.2%, 73.8%, and 71.2%; and for Classifier C, 68.1%, 64.9%, 68.4%, 71.5%, and 69.8%. SVMs with a Gaussian radial basis kernel outperform the other ML algorithms for gesture-based password authentication. The results confirm that the shorter the duration of the shoulder-surfing attack, the higher the authentication accuracy. In conclusion, behavioral features extracted from gesture-based passwords lead to less vulnerable user authentication. Keywords: authentication, gesture-based passwords, machine learning algorithms, shoulder-surfing attacks, usability
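The pipeline described above, four normalized behavioral features fed to an RBF-kernel SVM, can be sketched as follows. The synthetic data below stands in for the study's features (score, length, speed, size); the cluster locations and spreads are assumptions made purely so the example runs end to end.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-ins for [score, length, speed, size]: genuine
# entries cluster tightly; imposter entries scatter more widely.
genuine = rng.normal(loc=[0.8, 12.0, 1.0, 5.0], scale=0.1, size=(100, 4))
imposter = rng.normal(loc=[0.5, 12.0, 1.6, 5.5], scale=0.4, size=(100, 4))
X = np.vstack([genuine, imposter])
y = np.array([1] * 100 + [0] * 100)  # 1 = genuine user, 0 = imposter

# Normalize the features, then fit an SVM with a Gaussian RBF kernel,
# the algorithm the study found to perform best.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)
print(clf.score(X, y))
```

In practice the evaluation would use a held-out split per session, mirroring the Classifier A/B/C design, rather than training accuracy.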
Exploratory Study of Individual User Characteristics That Predict Attraction to Computer-Mediated Social Support Platforms and Mental Health Apps
Authors: Rachel Cherner
Abstract:
Introduction: The current study investigates several user characteristics that may predict the adoption of digital mental health supports. The extent to which individual characteristics predict preferences for functional elements of computer-mediated social support (CMSS) platforms and mental health (MH) apps is relatively unstudied. Aims: The present study seeks to illuminate the relationship between broad user characteristics and perceived attraction to CMSS platforms and MH apps. Methods: Participants (n=353) were recruited using convenience sampling methods (i.e., digital flyers, email distribution, and online survey forums). The sample was 68% male and 32% female, with a mean age of 29. The participant racial and ethnic breakdown was 75% White, 7%, 5% Asian, and 5% Black or African American. Participants were asked to complete a 25-minute self-report questionnaire that included empirically validated measures assessing a battery of characteristics (i.e., subjective levels of anxiety/depression via the PHQ-9 (Patient Health Questionnaire, 9-item) and GAD-7 (Generalized Anxiety Disorder, 7-item); attachment style via the MAQ (Measure of Attachment Qualities); personality type via the TIPI (Ten-Item Personality Inventory); and growth mindset and mental help-seeking attitudes via the GM (Growth Mindset Scale) and MHSAS (Mental Help Seeking Attitudes Scale)), along with subsequent attitudes toward CMSS platforms and MH apps. Results: A stepwise linear regression was used to test whether user characteristics significantly predicted attitudes towards key features of CMSS platforms and MH apps. The overall regression was statistically significant (R² = .20, F(1, 344) = 14.49, p < .001). Conclusion: This original study examines the clinical and sociocultural factors influencing decisions to use CMSS platforms and MH apps. The findings provide valuable insight for increasing adoption of and engagement with digital mental health supports.
Fostering a growth mindset may be a method of increasing participant/patient engagement. In addition, CMSS platforms and MH apps may empower under-resourced and minority groups to gain basic access to mental health support. We do not assume this final model contains the best predictors of use; it is merely a preliminary step toward understanding the psychology and attitudes of CMSS platform/MH app users. Keywords: computer-mediated social support platforms, digital mental health, growth mindset, health-seeking attitudes, mental health apps, user characteristics
Co-payment Strategies for Chronic Medications: A Qualitative and Comparative Analysis at European Level
Authors: Pedro M. Abreu, Bruno R. Mendes
Abstract:
The management of pharmacotherapy and the process of dispensing medicines are becoming critical in clinical pharmacy due to the increasing incidence and prevalence of chronic diseases, the complexity and customization of therapeutic regimens, the introduction of innovative and more expensive medicines, the unbalanced relation between expenditure and revenue, and the lack of rationalization associated with medication use. For these reasons, co-payments emerged in Europe in the 1970s and have been applied in healthcare over the past few years. Co-payments lead to a rationing and rationalization of users' access to healthcare services and products and, simultaneously, to a qualification and improvement of those services and products for the end-user. This analysis, of hospital practices in particular and co-payment strategies in general, covered all the European regions and identified four reference countries that apply this tool repeatedly and with different approaches. The structure, content and adaptation of European co-payments were analyzed through 7 qualitative attributes and 19 performance indicators, with the results expressed in a scorecard. The analysis shows that the German models (total scores of 68.2% and 63.6% for the two elected co-payments) achieve greater compliance and effectiveness, the English models (total score of 50%) are more accessible, and the French models (total score of 50%) are better suited to the socio-economic and legal framework. Other European models did not show the same quality and/or performance and so were not taken as standards for the future design of co-payment strategies.
In this sense, co-payments can be seen not only as a strategy to moderate the consumption of healthcare products and services, but especially to improve them, as well as a strategy to increase the value that the end-user assigns to these services and products, such as medicines. Keywords: clinical pharmacy, co-payments, healthcare, medicines
Identification of Outliers in Flood Frequency Analysis: Comparison of Original and Multiple Grubbs-Beck Test
Authors: Ayesha S. Rahman, Khaled Haddad, Ataur Rahman
Abstract:
At-site flood frequency analysis is used to estimate flood quantiles when the at-site record length is reasonably long. In Australia, the FLIKE software has been introduced for at-site flood frequency analysis. The advantage of FLIKE is that, for a given application, the user can compare a number of the most commonly adopted probability distributions and parameter estimation methods relatively quickly using a Windows interface. The new version of FLIKE incorporates the multiple Grubbs and Beck test, which can identify multiple potentially influential low flows. This paper presents a case study considering six catchments in eastern Australia which compares two outlier identification tests (the original Grubbs and Beck test and the multiple Grubbs and Beck test) and two commonly applied probability distributions (Generalized Extreme Value (GEV) and Log Pearson type 3 (LP3)) using the FLIKE software. It has been found that the multiple Grubbs and Beck test, when used with the LP3 distribution, provides more accurate flood quantile estimates than the LP3 distribution used with the original Grubbs and Beck test. Between these two methods, the differences in flood quantile estimates have been found to be up to 61% for the six study catchments. It has also been found that the GEV distribution (with L-moments) and the LP3 distribution with the multiple Grubbs and Beck test provide quite similar results in most cases; however, a difference of up to 38% has been noted in the flood quantiles for an annual exceedance probability (AEP) of 1 in 100 for one catchment. These findings need to be confirmed with a greater number of stations across other Australian states. Keywords: floods, FLIKE, probability distributions, flood frequency, outlier
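For readers unfamiliar with the test being compared, the single (original) Grubbs-Beck low-outlier screen can be sketched as follows. The K_N approximation used here is the 10%-significance formula commonly quoted from Bulletin 17B practice; the flow record is invented for illustration, and the multiple-Grubbs-Beck variant in FLIKE iterates a related criterion rather than this single pass.

```python
import numpy as np

def grubbs_beck_low_outliers(flows):
    """Single Grubbs-Beck low-outlier screen on an annual peak record.

    Threshold = 10**(mean(log10 Q) - K_N * std(log10 Q)), with the
    10%-significance approximation
    K_N = -0.9043 + 3.345*sqrt(log10(N)) - 0.4046*log10(N).
    Flows below the threshold are flagged as potentially influential
    low flows.
    """
    q = np.asarray(flows, dtype=float)
    logq = np.log10(q)
    n = len(q)
    k_n = -0.9043 + 3.345 * np.sqrt(np.log10(n)) - 0.4046 * np.log10(n)
    threshold = 10 ** (logq.mean() - k_n * logq.std(ddof=1))
    return q[q < threshold], threshold

# Illustrative annual peak record (m^3/s) with one unusually low year
peaks = [820, 640, 910, 1200, 550, 730, 15, 990, 870, 1100, 760, 680]
low, thr = grubbs_beck_low_outliers(peaks)
print(low)
```

Flagged flows would then be censored or treated specially before fitting the LP3 distribution, which is why the choice of test changes the resulting quantile estimates.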
Assessment of the Interface Strength between High-Density Polyethylene Geomembrane and Expanded Polystyrene by the Direct Shear Test
Authors: Sergio Luiz da Costa Junior, Carolina Fofonka Palomino, Paulo Cesar Lodi
Abstract:
The use of lightweight fills is an effective solution for road works on soft-ground sites, such as Rio de Janeiro (RJ) and Santos (SP) on the southeastern Brazilian coast. The technique consists of replacing the topsoil with expanded polystyrene (EPS) geofoam, lined with a geomembrane to prevent attack by chemical products. Thus, knowing the interface shear strength of these materials is important in projects to avoid rupture of the system. The purpose of this paper is to compare the shear strength of geomembrane-EPS interfaces by the direct shear test. The tests were performed under dry and saturated conditions, and four kinds of 2.00 mm high-density polyethylene (HDPE) geomembranes were used, smooth and texturized, manufactured by the flat-die and blown-film processes. It was found that the shear strength is directly influenced by the roughness of the geomembrane, with the textured geomembranes showing a higher friction angle. The direct shear test in the saturated condition also showed a smaller friction angle than the non-wetted test. Keywords: geofoam, geomembrane, soft ground, shear strength
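The friction angles compared above come from fitting a Mohr-Coulomb envelope to direct-shear data; a minimal sketch of that reduction step is below. The normal and shear stresses are invented for illustration, not results from this study.

```python
import math

import numpy as np

# Hypothetical direct-shear results for one geomembrane-EPS interface:
# applied normal stresses and measured peak shear stresses (kPa).
normal = np.array([25.0, 50.0, 100.0, 200.0])
shear = np.array([14.0, 27.0, 52.0, 103.0])

# Fit the Mohr-Coulomb envelope tau = c + sigma * tan(phi); the slope
# gives the interface friction angle, the intercept the adhesion.
slope, adhesion = np.polyfit(normal, shear, 1)
phi_deg = math.degrees(math.atan(slope))
print(round(phi_deg, 1))
```

Repeating the fit for dry and saturated test series quantifies the friction-angle drop the abstract reports for wetted interfaces.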
A Comprehensive Survey on Machine Learning Techniques and User Authentication Approaches for Credit Card Fraud Detection
Authors: Niloofar Yousefi, Marie Alaghband, Ivan Garibay
Abstract:
With the increase in credit card usage, the volume of credit card misuse has also significantly increased, which may cause appreciable financial losses for both credit card holders and the financial organizations issuing credit cards. As a result, financial organizations are working hard on developing and deploying credit card fraud detection methods, in order to adapt to ever-evolving, increasingly sophisticated defrauding strategies and to identify illicit transactions as quickly as possible to protect themselves and their customers. Compounding the complex nature of such adverse strategies, credit card fraudulent activities are rare events compared to the number of legitimate transactions. Hence, the challenge of developing fraud detection methods that are accurate and efficient is substantially intensified and, as a consequence, credit card fraud detection has lately become a very active area of research. In this work, we provide a survey of current techniques most relevant to the problem of credit card fraud detection. We carry out our survey in two main parts. In the first part, we focus on studies utilizing classical machine learning models, which mostly employ traditional transactional features to make fraud predictions. These models typically rely on some static characteristics, such as what the user knows (knowledge-based methods) or what he/she has access to (object-based methods). In the second part of our survey, we review more advanced techniques of user authentication, which use behavioral biometrics to identify an individual based on his/her unique behavior while interacting with his/her electronic devices. These approaches rely on how people behave (instead of what they do), which cannot be easily forged.
By providing an overview of current approaches and the results reported in the literature, this survey aims to drive the future research agenda for the community in order to develop more accurate, reliable and scalable models of credit card fraud detection. Keywords: credit card fraud detection, user authentication, behavioral biometrics, machine learning, literature survey
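The rare-event nature of fraud that the survey highlights is usually the first practical obstacle for the classical models in its first part. A common mitigation is class reweighting, sketched here on simulated data; the feature distributions and class sizes are assumptions for illustration, not drawn from any real transaction dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score

rng = np.random.default_rng(1)

# Fraud is rare: simulate 990 legitimate and 10 fraudulent
# transactions described by two illustrative transactional features.
legit = rng.normal(0.0, 1.0, size=(990, 2))
fraud = rng.normal(2.5, 1.0, size=(10, 2))
X = np.vstack([legit, fraud])
y = np.array([0] * 990 + [1] * 10)  # 1 = fraud

# class_weight="balanced" upweights the rare fraud class so the
# classifier is not dominated by legitimate transactions.
clf = LogisticRegression(class_weight="balanced").fit(X, y)
print(recall_score(y, clf.predict(X)))
```

Without the reweighting, a classifier can reach 99% accuracy by predicting "legitimate" for everything, which is why recall on the fraud class is the metric to watch.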
Exploring the Relationship Between Past and Present Reviews: The Influence of User Generated Content on Future Hotel Guest Experience Perceptions
Authors: Sacha Joseph-Mathews, Leili Javadpour
Abstract:
In the tourism industry, hoteliers spend millions annually on marketing and positioning efforts for their respective hotels, all in an effort to create a specific image in the minds of consumers. Yet despite extensive efforts to seduce potential hotel guests with sophisticated advertising messages generated by hotel entities, consumers continue to mistrust corporate branding, preferring instead to place their trust in the reviews of their consumer peers. In today's complex and cluttered marketplace, online reviews can serve as a mediator for consumers who do not have actual knowledge of or experience with a brand but are in the process of deciding whether or not to engage in a consumption exercise. Traditionally, consumers have used online reviews as a source of comfort and confirmation of a product's or service's positioning. But today, very few customers make any purchase decisions without first researching existing user reviews, making reviews a necessity rather than a luxury in the purchase decision process. The influence of user generated content (UGC) is amplified in the tourism industry, as more than a third of potential hotel guests will not book a room without first reading a review. As corporate branding becomes less relevant and online reviews become more important, how much of consumers' stay expectations are dictated by existing UGC? Moreover, as hotel guests experience a hotel through the lens of an existing review, how much of their stay, and in turn their review, is influenced by the reviews they read? Ultimately, there is the potential for UGC to dictate what potential guests will be most critical about, or most focused on, during their stay. If UGC is a stronger influencer in the purchase decision process than corporate branding, does it not have the potential to dictate the entire stay experience by influencing the expectations of the guest before they arrive on the property?
For example, suppose a hotel is an eco-destination whose website branding focuses on sustainability and the retreat nature of the property, yet guest reviews constantly discuss how dissatisfying the service and food were, with no mention of nature or sustainability; will future reviews then focus primarily on the food? Using text analysis software to examine over 25,000 online reviews, we explore the extent to which new reviews are influenced by the wording used in previous reviews of a hotel property versus content generated by corporate positioning. Additionally, we investigate how distinct hotel-related UGC is across different types of tourism destinations. Our findings suggest that UGC can have a greater impact on future reviews than corporate branding, and that there is more cohesiveness across UGC of different types of hotel properties than anticipated. A model of User Generated Content Influence is presented, and the managerial impact of the power of online reviews to trump corporate branding and shape future user experiences is discussed. Keywords: user generated content, UGC, corporate branding, online reviews, hotels and tourism
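One simple way to operationalize "new reviews echo past reviews more than corporate positioning" is to compare lexical similarity, sketched below with TF-IDF and cosine similarity. The three texts are invented to mirror the eco-resort example above; the study's actual text-analysis software and corpus are not specified here.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Illustrative texts for a hypothetical eco-resort.
branding = "sustainable eco retreat surrounded by nature and tranquility"
past_reviews = "slow service and disappointing food, breakfast was cold"
new_review = "the food was bad and service painfully slow at dinner"

vec = TfidfVectorizer()
m = vec.fit_transform([branding, past_reviews, new_review])

# Does the new review's wording track the past reviews or the branding?
sim_to_branding = cosine_similarity(m[2], m[0])[0, 0]
sim_to_past = cosine_similarity(m[2], m[1])[0, 0]
print(sim_to_past > sim_to_branding)
```

Scaled up to 25,000 reviews per property, the same comparison yields a per-hotel score of how strongly UGC, rather than corporate positioning, shapes subsequent review content.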
The Effective Use of the Network in the Distributed Storage
Authors: Mamouni Mohammed Dhiya Eddine
Abstract:
This work aims at studying the exploitation of high-speed cluster networks for distributed storage. Parallel applications running on clusters require both high-performance communications between nodes and efficient access to the storage system. Many studies on network technologies have led to the design of dedicated architectures for clusters with very fast communications between computing nodes. Efficient distributed storage in clusters has essentially been developed by adding parallelization mechanisms so that the server(s) may sustain an increased workload. In this work, we propose to improve the performance of distributed storage systems in clusters by efficiently using the underlying high-performance network to access distant storage systems. The main question we address is: do high-speed cluster networks fit the requirements of transparent, efficient and high-performance access to remote storage? We show that storage requirements are very different from those of parallel computation. High-speed cluster networks were designed to optimize communications between the different nodes of a parallel application. We study their utilization in a very different context, storage in clusters, where client-server models are generally used to access remote storage (for instance NFS, PVFS or LUSTRE). Our experimental study, based on the use of the GM programming interface of MYRINET high-speed networks for distributed storage, raised several interesting problems. Firstly, the specific memory utilization in the storage access system layers does not easily fit the traditional memory model of high-speed networks. Secondly, the client-server models that are used for distributed storage have specific requirements for message control and event processing, which are not handled by existing interfaces. We propose different solutions to solve communication control problems at the filesystem level. We show that a modification of the network programming interface is required.
Data transfer issues require an adaptation of the operating system. We detail several proposals for network programming interfaces that make them easier to use in the context of distributed storage. The integration of flexible data transfer processing in the new programming interface MYRINET/MX is finally presented. Performance evaluations show that its usage in the context of storage, as well as other types of applications, is easy and efficient. Keywords: distributed storage, remote file access, cluster, high-speed network, MYRINET, zero-copy, memory registration, communication control, event notification, application programming interface
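The client-server access pattern the abstract builds on (NFS/PVFS-style request-reply to a storage node) can be sketched at its simplest with ordinary sockets. This is an illustrative toy only: the file store, the one-message protocol, and the block size are invented, and none of this reflects the GM or MX programming interfaces, whose whole point is to bypass exactly this kernel socket path.

```python
import socket
import threading

BLOCK = 4096
STORE = {b"fileA": b"hello from the storage node" * 10}  # toy file store

def serve(server_sock):
    """Handle one request: read a file name, reply with its contents."""
    conn, _ = server_sock.accept()
    with conn:
        name = conn.recv(BLOCK)
        conn.sendall(STORE.get(name, b""))

server = socket.socket()
server.bind(("127.0.0.1", 0))  # ephemeral port
server.listen(1)
port = server.getsockname()[1]
t = threading.Thread(target=serve, args=(server,))
t.start()

# Client side: request a remote file and read the reply to EOF.
with socket.create_connection(("127.0.0.1", port)) as c:
    c.sendall(b"fileA")
    data = b""
    while chunk := c.recv(BLOCK):
        data += chunk
t.join()
server.close()
print(len(data))
```

Each `recv` here implies a copy through kernel buffers; the zero-copy and memory-registration mechanisms discussed in the abstract exist precisely to remove those copies on the storage data path.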
A Meaning-Making Approach to Understand the Relationship between the Physical Built Environment of the Heritage Sites including the Intangible Values and the Design Development of the Public Open Spaces: Case Study Liverpool Pier Head
Authors: May Newisar, Richard Kingston, Philip Black
Abstract:
Heritage-led regeneration developments have been considered one of the cornerstones of the economic and social revival of historic towns and cities in the UK. However, this approach has proved deficient in the development of the Liverpool World Heritage site, owing to the conflict between sustaining the tangible and intangible values and achieving the intended economic developments. Accordingly, the development of such areas is influenced by a top-down approach that treats heritage as a consumable experience and urban regeneration as the economic development for it. This neglects the heritage site's characteristics and values as well as the design criteria for public open spaces that overlap with heritage sites. Currently, knowledge regarding the relationship between the physical built environment of heritage sites, including their intangible values, and the design development of public open spaces is limited. Public open spaces have been studied from different perspectives, such as increasing walkability, serving as a source of social cohesion, providing a good quality of life, and understanding users' perceptions. Heritage sites, meanwhile, have been discussed heavily in terms of how to maintain the physical environment, understanding the sources of threats and how to protect against them, in addition to users' experiences and motivations for visiting such areas. Furthermore, new approaches, such as the historic urban landscape approach, have tried to close this gap by focusing on the entire human environment with all its tangible and intangible qualities. This research aims to understand the relationship between heritage sites and public open spaces and how the overlap of the design and development of both could be used as a quality to enhance heritage sites and improve users' experience.
A meaning-making approach will be used to understand and articulate how the development of the Liverpool World Heritage site and its values could influence and shape the design of the Pier Head public open space in order to attract different kinds of tourists and serve as a tool for economic development. Consequently, this will help bridge the gap between planning and conservation-area policies through an understanding of how flexible the system is in adopting alternative approaches to the design and development strategies for those areas. Keywords: historic urban landscape, environmental psychology, urban governance, identity
Social Media Data Analysis for Personality Modelling and Learning Styles Prediction Using Educational Data Mining
Authors: Srushti Patil, Preethi Baligar, Gopalkrishna Joshi, Gururaj N. Bhadri
Abstract:
In designing learning environments, instructional strategies can be tailored to suit the learning style of an individual to ensure effective learning. In this study, the information shared on social media platforms like Facebook is used to predict the learning style of a learner. Previous research has shown that Facebook data can be used to predict user personality: users with a particular personality exhibit an inherent pattern in their digital footprint on Facebook. The proposed work aims to correlate users' personalities, predicted from Facebook data, with their learning styles, predicted through questionnaires. For millennial learners, Facebook has become a primary means of information sharing and interaction with peers; thus, it can serve as a rich bed for research and direct the design of learning environments. The authors conducted this study in an undergraduate freshman engineering course. Data from 320 freshman Facebook users was collected, and the same users also participated in the learning style and personality prediction survey. The Kolb learning style questionnaire and the Big Five personality inventory were adopted for the survey. The users agreed to participate in this research and signed individual consent forms. A specific page was created on Facebook to collect user data such as personal details, status updates, comments, demographic characteristics and egocentric network parameters. This data was captured by an application written in Python. The data captured from Facebook was subjected to a text analysis process using the Linguistic Inquiry and Word Count dictionary. An analysis of the data collected from the questionnaires reveals individual student personality and learning style. The results obtained from the analysis of the Facebook, learning style and personality data were then fed into an automatic classifier trained using data mining techniques such as rule-based classifiers and decision trees.
This helps to predict user personality and learning styles by analysing the common patterns. Rule-based classifiers applied to text analysis help categorize Facebook data into positive, negative and neutral. In total, two models were trained: one to predict personality from Facebook data, and another to predict learning styles from the personalities. The results show that the classifier models have high accuracy, which makes the proposed method a reliable one for predicting user personality and learning styles. Keywords: educational data mining, Facebook, learning styles, personality traits
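The second model described above, personality scores in, learning style out, can be sketched with a decision tree. The personality-to-style labels below are entirely invented for illustration; the study's actual mapping would come from its 320 paired survey responses.

```python
from sklearn.tree import DecisionTreeClassifier

# Toy training set: Big Five scores (openness, conscientiousness,
# extraversion, agreeableness, neuroticism; 1-5 scale) mapped to
# Kolb learning styles. Labels here are hypothetical.
X = [
    [5, 3, 4, 3, 2], [4, 2, 5, 4, 1],  # -> accommodating
    [2, 5, 2, 3, 2], [3, 5, 1, 2, 3],  # -> converging
    [5, 2, 2, 4, 3], [4, 3, 1, 5, 2],  # -> diverging
    [2, 4, 2, 2, 4], [1, 5, 3, 2, 5],  # -> assimilating
]
y = ["accommodating", "accommodating", "converging", "converging",
     "diverging", "diverging", "assimilating", "assimilating"]

tree = DecisionTreeClassifier(random_state=0).fit(X, y)
print(tree.predict([[5, 2, 5, 4, 1]])[0])
```

A tree also yields readable if-then rules, which matches the study's stated use of rule-based classifiers alongside decision trees.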
Stable Diffusion, Context-to-Motion Model to Augmenting Dexterity of Prosthetic Limbs
Authors: André Augusto Ceballos Melo
Abstract:
The design facilitates the recognition of congruent prosthetic movements: context-to-motion translations guided by images, verbal prompts, and the user's nonverbal communication, such as facial expressions, gestures, paralinguistics, scene context, and recognized objects. Although the focus is on prosthetic limbs as assistive technology, driven by gestures, sound codes, signs, facial and body expressions, and scene context, the approach can also be applied to other tasks, such as walking. The context-to-motion model is a machine learning approach designed to improve the control and dexterity of prosthetic limbs. It works by using sensory input from the prosthetic limb to learn about the dynamics of the environment and then using this information to generate smooth, stable movements. This can improve the performance of the prosthetic limb and make it easier for the user to perform a wide range of tasks. There are several key benefits to using the context-to-motion model for prosthetic limb control. First, it can improve the naturalness and smoothness of prosthetic limb movements, making them more comfortable and easier to use. Second, it can improve the accuracy and precision of prosthetic limb movements, which is particularly useful for tasks that require fine motor control. Finally, the context-to-motion model can be trained using a variety of different sensory inputs, which makes it adaptable to a wide range of prosthetic limb designs and environments. Stable diffusion is a machine learning method that can be used to improve the control and stability of movements in robotic and prosthetic systems. It works by using sensory feedback to learn about the dynamics of the environment and then using this information to generate smooth, stable movements. One key aspect of stable diffusion is that it is designed to be robust to noise and uncertainty in the sensory feedback.
This means that it can continue to produce stable, smooth movements even when the sensory data is noisy or unreliable. To implement stable diffusion in a robotic or prosthetic system, it is typically necessary to first collect a dataset of examples of the desired movements. This dataset can then be used to train a machine learning model to predict the appropriate control inputs for a given set of sensory observations. Once the model has been trained, it can be used to control the robotic or prosthetic system in real time: the model receives sensory input from the system and uses it to generate control signals that drive the motors or actuators responsible for moving the system. Overall, the use of the context-to-motion model has the potential to significantly improve the dexterity and performance of prosthetic limbs, making them more useful and effective for a wide range of users. Hand gestures and body language influence communication and social interaction, so the approach offers users a way to maximize their quality of life, social interaction, and gesture-based communication.
Keywords: stable diffusion, neural interface, smart prosthetic, augmenting
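The dataset-driven control loop described above can be sketched as follows; this is a toy illustration in which a nearest-neighbour lookup over recorded (sensory observation, control input) pairs stands in for the trained model, and all names and values are invented:

```python
# Hypothetical sketch: predict a control input for a new sensory observation
# by returning the command of the closest recorded observation. A real
# system would use a learned model rather than 1-nearest-neighbour lookup.
def nearest_control(dataset, observation):
    """Return the control input paired with the closest stored observation."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    obs, ctrl = min(dataset, key=lambda pair: dist(pair[0], observation))
    return ctrl

# toy dataset: (grip-force, slip) sensor readings -> motor command
data = [((0.1, 0.0), "open"), ((0.9, 0.8), "close")]
```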
Procedia PDF Downloads 101
2383 Web-Based Cognitive Writing Instruction (WeCWI): A Theoretical-and-Pedagogical e-Framework for Language Development
Authors: Boon Yih Mah
Abstract:
Web-based Cognitive Writing Instruction (WeCWI)'s contribution towards language development can be divided into linguistic and non-linguistic perspectives. In the linguistic perspective, WeCWI focuses on literacy and language discoveries, while cognitive and psychological discoveries are the hubs of the non-linguistic perspective. In the linguistic perspective, WeCWI draws attention to free reading and enterprises, which are supported by language acquisition theories. Besides, the adoption of the process genre approach as a hybrid guided writing approach fosters literacy development. Literacy and language development are interconnected in the communication process; hence, WeCWI encourages meaningful discussion based on the interactionist theory that involves input, negotiation, output, and interactional feedback. Rooted in the e-learning interaction-based model, WeCWI promotes online discussion via synchronous and asynchronous communication, which allows interactions to happen among the learners, the instructor, and the digital content. In the non-linguistic perspective, WeCWI highlights the contribution of reading, discussion, and writing towards cognitive development. Based on the inquiry models, learners' critical thinking is fostered during the information exploration process through interaction and questioning. Lastly, to lower writing anxiety, WeCWI develops the instructional tool with supportive features to facilitate the writing process. To bring a positive user experience to the learner, WeCWI aims to create the instructional tool with different interface designs based on two different types of perceptual learning styles.
Keywords: WeCWI, literacy discovery, language discovery, cognitive discovery, psychological discovery
Procedia PDF Downloads 561
2382 Intersections and Cultural Landscape Interpretation, in the Case of Ancient Messene in the Peloponnese
Authors: E. Maistrou, P. Themelis, D. Kosmopoulos, K. Boulougoura, A. M. Konidi, K. Moretti
Abstract:
InterArch is an ongoing research project, running since September 2020, that aims to propose a digital application for the enhancement of the cultural landscape, emphasizing the contribution of physical space and time to digital data organization. The research case study refers to Ancient Messene in the Peloponnese, one of the most important archaeological sites in Greece. The project integrates an interactive approach to the natural environment, aiming at a manifold sensory experience. It combines the physical space of the archaeological site with the digital space of archaeological and cultural data while, at the same time, embracing storytelling processes through an interdisciplinary approach that familiarizes the user with multiple semantic interpretations. The research project is co-financed by the European Union and Greek national funds through the Operational Program Competitiveness, Entrepreneurship, and Innovation, under the call RESEARCH - CREATE - INNOVATE (project code: Τ2ΕΔΚ-01659). It involves mutual collaboration between academic and cultural institutions and the contribution of an IT applications development company. New technologies and the integration of digital data enable the implementation of non-linear narratives related to the representational characteristics of the art of collage. Various images (photographs, drawings, etc.) and sounds (narrations, music, soundscapes, audio signs, etc.) can be presented, according to our proposal, through the new semiotics of augmented and virtual reality technologies applied on touch screens and smartphones. Despite the fragmentation of tangible or intangible references, material landscape formations, including archaeological remains, constitute the common ground that can inspire cultural narratives in a process that unfolds personal perceptions and collective imaginaries. It is in this context that the cultural landscape may be considered an indication of spatial and historical continuity.
It is in this context that history could emerge, according to our proposal, not solely as a prior inscription but also as an actual happening: a rhythm of occurrences suggesting mnemonic references and, moreover, an evolving history projected onto the contemporary, ongoing cultural landscape.
Keywords: cultural heritage, digital data, landscape, archaeological sites, visitors' itineraries
Procedia PDF Downloads 80
2381 Design on Demand (DoD): Spiral Model of the Lifecycle of Products in the Personal 3D-Printed Products' Market
Authors: Zuk Nechemia Turbovich
Abstract:
This paper introduces DoD, a contextual spiral model that describes the lifecycle of products intended for manufacturing using Personal 3D Printers (P3DP). The study is based on a review of the desktop P3DP market, which shows that the combination of digital connectivity and the potential ownership of P3DPs by home users is radically changing the form of the product lifecycle compared to familiar lifecycle paradigms. The paper presents the change in the design process, considering the characterization of product types in the P3DP market and the possibility of a direct dialogue between end-users and product designers. The model, as an updated paradigm, provides a strategic perspective on product design and tools for success, understanding that design is subject to rapid and continuous improvement and that products are subject to repair, update, and customization. The paper includes a review of real cases.
Keywords: lifecycle, mass-customization, personal 3d-printing, user involvement
Procedia PDF Downloads 183
2380 A Bottleneck-Aware Power Management Scheme in Heterogeneous Processors for Web Apps
Authors: Inyoung Park, Youngjoo Woo, Euiseong Seo
Abstract:
With the advent of WebGL, Web apps are now able to provide high-quality graphics by utilizing the underlying graphics processing units (GPUs). Although Web apps are becoming common and popular, the current power management schemes, which were devised for conventional native applications, are suboptimal for Web apps because of the additional layer, the Web browser, between the OS and the application. The Web browser, running on a CPU, issues GL commands, which render the images to be displayed by the currently running Web app, to the GPU, and the GPU processes them. The size and number of issued GL commands determine the processing load of the GPU. While the GPU is processing the GL commands, the CPU simultaneously executes the other compute-intensive threads. The actual user experience is determined by either CPU processing or GPU processing, depending on which of the two is the more demanded resource. For example, when the GPU work queue is saturated by outstanding commands, lowering the performance level of the CPU does not affect the user experience, because it is already deteriorated by the delayed execution of GPU commands. Consequently, it is desirable to lower the CPU or GPU performance level to save energy when the other resource is saturated and becomes a bottleneck in the execution flow. Based on this observation, we propose a power management scheme that is specialized for the Web app runtime environment. This approach incurs two technical challenges: identification of the bottleneck resource, and determination of the appropriate performance level for the unsaturated resource. The proposed power management scheme uses the CPU utilization level of the Window Manager to tell which one, if any, is the bottleneck. The Window Manager draws the final screen using the processed results delivered from the GPU. Thus, the Window Manager is on the critical path that determines the quality of user experience and is purely executed by the CPU.
The proposed scheme uses a weighted average of the Window Manager utilization to prevent excessive sensitivity and fluctuation. We classified Web apps into three categories using analysis results that measure frame-per-second (FPS) changes under diverse CPU/GPU clock combinations. The results showed that the capability of the CPU decides the user experience when the Window Manager utilization is above 90%; consequently, the proposed scheme decreases the performance level of the GPU by one step. On the contrary, when the utilization is less than 60%, the bottleneck usually lies in the GPU, and it is desirable to decrease the performance of the CPU. Even for the processing unit that is not on the critical path, an excessive performance drop can occur, and that may adversely affect the user experience. Therefore, our scheme lowers the frequency gradually until it finds an appropriate level, by periodically checking the CPU utilization. The proposed scheme reduced energy consumption by 10.34% on average in comparison to the conventional Linux kernel, and worsened FPS by only 1.07% on average.
Keywords: interactive applications, power management, QoS, Web apps, WebGL
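The decision rule described above can be sketched as follows; the 90%/60% thresholds come from the text, while the smoothing weights and the one-step decrement are assumptions for illustration (and the rule follows the paper's stated principle of lowering the unsaturated resource):

```python
# Illustrative governor step: lower the performance level of the resource
# that is NOT the bottleneck, using a weighted average of recent Window
# Manager utilization samples to damp fluctuation (weights are assumed).
def adjust(cpu_level, gpu_level, wm_samples, weights=(0.5, 0.3, 0.2)):
    """wm_samples: recent WM CPU-utilization percentages, most recent first."""
    wm = sum(u * w for u, w in zip(wm_samples, weights))
    if wm > 90:                      # CPU (Window Manager) saturated: lower GPU
        gpu_level = max(0, gpu_level - 1)
    elif wm < 60:                    # GPU is the bottleneck: lower CPU
        cpu_level = max(0, cpu_level - 1)
    return cpu_level, gpu_level      # in between: leave both levels unchanged
```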
Procedia PDF Downloads 192
2379 Towards Learning Query Expansion
Authors: Ahlem Bouziri, Chiraz Latiri, Eric Gaussier
Abstract:
The steady growth in the size of textual document collections is a key progress-driver for modern information retrieval techniques, whose effectiveness and efficiency are constantly challenged. Given a user query, the number of retrieved documents can be overwhelmingly large, hampering their efficient exploitation by the user. In addition, retaining only relevant documents in a query answer is of paramount importance for effectively meeting the user's needs. In this situation, the query expansion technique offers an interesting solution for obtaining a complete answer while preserving the quality of retained documents. This mainly relies on an accurate choice of the terms added to an initial query. Interestingly enough, query expansion takes advantage of large text volumes by extracting statistical information about index term co-occurrences and using it to make user queries better fit the real information needs. In this respect, a promising track consists in the application of data mining methods to extract dependencies between terms, namely a generic basis of association rules between terms. The key feature of our approach is a better trade-off between the size of the mining result and the conveyed knowledge. Thus, faced with the huge number of derived association rules, and in order to select the optimal combination of query terms from the generic basis, we propose to model the problem as a classification problem and solve it using a learning algorithm such as SVM, or a clustering algorithm such as k-means. For this purpose, we first generate a training set using a genetic-algorithm-based approach that explores the association rule space in order to find an optimal set of expansion terms, improving the MAP of the search results. The experiments were performed on the SDA 95 collection, a data collection for information retrieval. It was found that the results were better in terms of both MAP and NDCG.
The main observation is that hybridizing text mining techniques and query expansion in an intelligent way allows us to incorporate the good features of both. As this is a preliminary attempt in this direction, there is large scope for enhancing the proposed method.
Keywords: supervised learning, classification, query expansion, association rules
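The expansion step itself can be sketched as follows, assuming the association rules have already been mined as (antecedent, candidate) pairs with confidences; a simple confidence cut stands in for the learned selection step described above, and the rule set is invented for illustration:

```python
# Hypothetical sketch: expand a query with the top-k candidate terms drawn
# from mined association rules (antecedent -> candidate, confidence).
def expand(query, rules, min_conf=0.6, k=2):
    candidates = {}
    for (antecedent, candidate), conf in rules.items():
        if antecedent in query and candidate not in query and conf >= min_conf:
            candidates[candidate] = max(candidates.get(candidate, 0.0), conf)
    best = sorted(candidates, key=candidates.get, reverse=True)[:k]
    return query + best

# toy generic basis of association rules between terms
rules = {("car", "vehicle"): 0.8, ("car", "engine"): 0.7, ("car", "red"): 0.3}
```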
Procedia PDF Downloads 324
2378 The Co-Simulation Interface SystemC/Matlab Applied in JPEG and SDR Application
Authors: Walid Hassairi, Moncef Bousselmi, Mohamed Abid
Abstract:
Functional verification is a major part of today's system design task. Several approaches are available for verification on a high abstraction level, where designs are often modeled using MATLAB/Simulink. However, different approaches are a barrier to a unified verification flow. In this paper, we propose a co-simulation interface between SystemC and MATLAB/Simulink to enable functional verification of multi-abstraction-level designs. The resulting verification flow is tested on the JPEG compression algorithm. The required synchronization of both simulation environments, as well as data type conversion, is solved using the proposed co-simulation flow. We divided the JPEG encoder into two parts. The first, the DCT, is implemented in SystemC and represents the HW part; the second, consisting of quantization and entropy encoding, is implemented in MATLAB and represents the SW part. For communication and synchronization between these two parts, we use an S-Function and the MATLAB engine in Simulink. With this research premise, this study introduces a new SystemC hardware implementation of the DCT. Compared to the SW/SW simulation, we observe a reduction in simulation time of 88.15% in the JPEG application and a design efficiency of 90% in the SDR application.
Keywords: hardware/software, co-design, co-simulation, SystemC, MATLAB, S-Function, communication, synchronization
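As an illustration of the DCT stage (the part the paper implements in SystemC), a behavioural reference model of the orthonormal 2-D DCT-II can be written as follows; this Python sketch is only a golden model for checking results, not the paper's hardware implementation:

```python
import math

# Reference model of the orthonormal 2-D DCT-II over an N x N block,
# the transform at the heart of the JPEG encoder's HW part.
def dct2(block):
    n = len(block)
    def c(k):                                   # orthonormal scale factors
        return math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
    out = [[0.0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            s = sum(block[x][y]
                    * math.cos((2 * x + 1) * u * math.pi / (2 * n))
                    * math.cos((2 * y + 1) * v * math.pi / (2 * n))
                    for x in range(n) for y in range(n))
            out[u][v] = c(u) * c(v) * s
    return out
```

A constant block concentrates all its energy in the DC coefficient, which makes a convenient sanity check when comparing against a SystemC implementation.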
Procedia PDF Downloads 405
2377 Popular eReaders
Authors: Tom D. Gedeon, Ujala Rampaul
Abstract:
The evaluation of electronic consumer goods is most often done by analysing the latest models and comparing their advantages and disadvantages with respect to price. This style of evaluation is typically performed by one or a few product experts on a wide range of features that may not be applicable to each user. We instead used a scenario-based approach to evaluate a number of e-readers. The setting resembles a user who is interested in a new product or technology and has allocated a limited budget; we evaluate the quality and usability of e-readers available within that budget range. This is based on the assumption of a rational market, which prices older second-hand devices the same as functionally equivalent new devices. We describe our evaluation and comparison of four branded eReaders as the initial stage of a larger project. The scenario comprises a range of tasks approximating a busy person who does not bother to read the manual. We found navigation within books to be the most significant differentiator between the eReaders in our scenario-based evaluation process.
Keywords: eReader, scenario based, price comparison, Kindle, Kobo, Nook, Sony, technology adoption
Procedia PDF Downloads 530
2376 Meeting User's Information Need: A Study on the Acceptance of Mobile Library Service at UGM Library
Authors: M. Fikriansyah Wicaksono, Rafael Arief Budiman, M. Very Setiawan
Abstract:
Currently, a wide range of innovative mobile library (M-Library) services is provided for users in the library. The M-Library service is an innovation that aims to bring the library's collections to users, who now use their smartphones so often. With M-Library services, it is expected that users can fulfill their information needs more conveniently and practically. This study aims to find out how users use the M-Library services provided by the UGM library. It applied a quantitative approach to investigate how the M-Library application is used. The Technology Acceptance Model (TAM) is applied to perform the analysis in terms of perceived usefulness, perceived ease of use, attitude towards behavior, behavioral intention, and actual system usage. The results show that, overall, users found the M-Library application useful for meeting their information needs: it facilitates access to e-resources, searching the UGM library collections, online booking of collections, and reminders for returning books.
Keywords: m-library, mobile library services, technology acceptance, library of UGM
Procedia PDF Downloads 229
2375 A Study of Human Communication in an Internet Community
Authors: Andrew Laghos
Abstract:
The Internet is a big part of our everyday lives. People can now access the internet from a variety of places, including home, college, and work, and many airports, hotels, restaurants, and cafeterias provide free wireless internet to their visitors. Using technologies like computers, tablets, and mobile phones, we spend a lot of our time online getting entertained, getting informed, and communicating with each other. This study deals with the latter, namely human communication through the Internet. People can communicate with each other using social media, social network sites (SNS), e-mail, messengers, chatrooms, and so on. By connecting with each other, they form virtual communities. Regarding SNS, the types of connections that can be studied include friendships and cliques, and analyzing these connections is important to help us understand online user behavior. The method of Social Network Analysis (SNA) was used on a case study, and the results revealed the existence of some useful patterns of interactivity between the participants. The study ends with implications of the results and ideas for future research.
Keywords: human communication, internet communities, online user behavior, psychology
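The kind of connection analysis SNA performs can be illustrated with degree centrality over a toy friendship graph (the graph and names are invented for the example):

```python
# Degree centrality: the fraction of other members each participant is
# connected to, a basic SNA measure of how central a user is in the network.
def degree_centrality(graph):
    n = len(graph)
    return {node: len(friends) / (n - 1) for node, friends in graph.items()}

# toy undirected friendship network (adjacency sets)
friends = {
    "ana":  {"ben", "cara"},
    "ben":  {"ana", "cara"},
    "cara": {"ana", "ben", "dan"},
    "dan":  {"cara"},
}
```

Here "cara" is connected to everyone (centrality 1.0), while "dan" sits on the periphery; clique detection follows the same adjacency-set representation.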
Procedia PDF Downloads 496
2374 Using Artificial Intelligence Technology to Build the User-Oriented Platform for Integrated Archival Service
Authors: Lai Wenfang
Abstract:
This study describes how to use artificial intelligence (AI) technology to build a user-oriented platform for integrated archival service. The platform will be launched in 2020 by the National Archives Administration (NAA) in Taiwan. With the progression of information communication technology (ICT), the NAA has built many systems to provide archival services. In order to cope with new challenges, such as new ICT, artificial intelligence, or blockchain, the NAA will use natural language processing (NLP) and machine learning (ML) techniques to build a training model and propose suggestions based on the data sent to the platform. The NAA expects that the platform will not only automatically inform the sending agencies' staff which records catalogues violate the transfer or destruction rules, but also use the model to find details hidden in the catalogues and suggest to NAA staff whether the records should be retained, thereby shortening the auditing time. The platform keeps all users' browsing trails, so that it can predict what kinds of archives a user could be interested in, recommend search terms through visualization, and inform users of newly arrived archives. In addition, according to the Archives Act, NAA staff must spend a lot of time marking or removing personal data, classified data, etc., before archives are provided. To upgrade the archives access service process, the platform will use text recognition patterns to black out such data automatically; staff only need to correct errors and upload the corrected version, and as the platform learns, the accuracy will become higher. In short, the purpose of the platform is to advance the government's digital transformation and implement the vision of a service-oriented smart government.
Keywords: artificial intelligence, natural language processing, machine learning, visualization
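The automatic black-out step could be sketched as follows; the regular expressions stand in for the text-recognition patterns mentioned above and are illustrative assumptions, not the NAA's actual rules:

```python
import re

# Hypothetical redaction sketch: each matched span of personal data is
# replaced by a mask of the same length, so staff can still see where
# something was removed. The patterns below are invented examples.
PATTERNS = [
    re.compile(r"\b\d{3}-\d{4}-\d{4}\b"),    # phone-number-like strings
    re.compile(r"\b[\w.]+@[\w.]+\.\w+\b"),   # e-mail addresses
]

def black_out(text, mask="#"):
    for pattern in PATTERNS:
        text = pattern.sub(lambda m: mask * len(m.group()), text)
    return text
```

In the platform described above, a learned model would propose such spans and staff would only correct its errors.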
Procedia PDF Downloads 174
2373 Detonating Culture, Statistics and Development in Imo State of Nigeria
Authors: Ejikeme Ugiri
Abstract:
In an executive summary, UNESCO describes the Framework for Cultural Statistics as a tool for organizing cultural statistics both nationally and internationally. This is based on a conceptual foundation and a common understanding of culture that enables the measurement of a wide range of cultural expressions. This means that cultural expression, in whatever guise, has the potential to contribute reasonably to the development of a given society. The paper looked into the various tangible and intangible cultures in Imo State of Nigeria. Due to the government's insensitivity, there is a need to remind ourselves to pay adequate attention to the cultural heritage bequeathed to us by our forefathers, for the sake of posterity; documenting this information in written form therefore becomes imperative. The study concludes that culture, if developed, could contribute reasonably to the economic and social growth of the society.
Keywords: culture, detonation, development, statistics
Procedia PDF Downloads 467
2372 Efficient Manageability and Intelligent Classification of Web Browsing History Using Machine Learning
Authors: Suraj Gururaj, Sumantha Udupa U.
Abstract:
Browsing the Web has emerged as the de facto activity performed on the Internet. Although browsing gets tracked, the manageability of Web browsing history is very poor. In this paper, we present a workable solution, implemented using machine learning and natural language processing techniques, for efficient management of a user's browsing history. The significance of adding such a capability to a Web browser is that it ensures efficient and quick information retrieval from browsing history, which currently is very challenging. Our solution guarantees that any important websites visited in the past are easily accessible thanks to intelligent, automatic classification. In a nutshell, the paper provides an implementation as a browser extension that intelligently classifies the browsing history into the most relevant category automatically, without any user intervention. This guarantees that no information is lost and increases productivity by saving the time spent revisiting websites that were of much importance.
Keywords: adhoc retrieval, Chrome extension, supervised learning, tile, Web personalization
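The automatic classification idea can be sketched with a keyword-profile classifier standing in for the paper's trained model (the categories and keyword sets below are invented for illustration):

```python
import re

# Hypothetical sketch: assign a history entry (page title and/or URL) to
# the category whose keyword profile it overlaps most; zero overlap falls
# back to "other". A real system would use a learned text classifier.
CATEGORIES = {
    "news":     {"news", "headline", "breaking"},
    "shopping": {"cart", "buy", "deal", "shop"},
    "research": {"paper", "arxiv", "journal", "doi"},
}

def classify(entry: str) -> str:
    tokens = set(re.split(r"\W+", entry.lower()))
    scores = {cat: len(tokens & kws) for cat, kws in CATEGORIES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "other"
```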
Procedia PDF Downloads 376
2371 Deep Reinforcement Learning with Leonard-Ornstein Processes Based Recommender System
Authors: Khalil Bachiri, Ali Yahyaouy, Nicoleta Rogovschi
Abstract:
Improved user experience is a goal of contemporary recommender systems. Recommender systems are starting to incorporate reinforcement learning, since it naturally fits the goal of increasing a user's reward every session. In this paper, we examine the most effective reinforcement learning agent tactics on the MovieLens (1M) dataset, balancing precision and the variety of recommendations. The absence of variability in final predictions makes simplistic techniques, although able to optimize ranking quality criteria, worthless for consumers of the recommendation system. Utilizing the stochasticity of Ornstein-Uhlenbeck processes, our suggested strategy encourages the agent to investigate its surroundings. Research demonstrates that raising the NDCG (Normalized Discounted Cumulative Gain) and HR (Hit Rate) criteria without lowering the Ornstein-Uhlenbeck process drift coefficient enhances the diversity of suggestions.
Keywords: recommender systems, reinforcement learning, deep learning, DDPG, Leonard-Ornstein process
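The Ornstein-Uhlenbeck exploration noise conventionally paired with DDPG agents can be sketched as follows; the parameter values are the commonly used defaults, not the paper's:

```python
import random

# Ornstein-Uhlenbeck process as DDPG exploration noise: a mean-reverting
# random walk whose temporally correlated samples are added to the actor's
# actions. theta is the drift (mean-reversion) coefficient the abstract
# refers to; sigma scales the Gaussian perturbation.
class OUNoise:
    def __init__(self, mu=0.0, theta=0.15, sigma=0.2, seed=0):
        self.mu, self.theta, self.sigma = mu, theta, sigma
        self.state = mu
        self.rng = random.Random(seed)

    def sample(self):
        dx = self.theta * (self.mu - self.state) + self.sigma * self.rng.gauss(0, 1)
        self.state += dx
        return self.state
```

Because the process reverts to its mean mu, the long-run average of the noise stays near zero while successive samples remain correlated, which is what makes the agent's exploration smooth rather than jittery.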
Procedia PDF Downloads 142
2370 Analyzing the Market Growth in Application Programming Interface Economy Using Time-Evolving Model
Authors: Hiroki Yoshikai, Shin’ichi Arakawa, Tetsuya Takine, Masayuki Murata
Abstract:
The API (Application Programming Interface) economy is expected to create new value by converting corporate services, such as information processing and data provision, into APIs and using these APIs to connect services. Understanding the dynamics of an API economy market under the strategies of its participants is crucial to fully maximizing the value of the API economy. To capture the behavior of a market in which the number of participants changes over time, we present a time-evolving market model for a platform in which, besides service providers and consumers, API providers participate by supplying APIs to service providers. We then use the market model to clarify the role API providers play in expanding market participation and forming ecosystems. The results show that the platform with API providers increased the number of market participants by 67% and decreased the cost of developing services by 25% compared to the platform without API providers. Furthermore, during the expansion phase of the market, the profits of participants are mostly the same when 70% of the revenue from consumers is distributed to service providers and API providers. It is also found that when the market is mature, the profits of service providers and API providers decrease significantly due to competition between them, while the profit of the platform increases.
Keywords: API economy, ecosystem, platform, API providers
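A toy discrete-time sketch (not the authors' model) shows how an extra attraction factor contributed by API providers compounds participant growth over time; the growth rate, boost factor, and starting population below are invented for illustration:

```python
# Illustrative time-evolving participant count: each period the market
# grows in proportion to its current size, and `boost` models the extra
# attraction contributed by API providers (all parameters are assumed).
def simulate(steps, growth, boost=1.0, start=100):
    """Return the participant count after `steps` periods."""
    n = start
    for _ in range(steps):
        n = n * (1 + growth * boost)
    return round(n)
```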
Procedia PDF Downloads 91
2369 Medical Imaging Fusion: A Teaching-Learning Simulation Environment
Authors: Cristina Maria Ribeiro Martins Pereira Caridade, Ana Rita Ferreira Morais
Abstract:
The use of computational tools has become essential in the context of interactive learning, especially in engineering education. In the medical industry, teaching medical image processing techniques is a crucial part of training biomedical engineers, as it has integrated applications in healthcare facilities and hospitals. The aim of this article is to present a teaching-learning simulation tool, developed in MATLAB using a graphical user interface, for medical image fusion that explores different image fusion methodologies and processes in combination with image pre-processing techniques. The application applies different algorithms and medical fusion techniques in real time, allowing users to view the original and fused images, compare processed and original images, adjust parameters, and save images. The proposed tool offers an innovative teaching and learning environment: a dynamic and motivating simulation through which biomedical engineering students acquire knowledge about medical image fusion techniques and the skills required of biomedical engineers. In conclusion, the developed simulation tool provides real-time visualization of the original and fused images and the possibility to test, evaluate, and progress the students' knowledge about the fusion of medical images. It also facilitates the exploration of medical imaging applications, specifically image fusion, which is critical in the medical industry. Teachers and students can make adjustments and/or create new functions, making the simulation environment adaptable to new techniques and methodologies.
Keywords: image fusion, image processing, teaching-learning simulation tool, biomedical engineering education
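Two of the elementary fusion rules such a tool can expose, pixel-wise weighted averaging and maximum selection, can be sketched as follows (images are represented as nested lists of intensities for illustration; the MATLAB tool itself is not reproduced here):

```python
# Pixel-wise weighted-average fusion: blend two registered images with
# weight alpha on the first image.
def fuse_average(a, b, alpha=0.5):
    return [[alpha * pa + (1 - alpha) * pb for pa, pb in zip(ra, rb)]
            for ra, rb in zip(a, b)]

# Maximum-selection fusion: keep the brighter pixel at each location,
# a simple rule for combining complementary modalities.
def fuse_max(a, b):
    return [[max(pa, pb) for pa, pb in zip(ra, rb)] for ra, rb in zip(a, b)]
```

Both rules assume the input images are already co-registered; in practice a teaching tool would add pre-processing (registration, normalization) before the fusion step.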
Procedia PDF Downloads 131
2368 A Brain Controlled Robotic Gait Trainer for Neurorehabilitation
Authors: Qazi Umer Jamil, Abubakr Siddique, Mubeen Ur Rehman, Nida Aziz, Mohsin I. Tiwana
Abstract:
This paper discusses a brain-controlled robotic gait trainer for neurorehabilitation of Spinal Cord Injury (SCI) patients. Patients suffering from SCI become unable to execute motion control of their lower extremities due to degeneration of spinal cord neurons. The presented approach can help SCI patients in neurorehabilitation training by directly translating patient motor imagery into walker motion commands, thus bypassing the spinal cord neurons completely. A non-invasive EEG-based brain-computer interface is used for capturing patient neural activity. For signal processing and classification, the open source software OpenViBE is used. Classifiers categorize the patient motor imagery (MI) into a specific set of commands that are further translated into walker motion commands. The robotic walker also employs fall detection to ensure the safety of the patient during gait training and can act as a support for SCI patients. The gait trainer was tested with subjects, and satisfactory results were achieved.
Keywords: brain computer interface (BCI), gait trainer, spinal cord injury (SCI), neurorehabilitation
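The final translation from classified motor imagery to walker commands can be sketched as follows; the label set, command names, and confidence gate are illustrative assumptions, not the paper's actual protocol:

```python
# Hypothetical mapping from a motor-imagery classifier's output to walker
# motion commands, with a confidence gate as a simple safety fallback:
# uncertain or unknown classifications halt the walker.
COMMANDS = {"left_hand": "turn_left", "right_hand": "turn_right",
            "feet": "walk_forward", "rest": "stop"}

def to_command(label, confidence, threshold=0.7):
    if confidence < threshold or label not in COMMANDS:
        return "stop"    # fail safe: halt on uncertain classification
    return COMMANDS[label]
```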
Procedia PDF Downloads 161