Search results for: features based techniques
33606 A Decision Support Framework for Introducing Business Intelligence to Midlands Based SMEs
Authors: Amritpal Slaich, Mark Elshaw
Abstract:
This paper explores the development of a decision support framework for introducing business intelligence (BI) through operational research techniques for application by SMEs. Aligned with the goals of the new Midlands Enterprise Initiative of improving the skill levels of the Midlands workforce and addressing high levels of regional unemployment, we have developed a framework to increase the level of business intelligence used by SMEs to improve business decision-making. Many SMEs in the Midlands fail due to a lack of high-quality decision making. Our framework outlines how universities can: engage with SMEs in the use of BI through operational research techniques; develop appropriate and easy-to-use Excel spreadsheet models; and make use of a process that allows SMEs to feed back their findings on the models. Future work will determine how well the framework performs in getting SMEs to apply BI to improve their decision-making performance.
Keywords: SMEs, decision support framework, business intelligence, operational research techniques
Procedia PDF Downloads 469

33605 Improving Security Features of Traditional Automated Teller Machines-Based Banking Services via Fingerprint Biometrics Scheme
Authors: Anthony I. Otuonye, Juliet N. Odii, Perpetual N. Ibe
Abstract:
The obvious challenges faced by most commercial bank customers while using the services of ATMs (Automated Teller Machines) across developing countries have triggered the need for an improved system with better security features. Current ATM systems are password-based, and research has proved the vulnerability of these systems to attacks and manipulation. Our research has found that the security of current ATM-assisted banking services in most developing countries is easily broken and manoeuvred by fraudsters, mainly because it is quite difficult for these systems to distinguish an impostor with privileged access from the authentic bank account owner. Again, PIN (Personal Identification Number) code passwords are easily guessed, to mention just one of the obvious limitations of traditional ATM operations. In this research work, we have developed a system of fingerprint biometrics with PIN code authentication that seeks to improve the security features of traditional ATM installations as well as other banking services. The aim is to ensure better security at all ATM installations and raise the confidence of bank customers. It is hoped that our system will overcome most of the challenges of the current password-based ATM operation if properly applied. The researchers made use of the OOADM (Object-Oriented Analysis and Design Methodology), a software development methodology that assures proper system design using modern design diagrams. Implementation and coding were carried out using Visual Studio 2010 together with other software tools. Results obtained show a working system that provides two levels of security on the client's side, using a fingerprint biometric scheme combined with the existing 4-digit PIN code to guarantee the confidence of bank customers across developing countries.
Keywords: fingerprint biometrics, banking operations, verification, ATMs, PIN code
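The two-level client-side check described above can be sketched as follows. The PIN value, hash scheme, and match threshold are illustrative assumptions, not the authors' implementation; a real deployment would use a salted PIN hash and a vendor fingerprint-matching SDK.

```python
import hashlib

# Illustrative stored credentials and threshold (assumptions, not the
# paper's values); real systems salt the PIN hash and use a matcher SDK.
STORED_PIN_HASH = hashlib.sha256(b"4271").hexdigest()
MATCH_THRESHOLD = 0.80

def verify_customer(entered_pin, fingerprint_match_score):
    """Two-level check in the spirit of the paper: the 4-digit PIN must
    hash to the stored value AND the biometric matcher's similarity
    score must clear a threshold."""
    pin_ok = hashlib.sha256(entered_pin.encode()).hexdigest() == STORED_PIN_HASH
    finger_ok = fingerprint_match_score >= MATCH_THRESHOLD
    return pin_ok and finger_ok
```

Either factor failing rejects the transaction, which is what lets the scheme tell an impostor holding a stolen PIN apart from the account owner.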
Procedia PDF Downloads 41

33604 A Survey on Types of Noises and De-Noising Techniques
Authors: Amandeep Kaur
Abstract:
Digital image processing is a fundamental tool for performing various operations on digital images for pattern recognition, noise removal, and feature extraction. In this paper, noise removal techniques are described for various types of noise. The paper discusses the different noises that appear in images due to environmental and accidental factors. Various de-noising approaches that utilize different wavelets and filters are discussed. By analyzing various papers on image de-noising, we conclude that wavelet-based de-noising approaches are more effective than the others.
Keywords: de-noising techniques, edges, image, image processing
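A minimal sketch of the wavelet-based de-noising the survey favours, using a single-level Haar transform with soft thresholding in plain NumPy. This is an illustration, not any surveyed method; real pipelines use deeper decompositions (e.g. Daubechies wavelets via PyWavelets) and data-driven thresholds.

```python
import numpy as np

def haar_denoise(signal, threshold):
    """One-level Haar wavelet de-noising with soft thresholding."""
    x = np.asarray(signal, dtype=float)
    assert len(x) % 2 == 0, "length must be even for one Haar level"
    # Haar analysis: pairwise averages (approximation) and differences (detail)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    # Soft-threshold the detail coefficients, where noise concentrates
    detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
    # Haar synthesis: perfect reconstruction from (approx, detail)
    out = np.empty_like(x)
    out[0::2] = (approx + detail) / np.sqrt(2)
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out
```

With threshold 0 the signal is reconstructed exactly; a large threshold suppresses all detail coefficients, leaving pairwise local means.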
Procedia PDF Downloads 334

33603 Privacy Preserving in Association Rule Mining on Horizontally Partitioned Database
Authors: Manvar Sagar, Nikul Virpariya
Abstract:
The advancement in data mining techniques plays an important role in many applications. In the context of privacy and security issues, the problems caused by the association rule mining technique have been investigated by many research scholars. It has been proved that misuse of this technique may reveal the database owner's sensitive and private information to others. Many researchers have put effort into preserving privacy in association rule mining. Of the two basic approaches to privacy-preserving data mining, viz. randomization-based and cryptography-based, the latter provides a high level of privacy but incurs higher computational and communication overhead. Hence, it is necessary to explore alternative techniques that reduce these overheads. In this work, we propose an efficient, collusion-resistant cryptography-based approach for distributed association rule mining using Shamir's secret sharing scheme. As we show through theoretical and practical analysis, our approach is provably secure and requires a trusted third party only once. We use secret sharing for privately sharing the information and a code-based identification scheme to add protection against malicious adversaries.
Keywords: privacy, Privacy Preservation in Data Mining (PPDM), horizontally partitioned database, EMHS, MFI, Shamir secret sharing
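Shamir's (t, n) threshold scheme, which the approach builds on, can be sketched in a few lines. The prime modulus and parameters below are illustrative choices, not the paper's.

```python
import random

P = 2**61 - 1  # a Mersenne prime; the finite field for the (t, n) scheme

def split_secret(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    shares = []
    for x in range(1, n + 1):
        y = 0
        for c in reversed(coeffs):          # Horner evaluation mod P
            y = (y * x + c) % P
        shares.append((x, y))
    return shares

def recover_secret(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * (-xj)) % P
                den = (den * (xi - xj)) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret
```

Fewer than t shares reveal nothing about the secret, which is what lets each site contribute its local support counts without exposing them.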
Procedia PDF Downloads 406

33602 Resume Ranking Using Custom Word2vec and Rule-Based Natural Language Processing Techniques
Authors: Subodh Chandra Shakya, Rajendra Sapkota, Aakash Tamang, Shushant Pudasaini, Sujan Adhikari, Sajjan Adhikari
Abstract:
Many efforts have been made to measure the semantic similarity between text corpora in documents, and techniques have evolved to measure the similarity of two documents. One state-of-the-art technique in the field of Natural Language Processing (NLP) is the word-to-vector model, which converts words into word embeddings and measures the similarity between the vectors. We found this to be quite useful for the task of resume ranking. This research paper therefore implements the word2vec model along with other Natural Language Processing techniques to rank resumes against a particular job description, so as to automate the hiring process. The paper proposes the system and reports the findings made while building it.
Keywords: chunking, document similarity, information extraction, natural language processing, word2vec, word embedding
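The core ranking step can be sketched as follows. The toy embedding table stands in for a trained word2vec model (in practice, e.g., gensim's Word2Vec), so the vocabulary and vector values are assumptions made for illustration.

```python
import numpy as np

# Toy 3-dimensional "word embeddings" standing in for a trained
# word2vec model; vocabulary and values are invented for this sketch.
EMB = {
    "python":    np.array([0.9, 0.1, 0.0]),
    "java":      np.array([0.8, 0.2, 0.1]),
    "developer": np.array([0.7, 0.0, 0.2]),
    "nursing":   np.array([0.0, 0.9, 0.3]),
}

def doc_vector(text):
    """Average the embeddings of known words (a common word2vec baseline)."""
    vecs = [EMB[w] for w in text.lower().split() if w in EMB]
    return np.mean(vecs, axis=0) if vecs else np.zeros(3)

def rank_resumes(job_description, resumes):
    """Rank resumes by cosine similarity to the job description."""
    jd = doc_vector(job_description)
    def cosine(a, b):
        na, nb = np.linalg.norm(a), np.linalg.norm(b)
        return float(a @ b / (na * nb)) if na and nb else 0.0
    scored = [(cosine(jd, doc_vector(r)), r) for r in resumes]
    return sorted(scored, reverse=True)
```

Averaging word vectors is the simplest document representation; the paper's rule-based extraction (chunking, section parsing) would run before this step.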
Procedia PDF Downloads 157

33601 A Neural Approach for the Offline Recognition of the Arabic Handwritten Words of the Algerian Departments
Authors: Salim Ouchtati, Jean Sequeira, Mouldi Bedda
Abstract:
In this work, we present an offline system for the recognition of the Arabic handwritten words of the Algerian departments. The study is based mainly on evaluating the performance of neural networks trained with the gradient back-propagation algorithm. The parameters used to form the input vector of the neural network are extracted from the binary images of the handwritten word by several methods: distribution parameters, the centered moments of the different projections, and Barr features. It should be noted that these methods are applied to segments obtained after dividing the binary image of the word into six segments. The classification is achieved by a multilayer perceptron. Detailed experiments are carried out, and satisfactory recognition results are reported.
Keywords: handwritten word recognition, neural networks, image processing, pattern recognition, features extraction
Procedia PDF Downloads 513

33600 CRISPR Technology: A Tool in the Potential Cure for COVID-19 Virus
Authors: Chijindu Okpalaoka, Charles Chinedu Onuselogu
Abstract:
COVID-19, the coronavirus disease caused by SARS-CoV-2, was first detected in late 2019 in Wuhan, China. COVID-19 lacked an established conventional pharmaceutical therapy, and as a result, the outbreak quickly became a pandemic affecting the entire world. Only a qPCR assay is reliable for diagnosing COVID-19. Clustered regularly interspaced short palindromic repeats (CRISPR) technology is being researched for speedy and specific identification of COVID-19, among other therapeutic techniques. Apart from its diagnostic capabilities, the CRISPR technique is being evaluated for developing antiviral therapies; nevertheless, no CRISPR-based medication has been approved for human use to date. Prophylactic antiviral CRISPR in human cells, a Cas13-based approach against coronavirus, has been developed. While this method can evolve into a treatment approach, it may face substantial obstacles in human clinical trials for licensure. This study discusses the potential applications of CRISPR-based techniques for developing a speedy and accurate treatment alternative for the COVID-19 virus.
Keywords: COVID-19, CRISPR technique, Cas13, SARS-CoV-2, prophylactic antiviral
Procedia PDF Downloads 123

33599 Choral Singers' Preference for Expressive Priming Techniques
Authors: Shawn Michael Condon
Abstract:
Current research on teaching expressivity mainly involves instrumentalists. This study focuses on choral singers' preference for priming techniques based on four methods for teaching expressivity. 112 choral singers answered a survey about their preferred methods for priming expressivity (vocal modelling, using metaphor, tapping into felt emotions, and drawing on past experiences) in three conditions (active, passive, and instructor). Analysis revealed a higher preference for drawing on past experience among more experienced singers. The most preferred technique in the passive and instructor roles was vocal modelling, with metaphors and tapping into felt emotions favoured in an active role. Priming techniques are often used in combination with other methods to enhance singing technique or expressivity and are dependent upon the situation, the repertoire, and the preferences of the instructor and performer.
Keywords: emotion, expressivity, performance, singing, teaching
Procedia PDF Downloads 154

33598 Statistical Tools for SFRA Diagnosis in Power Transformers
Authors: Rahul Srivastava, Priti Pundir, Y. R. Sood, Rajnish Shrivastava
Abstract:
For interpreting the signatures of sweep frequency response analysis (SFRA) of transformers, different statistical techniques serve as effective tools for either phase-to-phase comparison or sister-unit comparison. In this paper, alongside a discussion of SFRA, several statistical techniques, such as the cross-correlation coefficient (CCF), root square error (RSQ), comparative standard deviation (CSD), absolute difference, mean square error (MSE), and min-max ratio (MM), are presented through several case studies. These methods require the sample data size and spot frequencies of the SFRA signatures being compared. The techniques are based on power signal processing tools that simplify the results and allow limits to be set on the severity of faults in the transformer caused by short-circuit forces or ageing. The advantages of using statistical techniques for analyzing SFRA results are indicated through several case studies, and the results obtained determine the state of the transformer.
Keywords: absolute difference (DABS), cross correlation coefficient (CCF), mean square error (MSE), min-max ratio (MM-ratio), root square error (RSQ), standard deviation (CSD), sweep frequency response analysis (SFRA)
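Several of the listed statistics can be computed directly from two signatures sampled at the same spot frequencies. This sketch covers CCF, MSE, RSQ, and absolute difference; CSD and the min-max ratio are omitted because their exact definitions vary between references, and severity limits must be set per transformer class.

```python
import numpy as np

def sfra_metrics(reference, measured):
    """Pairwise comparison statistics for two SFRA signatures sampled
    at the same spot frequencies (a sketch; CSD and MM-ratio omitted)."""
    ref = np.asarray(reference, float)
    mea = np.asarray(measured, float)
    diff = mea - ref
    return {
        "CCF": float(np.corrcoef(ref, mea)[0, 1]),  # cross-correlation coefficient
        "MSE": float(np.mean(diff ** 2)),           # mean square error
        "RSQ": float(np.sqrt(np.sum(diff ** 2))),   # root square error
        "DABS": float(np.sum(np.abs(diff))),        # absolute difference
    }
```

Identical signatures give CCF = 1 and zero error metrics; a fault shows up as a CCF drop or error growth beyond the chosen limits.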
Procedia PDF Downloads 695

33597 Energy-Aware Scheduling in Real-Time Systems: An Analysis of Fair Share Scheduling and Priority-Driven Preemptive Scheduling
Authors: Su Xiaohan, Jin Chicheng, Liu Yijing, Burra Venkata Durga Kumar
Abstract:
Energy-aware scheduling in real-time systems aims to minimize energy consumption, but issues related to resource reservation and timing constraints remain challenges. This study analyzes two scheduling algorithms, Fair-Share Scheduling (FFS) and Priority-Driven Preemptive Scheduling (PDPS), for addressing these issues and for energy-aware scheduling in real-time systems. Based on research on both algorithms and on the processes of solving the two problems, it is found that Fair-Share Scheduling ensures fair allocation of resources but performs poorly under imbalanced system loads, while Priority-Driven Preemptive Scheduling prioritizes tasks by criticality to meet timing constraints through preemption but relies heavily on task prioritization and may not be energy efficient. Improvements to both algorithms with energy-aware features are therefore proposed. Future work should focus on developing hybrid scheduling techniques that minimize energy consumption through intelligent task prioritization, resource allocation, and meeting timing constraints.
Keywords: energy-aware scheduling, fair-share scheduling, priority-driven preemptive scheduling, real-time systems, optimization, resource reservation, timing constraints
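The preemptive, priority-driven behaviour analysed above can be illustrated with a toy single-core simulation. Task parameters below are invented for illustration, and no energy model is included.

```python
def pdps_schedule(tasks, horizon):
    """Single-core priority-driven preemptive scheduling simulation.
    `tasks` maps name -> (release_time, exec_time, priority); a lower
    priority number means more critical. A toy model of PDPS behaviour,
    not the paper's analysis framework."""
    remaining = {name: spec[1] for name, spec in tasks.items()}
    timeline = []
    for t in range(horizon):
        # Tasks that have been released and still have work left
        ready = [n for n, (rel, _, _) in tasks.items()
                 if rel <= t and remaining[n] > 0]
        if not ready:
            timeline.append(None)  # idle slot
            continue
        run = min(ready, key=lambda n: tasks[n][2])  # highest priority wins
        remaining[run] -= 1
        timeline.append(run)
    return timeline
```

When a more critical task is released, it preempts the running one immediately, which is exactly the behaviour that meets timing constraints at the cost of extra context switches (and hence energy).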
Procedia PDF Downloads 117

33596 Investigation of the Effects of Processing Parameters on PLA-Based 3D Printed Tensile Samples
Authors: Saifullah Karimullah
Abstract:
Additive manufacturing techniques are becoming more common with the latest technological advancements. They are poised to revolutionize the way products are designed, planned, manufactured, and distributed to end users. Fused deposition modeling (FDM)-based 3D printing is one of the promising techniques that have revolutionized prototyping processes. The purpose of this design and study project is to design a customized laboratory-scale FDM-based 3D printer from locally available sources. The primary goal is to design and fabricate the FDM-based 3D printer. After fabrication, a tensile test specimen is designed in SolidWorks or Creo computer-aided design (CAD) software. An .stl file of the tensile test specimen is generated through slicing software, and the G-code is sent via a computer for the specimen to be printed. Different parameters were studied, namely printing speed, layer thickness, and infill density of the printed object; other parameters were kept constant, such as temperature, extrusion rate, and raster orientation. Tensile test specimens were printed for different sets of parameters of the FDM-based 3D printer and subjected to tensile tests using a universal testing machine (UTM). Design-Expert software was used for the analyses, and different results were obtained from the different tensile test specimens. The best, average, and worst specimens were also observed under a compound microscope to investigate the interlayer bonding.
Keywords: additive manufacturing techniques, 3D printing, CAD software, UTM machine
Procedia PDF Downloads 102

33595 Impact of Pedagogical Techniques on the Teaching of Sports Sciences
Authors: Muhammad Saleem
Abstract:
Background: The teaching of sports sciences encompasses a broad spectrum of disciplines, including biomechanics, physiology, psychology, and coaching. Effective pedagogical techniques are crucial in imparting both theoretical knowledge and practical skills necessary for students to excel in the field. The impact of these techniques on students’ learning outcomes, engagement, and professional preparedness remains a vital area of study. Objective: This study aims to evaluate the effectiveness of various pedagogical techniques used in the teaching of sports sciences. It seeks to identify which methods most significantly enhance student learning, retention, engagement, and practical application of knowledge. Methods: A mixed-methods approach was employed, including both quantitative and qualitative analyses. The study involved a comparative analysis of traditional lecture-based teaching, experiential learning, problem-based learning (PBL), and technology-enhanced learning (TEL). Data were collected through surveys, interviews, and academic performance assessments from students enrolled in sports sciences programs at multiple universities. Statistical analysis was used to evaluate academic performance, while thematic analysis was applied to qualitative data to capture student experiences and perceptions. Results: The findings indicate that experiential learning and PBL significantly improve students' understanding and retention of complex sports science concepts compared to traditional lectures. TEL was found to enhance engagement and provide students with flexible learning opportunities, but its impact on deep learning varied depending on the quality of the digital resources. Overall, a combination of experiential learning, PBL, and TEL was identified as the most effective pedagogical approach, leading to higher student satisfaction and better preparedness for real-world applications. 
Conclusion: The study underscores the importance of adopting diverse and student-centered pedagogical techniques in the teaching of sports sciences. While traditional lectures remain useful for foundational knowledge, integrating experiential learning, PBL, and TEL can substantially improve student outcomes. These findings suggest that educators should consider a blended approach to pedagogy to maximize the effectiveness of sports science education.
Keywords: sport sciences, pedagogical techniques, health and physical education, problem-based learning, student engagement
Procedia PDF Downloads 23

33594 Evaluation of Short-Term Load Forecasting Techniques Applied for Smart Micro-Grids
Authors: Xiaolei Hu, Enrico Ferrera, Riccardo Tomasi, Claudio Pastrone
Abstract:
Load Forecasting plays a key role in making today's and tomorrow's Smart Energy Grids sustainable and reliable. Accurate power consumption prediction allows utilities to organize their resources in advance or to execute Demand Response strategies more effectively, enabling higher sustainability, better quality of service, and affordable electricity tariffs. Load Forecasting is comparatively easy yet effective at larger geographic scales; in Smart Micro Grids, the lower available grid flexibility makes accurate prediction even more critical for Demand Response applications. This paper analyses the application of short-term load forecasting in a concrete scenario, proposed within the EU-funded GreenCom project, which collects load data from single loads and households belonging to a Smart Micro Grid. Three short-term load forecasting techniques, i.e. linear regression, artificial neural networks, and radial basis function networks, are considered, compared, and evaluated through absolute forecast errors and training time. The influence of weather conditions on Load Forecasting is also evaluated. A new definition of Gain is introduced in this paper, which innovatively serves as an indicator of short-term prediction capability over consistent time spans. Two models, 24-hour-ahead and 1-hour-ahead forecasting, are built to comprehensively compare the three techniques.
Keywords: short-term load forecasting, smart micro grid, linear regression, artificial neural networks, radial basis function network, gain
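A minimal version of the linear-regression baseline for 24-hour-ahead forecasting might look like this. It is a sketch, not the GreenCom implementation: weather inputs are omitted, and the single lagged regressor is an assumption made to keep the example small.

```python
import numpy as np

def fit_24h_forecaster(hourly_loads):
    """Ordinary least squares predicting the load 24 hours ahead from
    the current load plus an intercept -- a tiny stand-in for the
    linear-regression baseline (the ANN and RBF models would replace
    the closed-form solve below)."""
    series = np.asarray(hourly_loads, float)
    X = np.column_stack([np.ones(len(series) - 24), series[:-24]])
    y = series[24:]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta  # [intercept, slope]

def forecast_24h(beta, current_load):
    return float(beta[0] + beta[1] * current_load)
```

The absolute forecast error of such a model on held-out hours is exactly the kind of figure the paper uses to compare the three techniques.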
Procedia PDF Downloads 467

33593 Development of an Information System Based on the Establishment and Evaluation of Performance Rating by Application Part/Type of Remodeling Element Technologies
Authors: Sungwon Jung
Abstract:
The share of apartment buildings in South Korea that are 20 years old or older is approximately 20% (1.55 million units), and an explosive increase in aged housing is expected around the first planned new towns. Accordingly, we should prepare for social issues such as difficulty of housing lease and degradation of housing performance. Improving the performance of aged houses is essential for achieving the national energy and carbon reduction goals, and we should develop techniques to respond to the changing construction environment. Furthermore, we should develop a performance evaluation system appropriate to residents' demands, such as improving remodeling floor plans through performance improvement in line with the residence types of housing-vulnerable groups such as low-income households and elderly people living alone. For this purpose, remodeling techniques optimized for the target complexes must be disseminated through the development of various business models. In addition, it is necessary to improve the remodeling business by improving the related laws and systems and to prepare techniques to respond to increasing business demand. In other words, performance improvement, evaluation, and knowledge systems need to be researched as new remodeling issues that have not been addressed in existing research.
Keywords: remodeling, performance evaluation, web-based system, big data
Procedia PDF Downloads 223

33592 The Lubrication Regimes Recognition of a Pressure-Fed Journal Bearing by Time and Frequency Domain Analysis of Acoustic Emission Signals
Authors: S. Hosseini, M. Ahmadi Najafabadi, M. Akhlaghi
Abstract:
The health of journal bearings is very important in preventing unforeseen breakdowns in rotary machines, and poor lubrication is one of the most important causes of bearing failure. Hydrodynamic lubrication (HL), mixed lubrication (ML), and boundary lubrication (BL) are the three regimes of journal bearing lubrication. This paper uses the acoustic emission (AE) measurement technique to correlate features of the AE signals with the three lubrication regimes. Transitions from HL to ML are detected based on operating factors such as rotating speed, load, and inlet oil pressure using time-domain and time-frequency-domain signal analysis techniques, and metal-to-metal contacts between the sliding surfaces of the journal and bearing are then identified. A significant difference is found between the theoretical and experimental operating values obtained for defining the lubrication regions.
Keywords: acoustic emission technique, pressure fed journal bearing, time and frequency signal analysis, metal-to-metal contact
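Typical time-domain descriptors of an AE burst can be sketched as follows. The feature choice is illustrative and any thresholds are rig-specific; the paper's regime identification additionally relies on time-frequency analysis, which is not shown here.

```python
import numpy as np

def ae_time_features(signal):
    """Common time-domain descriptors of an acoustic emission record.
    Metal-to-metal contact in the ML/BL regimes tends to raise burst
    peaks relative to the background, which the crest factor captures."""
    x = np.asarray(signal, float)
    rms = float(np.sqrt(np.mean(x ** 2)))   # overall AE energy level
    peak = float(np.max(np.abs(x)))         # largest burst amplitude
    return {
        "rms": rms,
        "peak": peak,
        "crest_factor": peak / rms if rms else 0.0,
    }
```

A steady hydrodynamic film gives a crest factor near that of a smooth signal, while isolated asperity contacts push the peak (and hence the crest factor) up.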
Procedia PDF Downloads 154

33591 Patient-Specific Modeling Algorithm for Medical Data Based on AUC
Authors: Guilherme Ribeiro, Alexandre Oliveira, Antonio Ferreira, Shyam Visweswaran, Gregory Cooper
Abstract:
Patient-specific models are instance-based learning algorithms that take advantage of the particular features of the patient case at hand to predict an outcome. We introduce two patient-specific algorithms based on the decision tree paradigm that use AUC as the metric for selecting an attribute. We apply the patient-specific algorithms to predict outcomes in several datasets, including medical datasets. Compared to the entropy-based patient-specific decision path (PSDP) and CART methods, the AUC-based patient-specific decision path models performed equivalently on area under the ROC curve (AUC). Our results support patient-specific methods as a promising approach for making clinical predictions.
Keywords: instance-based approach, area under the ROC curve, patient-specific decision path, clinical predictions
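The AUC metric used for attribute selection can be computed with the rank-sum (Mann-Whitney) identity. This sketch shows the metric only, not the patient-specific decision-path algorithm itself.

```python
def auc(labels, scores):
    """Area under the ROC curve for binary labels (1 = positive) via
    the Mann-Whitney identity: the fraction of positive/negative pairs
    the scores order correctly, counting ties as half."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

An attribute whose split scores the positives above the negatives approaches AUC 1.0; a completely uninformative attribute sits at 0.5, so ranking candidate attributes by this value is a natural selection criterion.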
Procedia PDF Downloads 476

33590 Real Time Multi Person Action Recognition Using Pose Estimates
Authors: Aishrith Rao
Abstract:
Human activity recognition is an important aspect of video analytics, and many approaches have been proposed to enable action recognition. In this approach, the model identifies the actions of multiple people in the frame and classifies them accordingly. Some approaches use RNNs and 3D CNNs, which are computationally expensive and cannot be trained with the small datasets currently available. Multi-person action recognition is performed in order to understand the positions and actions of the people present in the video frame. The size of the video frame can be adjusted as a hyper-parameter depending on the hardware resources available. OpenPose is used to calculate pose estimates using a CNN that produces heat-maps, one of which provides skeleton features, which are essentially joint features. The features are then extracted, and a classification algorithm is applied to classify the action.
Keywords: human activity recognition, computer vision, pose estimates, convolutional neural networks
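Once per-person keypoints are extracted, classification can be as simple as nearest-template matching. The templates, labels, and coordinates below are invented for illustration; the paper's classifier operates on OpenPose joint features, which this sketch merely stands in for.

```python
import numpy as np

# Hypothetical normalized 2-D keypoints (x, y) per person; in the paper
# these come from OpenPose heat-maps. Labels and values are made up.
TEMPLATES = {
    "standing": np.array([[0.5, 0.9], [0.5, 0.6], [0.5, 0.2]]),  # head, hip, feet
    "lying":    np.array([[0.1, 0.2], [0.5, 0.2], [0.9, 0.2]]),
}

def classify_pose(keypoints):
    """Nearest-template classification of one person's skeleton features:
    a minimal stand-in for the classifier applied after pose extraction."""
    kp = np.asarray(keypoints, float)
    dists = {name: float(np.linalg.norm(kp - tpl))
             for name, tpl in TEMPLATES.items()}
    return min(dists, key=dists.get)
```

Running this per detected person gives the multi-person labelling described above; a learned classifier would replace the fixed templates.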
Procedia PDF Downloads 138

33589 Reducing Crash Risk at Intersections with Safety Improvements
Authors: Upal Barua
Abstract:
Crash risk at intersections is a critical safety issue. This paper examines the effectiveness of removing an existing off-set at an intersection by realignment in reducing crashes. The Empirical Bayes method was applied in a before-and-after study to assess the effect of this safety improvement. The Transportation Safety Improvement Program in the Austin Transportation Department completed several safety improvement projects at high-crash intersections with a view to reducing crashes. One of the common safety improvement techniques applied was the realignment of intersection approaches to remove an existing off-set. This paper illustrates how this technique is applied at a high-crash intersection from inception to completion, and highlights the significant crash reductions achieved, as measured by the Empirical Bayes method in a before-and-after study. The result showed that realignment of intersection approaches removing an existing off-set can reduce crashes by 53%. The paper also features the state-of-the-art techniques applied in planning, engineering, designing, and constructing this safety improvement, the key factors driving its success, and the lessons learned in the process.
Keywords: crash risk, intersection, off-set, safety improvement technique, before-and-after study, empirical Bayes method
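The Empirical Bayes estimate underlying a before-and-after study blends a site's observed crash count with a safety-performance-function (SPF) prediction, so that random high counts are not mistaken for treatable risk. The weighting convention below follows one common formulation (Hauer-style, with a negative-binomial overdispersion parameter) and should be treated as a sketch, since conventions vary between references.

```python
def eb_expected_crashes(observed, predicted, overdispersion):
    """Empirical Bayes estimate of expected crashes at a site: a weighted
    blend of the observed count and the SPF prediction. `overdispersion`
    is the negative-binomial dispersion parameter of the SPF; the exact
    weighting convention is an assumption of this sketch."""
    w = 1.0 / (1.0 + predicted / overdispersion)
    return w * predicted + (1.0 - w) * observed
```

In the before-and-after comparison, the "before" EB estimate replaces the raw before count, and the percentage reduction is computed against the "after" observations, which is how a figure like the 53% above is derived.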
Procedia PDF Downloads 242

33588 Measuring Text-Based Semantics Relatedness Using WordNet
Authors: Madiha Khan, Sidrah Ramzan, Seemab Khan, Shahzad Hassan, Kamran Saeed
Abstract:
Measuring semantic similarity between texts means calculating their semantic relatedness using various techniques. Our web application (Measuring Relatedness of Concepts, MRC) allows the user to input two text corpora and get a semantic similarity percentage between them using WordNet. The application goes through five stages to compute semantic relatedness: Preprocessing (extracts keywords from content), Feature Extraction (classification of words into parts of speech), Synonym Extraction (retrieves synonyms for each keyword), Measuring Similarity (similarity is measured using keywords and synonyms), and Visualization (graphical representation of the similarity measure). The user can thus measure similarity on the basis of features as well. The end result is a percentage score and the word(s) that form the basis of similarity between the texts, with different tools available on the same platform. In future work, we plan a Web-as-a-live-corpus application that provides a simpler, user-friendly tool to compare documents and extract useful information.
Keywords: Graphviz representation, semantic relatedness, similarity measurement, WordNet similarity
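The pipeline's keyword-plus-synonym matching can be caricatured with a toy synonym table standing in for WordNet lookups (in the application these would come from a WordNet interface such as NLTK's). The table and the Jaccard measure used here are illustrative assumptions; the application's actual similarity computation may differ.

```python
# Toy synonym table standing in for WordNet synset lookups.
SYNONYMS = {
    "quick": {"fast", "rapid", "speedy"},
    "fast": {"quick", "rapid"},
    "car": {"automobile", "auto"},
    "automobile": {"car", "auto"},
}

def expand(text):
    """Keywords of a text plus their synonyms (stages 1-3 of the pipeline)."""
    words = set(text.lower().split())
    for w in list(words):
        words |= SYNONYMS.get(w, set())
    return words

def relatedness(text_a, text_b):
    """Jaccard overlap of the synonym-expanded keyword sets, as a percentage."""
    a, b = expand(text_a), expand(text_b)
    return 100.0 * len(a & b) / len(a | b) if a | b else 0.0
```

Because "quick car" and "fast automobile" share no surface words, a plain keyword overlap scores them 0; expanding through synonyms is what lets the measure report them as highly related.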
Procedia PDF Downloads 235

33587 Rendering Cognition Based Learning in Coherence with Development within the Context of PostgreSQL
Authors: Manuela Nayantara Jeyaraj, Senuri Sucharitharathna, Chathurika Senarath, Yasanthy Kanagaraj, Indraka Udayakumara
Abstract:
PostgreSQL is an Object Relational Database Management System (ORDBMS) that has been in existence for a while. Despite the superior features it packages for managing databases and data, the database community has not fully realized the importance and advantages of PostgreSQL. Hence, this research focuses on provisioning a better development environment for PostgreSQL in order to increase its utilization and elucidate its importance. PostgreSQL is also known as the world's most elementary SQL-compliant open-source ORDBMS. Yet users have not turned to PostgreSQL, largely because of the complexity of its persistently textual environment for an introductory user. Simply stated, there is a dire need to explicate an easy way of making users comprehend the procedures and standards with which databases are created, the tables and relationships among them, and the manipulation of queries and their flow based on conditions in PostgreSQL, to help the community adopt PostgreSQL at a greater rate. Hence, this research initially identifies the dominant features provided by PostgreSQL over its competitors. Following the identified merits, an analysis of why the database community is hesitant to migrate to PostgreSQL's environment is carried out. These findings are modulated and tailored based on the scope and the constraints discovered. The research proposes a system that serves as both a design platform and a learning tool, providing an interactive method of learning via a visual editor mode and incorporating a textual editor for well-versed users. The study is based on conjuring viable solutions that analyze a user's cognitive perception in comprehending human-computer interfaces and the behavioural processing of design elements.
By providing a visually draggable and manipulable environment for working with PostgreSQL databases and table queries, it is expected to highlight the elementary features offered by PostgreSQL over existing systems, in order to convey its importance and simplicity to a hesitant user.
Keywords: cognition, database, PostgreSQL, text-editor, visual-editor
Procedia PDF Downloads 282

33586 An E-Retailing System Architecture Based on Cloud Computing
Authors: Chanchai Supaartagorn
Abstract:
E-retailing is the sale of goods over the Internet. The Internet has shrunk the world, and e-retailing is growing at an exponential rate in the Americas, Europe, and Asia. However, e-retailing requires expensive investment in hardware, software, and security systems. Cloud computing technology is Internet-based computing for the management and delivery of applications and services. Cloud-based e-retailing application models allow enterprises to lower their costs through effective implementation of e-retailing activities. In this paper, we describe the concept of cloud computing and present an architecture for cloud computing that incorporates the features of e-retailing. In addition, we propose a strategy for implementing cloud computing with e-retailing. Finally, we explain the benefits of the architecture.
Keywords: architecture, cloud computing, e-retailing, internet-based
Procedia PDF Downloads 394

33585 Comparative Analysis of Spectral Estimation Methods for Brain-Computer Interfaces
Authors: Rafik Djemili, Hocine Bourouba, M. C. Amara Korba
Abstract:
In this paper, we present a method for classifying EEG signals for Brain-Computer Interfaces (BCI). The EEG signals are first processed by spectral estimation methods to derive reliable features before the classification step. The spectral estimation methods used are the standard periodogram and the periodogram calculated by the Welch method; both are compared with Logarithm of Band Power (logBP) features. In the proposed method, we apply Linear Discriminant Analysis (LDA) followed by a Support Vector Machine (SVM). Classification accuracy reached as high as 85%, demonstrating the effectiveness of spectral methods for EEG-based BCI classification.
Keywords: brain-computer interface, motor imagery, electroencephalogram, linear discriminant analysis, support vector machine
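Log band-power features from a standard periodogram can be sketched in plain NumPy; the Welch variant would average periodograms over overlapping segments. Band edges and sampling rate below are illustrative, and the LDA/SVM stage is not shown.

```python
import numpy as np

def log_band_power(signal, fs, bands):
    """Log band-power features from a standard periodogram estimate.
    `bands` is a list of (lo_hz, hi_hz) intervals; each yields one
    feature for the downstream classifier."""
    x = np.asarray(signal, float)
    n = len(x)
    psd = (np.abs(np.fft.rfft(x)) ** 2) / (fs * n)   # periodogram estimate
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    feats = []
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(np.log(np.sum(psd[mask]) + 1e-12))  # epsilon avoids log(0)
    return np.array(feats)
```

For motor imagery, bands around the mu (8-12 Hz) and beta rhythms are the usual choices; the feature vector then feeds LDA followed by the SVM.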
Procedia PDF Downloads 498
33584 Money Laundering and Governance in Cryptocurrencies: The Double-Edged Sword of Blockchain Technology
Abstract:
With the growing popularity of bitcoin transactions, criminals have exploited Bitcoin-like cryptocurrencies, and cybercrimes such as money laundering have thrived. Unlike traditional currencies, these Internet-based virtual currencies can be used anonymously via the underlying blockchain technology. In this paper, we analyze the double-edged-sword features of blockchain technology in the context of money laundering. In particular, the traceability feature of blockchain-based systems facilitates a level of governance, while their decentralization feature may bring governing difficulties. Based on this analysis, we propose guidelines for policy makers in governing blockchain-based cryptocurrency systems.
Keywords: cryptocurrency, money laundering, blockchain, decentralization, traceability
Procedia PDF Downloads 200
33583 Improving Vocabulary and Listening Comprehension via Watching French Films without Subtitles: Positive Results
Authors: Yelena Mazour-Matusevich, Jean-Robert Ancheta
Abstract:
This study is based on more than fifteen years of experience teaching a foreign language, in my case French, to English-speaking students. It represents qualitative research on foreign language learners' reactions and their gains in vocabulary and listening comprehension through repeated viewing of foreign feature films with the original soundtrack but without English subtitles. The initial idea emerged upon the realization that the first challenge faced by my students when they find themselves in a francophone environment has been their lack of listening comprehension. Their inability to understand colloquial speech affects not only their academic performance but their psychological health as well. To remedy this problem, I designed and have applied for many years my own teaching method based on one particular French film, exceptionally well suited, for reasons described in detail in the paper, to intermediate-advanced foreign language learners. This project, conducted together with my undergraduate assistant and mentee J-R Ancheta, aims to show how paralinguistic features, such as characters' facial expressions, settings, music, historical background, and images provided before the actual viewing, offer crucial support and enhance students' listening comprehension. The study, based on student interviews, also offers specific pedagogical techniques, such as 'anticipatory' vocabulary lists and exercises, drills, quizzes, and composition topics that have proven to boost students' performance. For this study, only the listening proficiency and vocabulary gains of the interviewed participants were assessed.
Keywords: comprehension, film, listening, subtitles, vocabulary
Procedia PDF Downloads 623
33582 Lexical Semantic Analysis to Support Ontology Modeling of Maintenance Activities: Case Study of Offshore Riser Integrity
Authors: Vahid Ebrahimipour
Abstract:
Word representation and the contextual meaning of text-based documents play an essential role in knowledge modeling. Business procedures written in natural language are meant to store technical and engineering information, management decisions, and operational experience over the production system life cycle. Contextual meaning representation is highly dependent upon word sense, lexical relativity, and the semantic features of the argument. This paper proposes a method for lexical semantic analysis and contextual meaning representation of maintenance activity in a mass production system. Our approach uses a straightforward lexical semantic analysis of the semantic and syntactic features of a maintenance report's context structure to facilitate translation, interpretation, and conversion of the human-readable text into a computer-readable representation with less heterogeneity and ambiguity. The methodology enables users to obtain a representation format that maximizes shareability and accessibility for multi-purpose usage, and it provides a contextualized structure yielding a generic context model that can be utilized throughout the system life cycle. First, it employs a co-occurrence-based clustering framework to recognize a group of highly frequent contextual features that correspond to a maintenance report text. Then the keywords are identified for syntactic and semantic extraction analysis. The analysis exercises the causality-driven logic of keywords' senses to divulge the structural and meaning dependency relationships between the words in a context. The output is a contextualized word representation of maintenance activity accommodating computer-based representation and inference using OWL/RDF.
Keywords: lexical semantic analysis, metadata modeling, contextual meaning extraction, ontology modeling, knowledge representation
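The co-occurrence step described in this abstract can be illustrated with a minimal sketch: count how often word pairs appear in the same sentence of a maintenance report and keep the most frequent contextual pairs. The sample report text and stopword list below are invented for illustration, not drawn from the paper's corpus:

```python
# Minimal co-occurrence counting over sentences of a (invented) maintenance report.
from collections import Counter
from itertools import combinations

report = (
    "Inspected riser flange for corrosion. "
    "Replaced corroded flange bolts on riser. "
    "Corrosion found near flange weld; scheduled riser repaint."
)

stopwords = {"for", "on", "near", "found"}
cooccur = Counter()
for sentence in report.lower().replace(";", ".").split("."):
    # Unique content words per sentence, with punctuation and stopwords removed.
    words = {w.strip(",. ") for w in sentence.split()} - stopwords - {""}
    for pair in combinations(sorted(words), 2):
        cooccur[pair] += 1

# Highly frequent contextual pairs, e.g. ('corrosion', 'flange'), ('flange', 'riser').
top = cooccur.most_common(3)
```

A real pipeline would add lemmatization and clustering on top of these counts before mapping the resulting keyword structure into OWL/RDF.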
Procedia PDF Downloads 104
33581 Simulation versus Hands-On Learning Methodologies: A Comparative Study for Engineering and Technology Curricula
Authors: Mohammed T. Taher, Usman Ghani, Ahmed S. Khan
Abstract:
This paper compares the findings of two studies conducted to determine the effectiveness of simulation-based learning, hands-on learning, and feedback mechanisms on student learning by answering the following questions: (1) Does the use of simulation improve students' learning outcomes? (2) How do students perceive the instructional design features embedded in the simulation program, such as exploration and scaffolding support, in learning new concepts? (3) What is the effect of feedback mechanisms on students' learning in the use of simulation-based labs? The paper also discusses further findings, which reveal that simulation by itself is not very effective in promoting student learning; simulation becomes effective when it is followed by hands-on activity and feedback mechanisms. Furthermore, the paper presents recommendations for improving student learning through the use of simulation-based, hands-on, and feedback-based teaching methodologies.
Keywords: simulation-based teaching, hands-on learning, feedback-based learning, scaffolding
Procedia PDF Downloads 460
33580 Optimized Weight Selection of Control Data Based on Quotient Space of Multi-Geometric Features
Authors: Bo Wang
Abstract:
The geometric processing of multi-source remote sensing data using control data of different scales and accuracies is an important research direction for multi-platform earth observation systems. In existing block bundle adjustment methods, control information of a single observation scale and precision cannot screen out the control information or assign reasonable and effective weights, which reduces the convergence and reliability of the adjustment results. Drawing on the theory and technology of quotient space, this project researches several subjects. A multi-layer quotient space of multi-geometric features is constructed to describe and filter control data. A normalized granularity merging mechanism for multi-layer control information is studied, and based on the normalized scale factor, a strategy is realized for optimizing the weight selection of control data that is less relevant to the adjustment system. At the same time, geometric positioning experiments are conducted using multi-source remote sensing data, aerial images, and multiple classes of control data to verify the theoretical results. This research is expected to move beyond the convention of single-scale, single-accuracy control data in the adjustment process and to expand the theory and technology of photogrammetry, thereby addressing the processing of multi-source remote sensing data both theoretically and practically.
Keywords: multi-source image geometric process, high precision geometric positioning, quotient space of multi-geometric features, optimized weight selection
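The general idea of weighting control observations of unequal accuracy in an adjustment can be illustrated with a standard weighted least-squares step (this is textbook adjustment computation, not the paper's quotient-space algorithm; all numbers below are invented):

```python
# Weighted least squares: down-weight a low-accuracy control observation
# via a weight matrix P built from an assumed reference (scale) factor.
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])           # design matrix (invented)
l = np.array([2.0, 3.0, 5.5])        # observations (invented)
sigma = np.array([0.01, 0.01, 0.5])  # per-source standard deviations (invented)
sigma0 = 0.01                        # reference standard deviation
P = np.diag((sigma0 / sigma) ** 2)   # weight matrix: w_i = (sigma0 / sigma_i)^2

# Weighted normal equations: (A^T P A) x = A^T P l
x = np.linalg.solve(A.T @ P @ A, A.T @ P @ l)
```

Because the third observation carries a weight of only (0.01/0.5)^2 = 0.0004, the solution stays close to the two high-accuracy observations rather than being dragged toward the inconsistent low-accuracy one.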
Procedia PDF Downloads 283
33579 QSAR, Docking and E-pharmacophore Approach on Novel Series of HDAC Inhibitors with Thiophene Linker as Anticancer Agents
Authors: Harish Rajak, Preeti Patel
Abstract:
HDAC inhibitors can reactivate gene expression and inhibit the growth and survival of cancer cells. 3D-QSAR and pharmacophore modeling studies were performed to identify important pharmacophoric features and correlate 3D chemical structure with biological activity. The pharmacophore hypotheses, each representing the 3D arrangement of molecular features necessary for activity, were developed using the e-pharmacophore script and the Phase module. A series of 55 compounds with well-assigned HDAC inhibitory activity was used for 3D-QSAR model development. The best 3D-QSAR model, a five-PLS-factor model with good statistics and predictive ability, achieved Q2 (0.7293), R2 (0.9811), and a standard deviation of 0.0952. Molecular docking was performed using the histone deacetylase protein (PDB ID: 1t69) and the prepared series of hydroxamic acid based HDAC inhibitors. The docking study of compound 43 shows significant binding interactions: Ser 276 with the oxygen atom of the dioxine cap region, Gly 151 with the amino group, and Asp 267 with the carboxyl group of CONHOH, which are essential for anticancer activity. On docking, most of the compounds exhibited glide score values between -8 and -10.5. We established a structure-activity correlation using docking, energetics-based pharmacophore modeling, and pharmacophore- and atom-based 3D-QSAR models. The results of these studies were further used for the design and testing of new HDAC analogs.
Keywords: docking, e-pharmacophore, HDACIs, QSAR, suberoylanilide hydroxamic acid
Procedia PDF Downloads 298
33578 Identifying the Structural Components of Old Buildings from Floor Plans
Authors: Shi-Yu Xu
Abstract:
The top three risk factors that have contributed to building collapses during past earthquake events in Taiwan are: "irregular floor plans or elevations," "insufficient columns in single-bay buildings," and the "weak-story problem." Fortunately, these unsound structural characteristics can be directly identified from the floor plans. However, due to the vast number of old buildings, conducting manual inspections to identify these compromised structural features in all existing structures would be time-consuming and prone to human errors. This study aims to develop an algorithm that utilizes artificial intelligence techniques to automatically pinpoint the structural components within a building's floor plans. The obtained spatial information will be utilized to construct a digital structural model of the building. This information, particularly regarding the distribution of columns in the floor plan, can then be used to conduct preliminary seismic assessments of the building. The study employs various image processing and pattern recognition techniques to enhance detection efficiency and accuracy. The study enables a large-scale evaluation of structural vulnerability for numerous old buildings, providing ample time to arrange for structural retrofitting in those buildings that are at risk of significant damage or collapse during earthquakes.Keywords: structural vulnerability detection, object recognition, seismic capacity assessment, old buildings, artificial intelligence
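Once a floor plan has been binarised, locating candidate columns reduces to a blob-detection problem; a toy sketch of that step is shown below with connected-component labelling. The abstract's actual pipeline uses trained AI recognisers on real drawings; the grid, blob sizes, and the use of `scipy.ndimage` here are illustrative assumptions:

```python
# Toy column detection on a binarised floor-plan raster: label filled blobs
# as candidate columns and record their centres. The 10x10 grid is invented.
import numpy as np
from scipy import ndimage

plan = np.zeros((10, 10), dtype=int)
plan[1:3, 1:3] = 1   # column at the top-left bay
plan[1:3, 7:9] = 1   # column at the top-right bay
plan[7:9, 1:3] = 1   # column at the bottom-left bay (bottom-right bay is empty)

labels, n_columns = ndimage.label(plan)
centres = ndimage.center_of_mass(plan, labels, range(1, n_columns + 1))
```

The resulting column count and positions are exactly the spatial information the study feeds into a digital structural model, e.g. to flag a single-bay frame with too few columns.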
Procedia PDF Downloads 87
33577 Analysing Techniques for Fusing Multimodal Data in Predictive Scenarios Using Convolutional Neural Networks
Authors: Philipp Ruf, Massiwa Chabbi, Christoph Reich, Djaffar Ould-Abdeslam
Abstract:
In recent years, convolutional neural networks (CNNs) have demonstrated high performance in image analysis, but oftentimes only structured data is available for a specific problem. By interpreting structured data as images, CNNs can effectively learn and extract valuable insights from tabular data, leading to improved predictive accuracy and uncovering hidden patterns that may not be apparent in traditional structured data analysis. Applying a single neural network to analyze multimodal data, i.e., both structured and unstructured information, can yield significant advantages in time complexity and energy efficiency. Converting structured data into images and merging them with existing visual material offers a promising way to apply CNNs to multimodal datasets, as they often occur in a medical context. Using suitable preprocessing techniques, the structured data is transformed into image representations in which the respective features are expressed as different formations of colors and shapes. In an additional step, these representations are fused with existing images to incorporate both types of information. The resulting image is then analyzed using a CNN.
Keywords: CNN, image processing, tabular data, mixed dataset, data transformation, multimodal fusion
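The tabular-to-image transformation described in this abstract can be sketched minimally: scale each feature to the 0-255 range and paint it as a square tile in a grid image a CNN can consume. The tile layout, tile size, and the sample feature row are assumptions for illustration, not the paper's specific encoding:

```python
# Sketch: encode one tabular row as a grayscale tile image for CNN input.
# Grid layout, tile size, and the feature values are illustrative assumptions.
import numpy as np

def row_to_image(row, lo, hi, grid=(2, 2), tile=8):
    """Map a 1-D feature row onto a (grid[0]*tile, grid[1]*tile) uint8 image."""
    scaled = np.clip((np.asarray(row, dtype=float) - lo) / (hi - lo), 0, 1) * 255
    img = np.zeros((grid[0] * tile, grid[1] * tile), dtype=np.uint8)
    for k, value in enumerate(scaled):
        r, c = divmod(k, grid[1])
        # Each feature becomes one solid tile; intensity encodes its value.
        img[r * tile:(r + 1) * tile, c * tile:(c + 1) * tile] = value
    return img

row = [0.2, 0.9, 0.5, 0.1]                 # invented features, already in [0, 1]
img = row_to_image(row, lo=0.0, hi=1.0)    # 16x16 grayscale image
```

In the fusion step the abstract describes, such a tile image would be concatenated with (or stacked as an extra channel onto) the existing medical image before being passed to the CNN.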
Procedia PDF Downloads 122