Search results for: Test automation tools
3917 Deep Reinforcement Learning Approach for Trading Automation in the Stock Market
Authors: Taylan Kabbani, Ekrem Duman
Abstract:
Deep Reinforcement Learning (DRL) algorithms can scale to previously intractable problems. The automation of profit generation in the stock market is possible using DRL, by combining the financial assets price "prediction" step and the portfolio "allocation" step in one unified process to produce fully autonomous systems capable of interacting with their environment to make optimal decisions through trial and error. This work presents a DRL model that generates profitable trades in the stock market, effectively overcoming the limitations of supervised learning approaches. We formulate the trading problem as a Partially Observed Markov Decision Process (POMDP), taking into account the constraints imposed by the stock market, such as liquidity and transaction costs. We then solve the formulated POMDP problem using the Twin Delayed Deep Deterministic Policy Gradient (TD3) algorithm and achieve a 2.68 Sharpe ratio on the test dataset. From the point of view of stock market forecasting and intelligent decision-making, this paper demonstrates the superiority of DRL over other types of machine learning in financial markets and shows its credibility and advantages for strategic decision-making.
Keywords: Autonomous agent, deep reinforcement learning, MDP, sentiment analysis, stock market, technical indicators, twin delayed deep deterministic policy gradient.
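The abstract reports a 2.68 Sharpe ratio on the test set. As a rough illustration of that metric only (not the authors' code), the sketch below computes an annualised Sharpe ratio from a series of daily returns; the zero risk-free rate, the 252-day annualisation factor and the synthetic return series are assumptions.

```python
import numpy as np

def sharpe_ratio(daily_returns, risk_free_rate=0.0, periods_per_year=252):
    """Annualised Sharpe ratio of a series of periodic returns.

    Assumes simple (not log) returns and a risk-free rate expressed per period.
    """
    excess = np.asarray(daily_returns) - risk_free_rate
    return np.sqrt(periods_per_year) * excess.mean() / excess.std(ddof=1)

# Example with synthetic returns (illustrative only).
rng = np.random.default_rng(0)
returns = rng.normal(loc=0.001, scale=0.01, size=252)
print(f"Sharpe ratio: {sharpe_ratio(returns):.2f}")
```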
3916 Tool Tracker: A Toolkit Ensembling Useful Online Networking Tools for Efficient Management and Operation of a Network
Authors: Onkar Bhat Kodical, Sridhar Srinivasan, N.K. Srinath
Abstract:
Tool Tracker is a client-server based application. It is essentially a catalogue of the various network monitoring and management tools that are available online. A database maintained on the server side contains the information about the various tools, and several clients can access and use this information simultaneously. The categories of tools considered for the development of this application are packet sniffers, port mappers, port scanners, encryption tools, vulnerability scanners, etc. The application provides a front end through which the user can invoke any tool from a central repository for the purpose of packet sniffing, port scanning, network analysis, etc. Apart from the tool itself, its description and the help files associated with it are also stored in the central repository. This facility enables the user to view the documentation pertaining to a tool without having to download and install it. The application updates the central repository with the latest versions of the tools, informs the user about the availability of a newer version of the tool currently being used, and gives the user the choice of installing the newer version. Thus Tool Tracker provides the network administrator with much-needed abstraction and ease-of-use with respect to the tools they can use to efficiently monitor a network.
Keywords: Network monitoring, single platform, client/server application, version management.
3915 Survey to Assess the Feasibility of Executing the Web-Based Collaboration Process Using WBCS
Authors: Mohamed A. Sullabi
Abstract:
The importance of formal specification in the software life cycle is hardly hidden from anyone. Formal specifications use mathematical notation to describe the properties of an information system precisely, without unduly constraining the way in which these properties are achieved. Producing a correct, high-quality software specification is not an easy task. This study is concerned with how a group of rectifiers can communicate with each other and work together to prepare and produce a correct formal software specification. WBCS has been implemented based mainly on the proposed supported cooperative work model and on a survey of existing Web-based collaborative writing tools. This paper aims to assess the feasibility of executing the web-based collaboration process using WBCS. The purpose of conducting this test is to evaluate the system as a whole for functionality and fitness for use, based on the evaluation test plan.
Keywords: Formal methods, Formal specifications, collaborative writing, Usability testing.
3914 Importance of Knowledge in the Interdisciplinary Production Processes of Innovative Medical Tools
Authors: Katarzyna Mleczko
Abstract:
Processes for producing innovative medical tools have an interdisciplinary character. They consist of direct and indirect close cooperation between specialists from different scientific branches. The knowledge these specialists hold is important for the design, construction and manufacturing processes they undertake. Knowledge exchange between the participants of these processes is therefore crucial for the final result, which is an innovative medical product. The paper draws attention to the necessity of feedback from the end user to the designer/manufacturer of medical tools, which allows for a more accurate understanding of user needs. The study describes the prerequisites of the production processes of innovative medical (surgical) tools, including the participants and the categories of knowledge resources occurring in these processes. They are the result of research in selected Polish organizations involved in the production of medical instruments and are the basis for further work on the development of a knowledge sharing model for geographically dispersed interdisciplinary teams.
Keywords: Interdisciplinary production processes, knowledge exchange, knowledge sharing, medical tools, user-centered design.
3913 Process Optimization and Automation of Information Technology Services in a Heterogenic Digital Environment
Authors: Tasneem Halawani, Yamen Khateeb
Abstract:
With customers' ever-increasing expectations for fast service provisioning for all their business needs, information technology (IT) organizations, as business partners, have to cope with this demanding environment and deliver their services in the most effective and efficient way. The purpose of this paper is to identify optimization and automation opportunities for the top requested IT services in a heterogenic digital environment with a widely spread customer base. In collaboration with systems, processes, and subject matter experts (SMEs), the processes in scope were approached by analyzing four years of related historical data, identifying and surveying stakeholders, modeling the as-is processes, and studying systems integration/automation capabilities. This effort resulted in identifying several pain areas, including standardization, unnecessary customer and IT involvement, manual steps, systems integration, and performance measurement. These pain areas were addressed by standardizing the top five requested IT services, eliminating/automating 43 steps, and utilizing a single platform for end-to-end process execution. In conclusion, the optimization of IT service request processes in a heterogenic digital environment with a widely spread customer base is challenging, yet achievable without compromising service quality and customers' added value. Further studies can focus on measuring the value of the eliminated/automated process steps to quantify the enhancement impact. Moreover, a similar approach can be utilized to optimize other IT service requests, with a focus on business criticality.
Keywords: Automation, customer value, heterogenic, integration, IT services, optimization, processes.
3912 Assessment of Energy Demand Considering Different Model Simulations in a Low Energy Demand House
Authors: M. Cañada-Soriano, C. Aparicio-Fernández, P. Sebastián Ferrer Gisbert, M. Val Field, J.-L. Vivancos-Bono
Abstract:
The lack of insulation along with the existence of air leakages has a meaningful impact on the energy performance of buildings. Both lead to increases in energy demand through additional heating and/or cooling loads, and they also cause thermal discomfort. In order to quantify these uncontrolled air currents, the Blower Door test can be used. It is a standardized procedure that determines the airtightness of a space by characterizing the rate of air leakage through the envelope surface. In this sense, low-energy buildings complying with the Passive House design criteria are required to achieve high levels of airtightness. Due to the invisible nature of air leakages, additional tools such as infrared thermography are often used to identify where the infiltrations take place. The aim of this study is to assess the airtightness of a typical Mediterranean dwelling house, refurbished under the Passive House standard, using the Blower Door test. Moreover, the building energy performance modelling tools TRNSYS (TRaNsient System Simulation program) and TRNFlow (TRaNsient Flow) have been used to estimate the energy demand in different scenarios. In this sense, a sequential implementation of three different energy improvement measures (insulation thickness, glazing type and infiltrations) has been analyzed.
Keywords: Airtightness, blower door, TRNSYS, infrared thermography, energy demand.
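For readers unfamiliar with the Blower Door result quoted in such studies, the sketch below shows the standard air-change-rate calculation n50 = (air flow at 50 Pa) / (internal air volume). The flow and volume figures are hypothetical, and the 0.6 h⁻¹ Passive House target is cited as general background rather than taken from this abstract.

```python
def n50(air_flow_50pa_m3h: float, internal_volume_m3: float) -> float:
    """Air change rate at 50 Pa (1/h): flow measured by the blower door
    divided by the internal air volume of the dwelling."""
    return air_flow_50pa_m3h / internal_volume_m3

# Hypothetical dwelling: 450 m3/h measured at 50 Pa, 300 m3 of internal volume.
rate = n50(450.0, 300.0)
print(f"n50 = {rate:.2f} 1/h (Passive House target is typically <= 0.6 1/h)")
```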
3911 A Robust Al-Hawalees Gaming Automation using Minimax and BPNN Decision
Authors: Ahmad Sharieh, R Bremananth
Abstract:
Artificial Intelligence based gaming is an interesting topic in state-of-the-art technology. This paper presents an automation of a traditional Omani game called Al-Hawalees. Its related issues are resolved and implemented using an artificial intelligence approach. An AI approach called the minimax procedure is incorporated to generate diverse moves in the on-line game. As the number of moves increases, the time complexity increases proportionally. In order to tackle the time and space complexities, we have employed a back propagation neural network (BPNN), trained off-line, to decide on the resources required to fulfill the automation of the game. We have utilized Levenberg-Marquardt training in order to get a rapid response during gaming. A set of optimal moves is determined by the on-line back propagation training fashioned with alpha-beta pruning. The results and analyses reveal that the proposed scheme can be easily incorporated in an on-line scenario with one player against the system.
Keywords: Artificial neural network, back propagation gaming, Levenberg-Marquardt, minimax procedure.
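The abstract combines the minimax procedure and alpha-beta pruning with a BPNN evaluator, but does not give the rules of Al-Hawalees. The following game-agnostic sketch shows plain minimax with alpha-beta pruning, where the `evaluate` callback is a stand-in for the trained network and `moves`/`apply` are assumed to be supplied by the game implementation.

```python
import math

def alpha_beta(state, depth, alpha, beta, maximizing, moves, apply, evaluate):
    """Generic minimax with alpha-beta pruning.

    moves(state)    -> iterable of legal moves
    apply(state, m) -> successor state
    evaluate(state) -> heuristic score (here a stand-in for the BPNN)
    """
    legal = list(moves(state))
    if depth == 0 or not legal:
        return evaluate(state)
    if maximizing:
        best = -math.inf
        for m in legal:
            best = max(best, alpha_beta(apply(state, m), depth - 1,
                                        alpha, beta, False, moves, apply, evaluate))
            alpha = max(alpha, best)
            if beta <= alpha:      # prune: the opponent will avoid this branch
                break
        return best
    best = math.inf
    for m in legal:
        best = min(best, alpha_beta(apply(state, m), depth - 1,
                                    alpha, beta, True, moves, apply, evaluate))
        beta = min(beta, best)
        if beta <= alpha:
            break
    return best
```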
3910 The Different Ways to Describe Regular Languages by Using Finite Automata and the Changing Algorithm Implementation
Authors: Abdulmajid Mukhtar Afat
Abstract:
This paper aims at introducing finite automata theory and the different ways to describe regular languages, and at creating a program that implements the subset construction algorithm to convert a nondeterministic finite automaton (NFA) to a deterministic finite automaton (DFA). The program is written in the C++ programming language. It reads the FA 5-tuples from a text file and then classifies the automaton as either a DFA or an NFA. For a DFA, the program reads a string w and decides whether it is acceptable or not; if it is accepted, the program saves and reports the tracking path. On the other hand, when the automaton is an NFA, the program converts it to a DFA so that it is easy to track and can decide whether w belongs to the regular language or not.
Keywords: Finite automata, subset construction, DFA, NFA.
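The paper's converter is written in C++ and its code is not given here; as an illustration of the subset construction it describes, the following sketch (in Python, with a made-up example NFA) builds the DFA whose states are sets of NFA states.

```python
from collections import deque

def subset_construction(nfa, start, alphabet):
    """Convert an NFA (dict: state -> symbol -> set of states, no epsilon
    moves) into a DFA whose states are frozensets of NFA states."""
    dfa = {}
    queue = deque([frozenset([start])])
    while queue:
        current = queue.popleft()
        if current in dfa:
            continue
        dfa[current] = {}
        for symbol in alphabet:
            target = frozenset(s for q in current
                               for s in nfa.get(q, {}).get(symbol, set()))
            dfa[current][symbol] = target
            if target not in dfa:
                queue.append(target)
    return dfa

# Tiny NFA accepting strings over {0, 1} that end in "01".
nfa = {"A": {"0": {"A", "B"}, "1": {"A"}}, "B": {"1": {"C"}}, "C": {}}
dfa = subset_construction(nfa, "A", ["0", "1"])
print(len(dfa), "DFA states")
```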
3909 Validation of Automotive Centrals Using Hardware in the Loop-Body Control Unit and Lights
Authors: Marley Rosa Luciano, Rodney Rezende Saldanha
Abstract:
The race for electrification and the need for innovation to attract customers have led the automotive industry to do things differently with vehicles. New emissions-control challenges and the availability of efficient technology are the pillars of creation. The growing demand to upgrade industrial manufacturing systems creates actions that directly impact vehicle production. With this comes the search for new prototyping methods, and virtual tools for component testing and validation of vehicle systems have established themselves. The demand for Electronic Control Units (ECUs) is increasing due to the intelligence and safety features available in today's vehicles, which directly affects their development, performance, and functional testing. In order to keep up with global changes, the automotive industry uses different virtual environments to produce, verify and validate its vehicles and to test the prototypes used during development. Therefore, in this paper, integration and validation were performed using the Hardware in the Loop (HIL) test platform, focusing on the Body Control Module (BCM) ECU. A brief commentary then reviews other test medium platforms, such as the Plywood Buck (PWB), and examines the reliability, flexibility, installation time, and cost of the three test platforms, Software in the Loop (SIL), Model in the Loop (MIL), and HIL, to review their benefits, challenges, and issues in use, and to provide information for optimizing the use of each platform and test medium.
Keywords: Automotive, Electronic Central Unit, xIL, Hardware in the loop.
3908 A Modified Run Length Coding Technique for Test Data Compression Based on Multi-Level Selective Huffman Coding
Authors: C. Kalamani, K. Paramasivam
Abstract:
Test data compression is an efficient method for reducing the test application cost. The problem of reducing test data has been addressed by researchers from three different angles: test data compression, Built-In Self-Test (BIST) and test set compaction. The latter two methods are capable of enhancing fault coverage at the cost of hardware overhead. The drawback of the conventional methods is that, while they can reduce test storage and test power, no additional compression is applied when the test data contain redundant lengths of runs. This paper presents a modified Run Length Coding (RLC) technique combined with a Multi-Level Selective Huffman Coding (MLSHC) technique to reduce test data volume, test pattern delivery time and power dissipation in scan test applications; wherever a redundant length of runs is encountered, the preceding run symbol is replaced with a tiny codeword. Experimental results show that the presented method not only improves test data compression but also reduces the overall test data volume compared to recent schemes. Experiments on the six largest ISCAS-89 benchmarks show that our method outperforms most known techniques.
Keywords: Modified run length coding, multilevel selective Huffman coding, built-in-self-test, modified selective Huffman coding, automatic test equipment.
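The exact codeword tables of the modified RLC/MLSHC scheme are not given in the abstract. As a minimal illustration of the first step such schemes share, the sketch below splits a scan-test bit string into runs of repeated bits, the intermediate form that a selective Huffman stage would then encode; the example vector is made up.

```python
from itertools import groupby

def runs_of_bits(test_vector: str):
    """Split a scan-test bit string into (bit, run_length) pairs,
    the intermediate form a run-length / Huffman coder works on."""
    return [(bit, sum(1 for _ in group)) for bit, group in groupby(test_vector)]

vector = "0000001111000000000011"
print(runs_of_bits(vector))   # [('0', 6), ('1', 4), ('0', 10), ('1', 2)]
```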
3907 Test Data Compression Using a Hybrid of Bitmask Dictionary and 2n Pattern Runlength Coding Methods
Authors: C. Kalamani, K. Paramasivam
Abstract:
In VLSI, testing plays an important role. The major problems in testing are test data volume and test power. An important solution for reducing test data volume and test time is test data compression. The proposed technique combines the bitmask dictionary and 2n pattern run length coding methods and provides a substantial improvement in compression efficiency without introducing any additional decompression penalty. The method has been implemented using MATLAB and an HDL to reduce test data volume and memory requirements. It is applied to various benchmark test sets, and the results are compared with other existing methods. The proposed technique can achieve a compression ratio of up to 86%.
Keywords: Bitmask dictionary, 2n pattern run length code, system-on-chip, SOC, test data compression.
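As one illustrative ingredient of a bitmask-dictionary coder (not the authors' implementation), the sketch below checks whether an incoming test word matches a dictionary entry directly or differs from it in at most a fixed number of bit positions, in which case only the dictionary index and the flipped positions would need to be stored. The dictionary, word length and flip limit are assumptions.

```python
def bitmask_match(word, dictionary, max_flips=2):
    """Return (index, flipped_positions) if `word` can be derived from a
    dictionary entry by flipping at most `max_flips` bits, else None."""
    for idx, entry in enumerate(dictionary):
        flips = [i for i, (a, b) in enumerate(zip(word, entry)) if a != b]
        if len(flips) <= max_flips:
            return idx, flips
    return None

dictionary = ["00001111", "10101010"]
print(bitmask_match("00011111", dictionary))   # (0, [3]): entry 0 with one flipped bit
```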
3906 Intelligent Aid-Analysis Based on the Use of Digital Twin: Application to Electronic Warfare System
Authors: L. Chaussy, M. Nouvel
Abstract:
The workload of system engineers during the Integration, Verification and Validation process of Electronic Warfare Systems (EWS) is growing with the complexity of the systems and with the diversity of the tested cases (the diversity of operational scenarios presented to the EWS). Even if the use of a Digital Twin eases the conception and development phases in terms of planning and test equipment availability, the time needed to analyze test results is still too long and the analysis too complex. The idea for reducing the system engineers' workload and improving test coverage is to introduce intelligent aid-analysis algorithms to improve this step.
Keywords: Analysis tools, automatic testing, digital twin, electronic warfare system.
3905 A Survey on Performance Tools for OpenMP
Authors: Mubarak S. Mohsen, Rosni Abdullah, Yong M. Teo
Abstract:
Advances in processor architecture, such as multicore designs, increase the size and complexity of parallel computer systems. With multi-core architectures there are different parallel languages that can be used to write parallel programs. One of these is OpenMP, which is embedded in C/C++ or FORTRAN. Because of this new architecture and its complexity, it is very important to evaluate the performance of OpenMP constructs, kernels, and application programs on multi-core systems. Performance analysis is the activity of collecting information about the execution characteristics of a program. Performance tools consist of at least three interfacing software layers: instrumentation, measurement, and analysis. The instrumentation layer defines the measured performance events. The measurement layer determines which performance events are actually captured and how they are measured by the tool. The analysis layer processes the performance data and summarizes it into a form that can be displayed by the tool. In this paper, a number of OpenMP performance tools are surveyed, explaining how each is used to collect, analyse, and display performance data.
Keywords: Parallel performance tools, OpenMP, multi-core.
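To make the three-layer description concrete, here is a minimal sketch (in Python rather than an OpenMP tool) of the same structure: an instrumentation layer that marks a code region, a measurement layer that records timings, and an analysis layer that summarizes them. All names are illustrative and none of this comes from the surveyed tools.

```python
import time
from collections import defaultdict

measurements = defaultdict(list)          # measurement layer: raw event data

def instrument(region_name):
    """Instrumentation layer: wrap a code region so its entries/exits are timed."""
    def decorator(func):
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return func(*args, **kwargs)
            finally:
                measurements[region_name].append(time.perf_counter() - start)
        return wrapper
    return decorator

def analyse():
    """Analysis layer: reduce the raw timings into a displayable summary."""
    for region, times in measurements.items():
        print(f"{region}: {len(times)} calls, {sum(times):.6f} s total")

@instrument("parallel_loop")
def work(n):
    return sum(i * i for i in range(n))

for _ in range(3):
    work(100_000)
analyse()
```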
3904 Multidimensional Visualization Tools for Analysis of Expression Data
Authors: Urska Cvek, Marjan Trutschl, Randolph Stone II, Zanobia Syed, John L. Clifford, Anita L. Sabichi
Abstract:
Expression data analysis is based mostly on statistical approaches that are indispensable for the study of biological systems. The large amounts of multidimensional data resulting from high-throughput technologies are not completely served by biostatistical techniques and are usually complemented with visual, knowledge discovery and other computational tools. In many cases, in biological systems we can only speculate on the processes that are causing the changes, and it is during visual explorative analysis of the data that a hypothesis is formed. We would like to show the usability of multidimensional visualization tools and promote their use in the life sciences. We survey and show some of the multidimensional visualization tools in the process of data exploration, such as parallel coordinates and RadViz, and we extend them by combining them with the self-organizing map algorithm. We use a time course data set of transitional cell carcinoma of the bladder in our examples. Analysis of data with these tools has the potential to uncover additional relationships and non-trivial structures.
Keywords: Microarrays, visualization, parallel coordinates, RadViz, self-organizing maps.
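As a concrete illustration of one of the surveyed techniques (not the authors' tool), the RadViz mapping places each dimension as an anchor on the unit circle and projects a sample onto the weighted average of the anchor positions, weighted by its normalized feature values. The sketch below implements that formula for a single non-negative sample.

```python
import math

def radviz_point(sample):
    """Project one non-negative sample onto the RadViz unit circle:
    feature j is an anchor at angle 2*pi*j/m, and the point is the anchor
    average weighted by the normalized feature values."""
    m = len(sample)
    total = sum(sample)
    if total == 0:
        return (0.0, 0.0)
    x = sum(v * math.cos(2 * math.pi * j / m) for j, v in enumerate(sample)) / total
    y = sum(v * math.sin(2 * math.pi * j / m) for j, v in enumerate(sample)) / total
    return (x, y)

print(radviz_point([1.0, 0.0, 0.0, 0.0]))   # lands on the first anchor: (1.0, 0.0)
```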
3903 The Potential of Digital Tools in Art Lessons at Junior School Level to Improve Artistic Ability Using Tamazight Fonts
Authors: Aber Salem Aboalgasm, Rupert Ward
Abstract:
The aim of this research is to explore how pupils in art classes can use creative digital art tools to redesign Tamazight fonts, in order to develop children’s artistic creativity, enable them to learn about a new culture, and help the teacher assess the creativity of pupils in the art class. It can also help students improve their talents in drawing. The study could relate to research in Libya among the Amazigh people (better known as Berbers) and possibly to the development of Tamazight fonts with new uses in art. The research involved students aged 9-10 years old working with digital art tools, and was designed to explore the potential of digital technology by discovering suitable tools and techniques to develop children’s artistic performance using Tamazight fonts. The project also sought to show the aesthetic aspects of these characters and to stimulate the artistic creativity of these young people.
Keywords: Artistic creativity, Tamazight fonts, Technology acceptance model, Traditional and digital art tools.
3902 An Exploratory Study of the Student’s Learning Experience by Applying Different Tools for e-Learning and e-Teaching
Authors: Angel Daniel Muñoz Guzmán
Abstract:
E-learning is becoming more common every day. For online, hybrid or traditional face-to-face programs, there are e-teaching platforms like Google Classroom, Blackboard, Moodle and Canvas, and there are platforms for full e-learning like Coursera, edX or Udemy. These tools are changing the way students acquire knowledge at schools; however, in today’s changing world that is not enough. As students’ needs and skills change and become more complex, new tools will need to be added to keep them engaged and to realize their learning potential. This is especially important in the current global situation that is changing everything: the Covid-19 pandemic. Due to Covid-19, education had to make an unexpected switch from face-to-face courses to digital courses. In this study, the students’ learning experience is analyzed by applying different e-tools and following the Tec21 Model and a flexible and digital model, both developed by the Tecnologico de Monterrey University. The evaluation of the students’ learning experience has been made with the quantitative PrEmo method of emotions. Findings suggest that the quantity of e-tools used during a course does not affect the students’ learning experience as much as how the teacher links every available tool and makes them work as one in order to keep the student engaged and motivated.
Keywords: Student, experience, e-learning, e-teaching, e-tools, technology, education.
3901 Assertion-Driven Test Repair Based on Priority Criteria
Authors: Ruilian Zhao, Shukai Zhang, Yan Wang, Weiwei Wang
Abstract:
Repairing broken test cases is an expensive and challenging task in evolving software systems. Although an automated repair technique with intent preservation has been proposed, it does not take into account the association between test repairs and assertions, leading to a large number of irrelevant candidates and decreasing the repair capability. This paper proposes an assertion-driven test repair approach. Furthermore, an intent-oriented priority criterion is introduced to guide repair candidate generation, making the repairs closer to the intent of the test. In more detail, repair targets are determined through post-dominance relations between assertions and the methods that directly cause compilation errors. Then, test repairs are generated from the targets in a bottom-up way, guided by the intent-oriented priority criteria. Finally, the generated repair candidates are prioritized to match the original test intent. The approach is implemented and evaluated on a benchmark of 4 open-source programs and 91 broken test cases. The results show that the approach can fix 89% (81/91) of the broken test cases, which is more effective than the existing intent-preserving test repair approach, and that our intent-oriented priority criteria work well.
Keywords: Test repair, test intent, software test, test case evolution.
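The paper's priority criteria rest on post-dominance relations and test intent, which a short snippet cannot reproduce. Purely as a loose, illustrative stand-in (not the paper's algorithm), the sketch below ranks candidate replacement calls by textual similarity to the broken call, so that repairs closest to the original call are tried first; the candidate strings are invented.

```python
from difflib import SequenceMatcher

def prioritize(broken_call, candidates):
    """Rank candidate replacement calls by similarity to the broken call,
    a crude proxy for 'closeness to the original test intent'."""
    return sorted(candidates,
                  key=lambda c: SequenceMatcher(None, broken_call, c).ratio(),
                  reverse=True)

candidates = ["getUserById(id)", "getUser(id, true)", "removeUser(id)"]
print(prioritize("getUser(id)", candidates))
```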
3900 Computer Assisted Learning in a Less Resource Region
Authors: Hamidullah Sokout, Samiullah Paracha, Abdul Rashid Ahmadi
Abstract:
Passing the entrance exam to a university is a major step in one's life. The university entrance exam, commonly known as the Kankor, is the nationwide entrance exam in Afghanistan. This examination is a prerequisite for all public and private higher education institutions at the undergraduate level. It is usually taken by students who have graduated from high school. In this paper, we reflect on the major issues facing school graduates and propose an ICT-based test preparation environment, known as the ‘Online Kankor Exam Prep System’, to give students the tools to help them pass the university entrance exam on the first try. The system is based on an Intelligent Tutoring System (ITS), which provides an essential package of educational technology for learners that features: (i) exam-focused questions and content; (ii) a self-assessment environment; and (iii) test preparation strategies, in order to help students acquire the necessary skills for their career and keep them up to date with instruction.
Keywords: Web-based test prep systems, Learner-centered design, E-Learning, Intelligent tutoring system.
3899 Students’ Perceptions of the Use of Social Media in Higher Education in Saudi Arabia
Authors: Omar Alshehri, Vic Lally
Abstract:
This paper examined students’ attitudes toward using social media tools to support learning at a university in Saudi Arabia. Moreover, it investigated the students’ current usage of these tools and examined the barriers they could face during the use of social media tools in the education process. Participants in this study were 42 university students. A web-based survey was used to collect the data. The results indicate that all of the students were familiar with social media and had used at least one type of social media for learning. All students had very positive attitudes towards the use of social media and welcomed these tools as a supplement to the curriculum. However, the results indicated that the major barriers to using these tools in learning were distraction, conflict with Islamic religious teachings, privacy issues, and cyberbullying. The study recommends that it be replicated at other Saudi universities to investigate the factors and barriers that might affect Saudi students’ attitudes toward using social media to support learning.
Keywords: Saudi Arabia, social media, benefits of social media use, barriers to social media use, higher education.
3898 Improving the Effectiveness of Software Testing through Test Case Reduction
Authors: R. P. Mahapatra, Jitendra Singh
Abstract:
This paper proposes a new technique for improving the efficiency of software testing, based on a conventional attempt to reduce the test cases that have to be executed for any given software. The approach exploits the advantage of regression testing, where fewer test cases lessen the time consumption of testing as a whole. The technique also offers a means to perform test case generation automatically. Compared to one of the techniques in the literature, where the tester has no option but to perform test case generation manually, the proposed technique provides a better option. As for test case reduction, the technique uses simple algebraic conditions to assign fixed values to variables (maximum, minimum and constant values). By doing this, the variable values are limited to a definite range, resulting in fewer possible test cases to process. The technique can also be applied to program loops and arrays.
Keywords: Software testing, test case generation, test case reduction.
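In the spirit of the described reduction (though not the authors' exact algorithm), the sketch below enumerates only representative values per variable, such as the minimum, maximum and a constant, instead of the full input domain; the variable names and value sets are hypothetical.

```python
from itertools import product

def reduced_test_cases(variables):
    """Generate test cases using only representative values per variable
    (e.g. minimum, maximum and a constant), instead of the full input domain."""
    names = list(variables)
    return [dict(zip(names, combo))
            for combo in product(*(variables[n] for n in names))]

# Hypothetical inputs: each variable reduced to {min, max, constant}.
cases = reduced_test_cases({"x": [0, 100, 42], "y": [-10, 10, 0]})
print(len(cases), "test cases instead of the full cartesian input domain")
```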
3897 A Promising Approach to Supporting Knowledge-Intensive Business Processes: Business Case Management
Authors: Zeljko Panian
Abstract:
Through the course of this paper we define Business Case Management and its characteristics, and highlight its link to knowledge workers. Business Case Management combines knowledge and process effectively, supports the ad hoc and unpredictable nature of cases, and coordinates a range of other technologies to appropriately support knowledge-intensive processes. We emphasize the growing importance of knowledge workers and the currently poor support for knowledge work automation. We also discuss the challenges in supporting this kind of knowledge work and propose a novel approach to overcome them.
Keywords: Knowledge management, knowledge workers, business process management, business case management, automation.
3896 Reducing Test Vectors Count Using Fault Based Optimization Schemes in VLSI Testing
Authors: Vinod Kumar Khera, R. K. Sharma, A. K. Gupta
Abstract:
Power dissipation increases exponentially during test mode compared to the normal operation of a circuit; in extreme cases, test power is more than twice the power consumed during normal operation. The test vector generation scheme is a key component in deciding how power-hungry a circuit is during testing, since the test vector count and the consequent leakage current are functions of the test vector generation scheme. A fault-based optimization of the test vector count is presented in this work. It helps in reducing the test vector count and the leakage current. In the presented scheme, test vectors are reduced by extracting essential child vectors. The scheme has been tested experimentally using stuck-at fault models, and the results confirm the reduction in test vector count.
Keywords: Low power VLSI testing, independent fault, essential faults, test vector reduction.
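The notion of keeping only essential vectors can be illustrated with a classical greedy cover over a fault table (a generic sketch, not the authors' scheme): repeatedly keep the vector that detects the most not-yet-covered faults. The fault table below is invented.

```python
def reduce_vectors(fault_table):
    """Greedy selection of test vectors: `fault_table` maps a vector to the
    set of faults it detects; keep vectors until every fault is covered."""
    uncovered = set().union(*fault_table.values())
    kept = []
    while uncovered:
        best = max(fault_table, key=lambda v: len(fault_table[v] & uncovered))
        if not fault_table[best] & uncovered:
            break   # remaining faults are undetectable by the given vector set
        kept.append(best)
        uncovered -= fault_table[best]
    return kept

table = {"v1": {"f1", "f2"}, "v2": {"f2", "f3"}, "v3": {"f3"}, "v4": {"f1", "f2", "f3"}}
print(reduce_vectors(table))   # ['v4'] covers all three faults
```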
3895 Determination and Comparison of Fabric Pills Distribution Using Image Processing and Spatial Data Analysis Tools
Authors: Lenka Techniková, Maroš Tunák, Jiří Janáček
Abstract:
This work deals with the determination and comparison of pill patterns in two sets of fabric samples that differ in the way the pills were created. The first set contains fabric samples with pills created by simulation on a Martindale abrasion machine, while the pills in the second set originated during normal wear and maintenance. The goal of the study is to determine whether the pattern of fabric pills created by simulation is the same as the pattern of naturally occurring pills. The system for determining and comparing the pills is based on image processing and spatial data analysis tools. Firstly, 3D reconstruction of the fabric surfaces with the pills is realized using a gradient fields method, which creates a 3D fabric surface from a set of four images. Thereafter, the pills are detected in the 3D fabric surfaces using image-processing tools in the MATLAB software. Determination and comparison of the pill patterns of the two sets of fabric samples is based on spatial data analysis using tools in the R software.
Keywords: 3D reconstruction of the surface, image analysis tools, distribution of the pills, spatial data analysis tools.
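The spatial comparison itself was done with tools in R; as a minimal stand-alone illustration of the kind of statistic used to compare point patterns (not the authors' workflow), the sketch below computes the Clark-Evans ratio of detected pill centres, which compares the observed mean nearest-neighbour distance with the value expected for a random pattern of the same density. The coordinates and sample area are hypothetical, and edge corrections are ignored.

```python
import math

def clark_evans(points, area):
    """Clark-Evans ratio: observed mean nearest-neighbour distance divided by
    the value expected under complete spatial randomness.
    R ~ 1: random, R < 1: clustered, R > 1: regular."""
    n = len(points)
    nearest = []
    for i, (xi, yi) in enumerate(points):
        nearest.append(min(math.hypot(xi - xj, yi - yj)
                           for j, (xj, yj) in enumerate(points) if j != i))
    observed = sum(nearest) / n
    expected = 0.5 / math.sqrt(n / area)
    return observed / expected

# Hypothetical pill centres (in mm) detected on a 100 x 100 mm sample.
pills = [(10, 12), (14, 15), (60, 70), (62, 68), (30, 80)]
print(f"Clark-Evans R = {clark_evans(pills, 100 * 100):.2f}")
```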
3894 Performance Evaluation of an Aboveground LNG Storage Tank Cover using Nondestructive and Destructive Tests
Authors: Sungnam Hong, Sun-Kyu Park, Jieun Jeong, Jinwoong Choi
Abstract:
In this study, a new procedure for inspecting damage on LNG storage tanks is proposed, using structural diagnostic techniques: nondestructive inspection techniques such as macrography, the hammer sounding test, the Schmidt hammer test, and the ultrasonic pulse velocity test, and destructive inspection techniques such as the compressive strength test, the chloride penetration test, and the carbonation test. From the analysis of all the test results, it was concluded that the LNG storage tank cover was in good condition. The results were also compared with the Korean concrete standard specifications and design values. In addition, the remaining life of the LNG storage tank was estimated by using existing models. Based on the results, an LNG storage tank cover performance evaluation procedure is suggested.
Keywords: Destructive test, LNG storage tank, Nondestructive test, Performance evaluation procedure, Remaining life.
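The abstract states that the remaining life was estimated with existing models without naming them. One widely used carbonation model, given here as general background rather than as the authors' choice, assumes the carbonation depth grows as d = k·sqrt(t), so the time for the front to reach the reinforcement cover c is (c/k)². All measurement values in the sketch are hypothetical.

```python
import math

def carbonation_remaining_life(depth_mm, age_years, cover_mm):
    """Remaining service life from the d = k*sqrt(t) carbonation model:
    estimate k from the measured depth and current age, then find when the
    carbonation front reaches the concrete cover."""
    k = depth_mm / math.sqrt(age_years)          # mm per sqrt(year)
    total_life = (cover_mm / k) ** 2             # years until depth equals cover
    return max(total_life - age_years, 0.0)

# Hypothetical measurements: 12 mm carbonation depth at 25 years, 40 mm cover.
print(f"Remaining life ~ {carbonation_remaining_life(12, 25, 40):.0f} years")
```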
3893 Using ε Value in Describe Regular Languages by Using Finite Automata, Operation on Languages and the Changing Algorithm Implementation
Authors: Abdulmajid Mukhtar Afat
Abstract:
This paper aims at introducing nondeterministic finite automata with ε value, which are used to perform some operations on languages. A program is created to implement the algorithm that converts a nondeterministic finite automaton with ε value (ε-NFA) to a deterministic finite automaton (DFA). The program is written in the C++ programming language. The program reads the FA 5-tuples from a text file and then classifies the automaton as a DFA, an NFA, or an ε-NFA. For a DFA, the program gets the string w and decides whether it is accepted or rejected; the tracking path for an accepted string is saved by the program. In the case of an NFA or ε-NFA, the program converts the automaton to a DFA to enable tracking and to decide whether the string w belongs to the regular language or not.
Keywords: Finite automata, DFA, NFA, ε-NFA, Eclose, operations on languages.
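The key extra step when handling ε-NFAs is the ε-closure (the "Eclose" of the keywords): the set of states reachable through ε moves alone. The following small sketch computes it with a work-list; the example transition map is made up.

```python
def epsilon_closure(states, eps_moves):
    """Set of NFA states reachable from `states` using only epsilon moves.
    `eps_moves` maps a state to the states reachable by a single epsilon move."""
    closure = set(states)
    stack = list(states)
    while stack:
        q = stack.pop()
        for r in eps_moves.get(q, set()):
            if r not in closure:
                closure.add(r)
                stack.append(r)
    return closure

eps = {"A": {"B"}, "B": {"C"}}
print(epsilon_closure({"A"}, eps))   # {'A', 'B', 'C'}
```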
3892 Scope and Application of Collaborative Tools and Digital Manufacturing in Dentistry
Authors: S. Mohan Kumar, Rajashekar Patil, Tanuja Ajit Desphande
Abstract:
It is necessary to incorporate the technological advances achieved in the field of engineering into dentistry in order to enhance diagnosis and treatment planning and enable doctors to render better treatment to their patients. To achieve this ultimate goal, long-distance collaborations are often necessary. This paper discusses various collaborative tools and their applications to solve a few pressing problems confronting dentists. Customization is often the solution to most of these problems, but rapid design, development and cost-effective manufacturing are difficult to achieve. This problem can be solved using digital manufacturing. Cases from six major branches of dentistry are discussed, and possible solutions with the help of state-of-the-art rapid digital manufacturing are proposed in the present paper. The paper also describes the use of existing tools in the collaborative and digital manufacturing area.
Keywords: Customisation, collaborative tools, dentistry, digital manufacturing.
3891 Proposing a Conceptual Model of Customer Knowledge Management: A Study of CKM Tools in British Dotcoms
Authors: Mehdi Shami Zanjani, Roshanak Rouzbehani, Hosein Dabbagh
Abstract:
Although the current competitive challenges induced by today's digital economy place their main emphasis on organizational knowledge, customer knowledge has been overlooked. On the other hand, the business community has finally begun to realize the important role customer knowledge can play within the organizational boundaries of the corporate arena. As a result, there is an emerging market for tools and utilities whose objective is to provide the intelligence for knowledge sharing between businesses and their customers. In this paper, we present a conceptual model of customer knowledge management by identifying and analyzing the existing tools in the market. The focus is upon the emerging British dotcom industry, whose customer-based B2C behavior has been an influential part of the knowledge-based intelligence tools in existence today.
Keywords: Customer knowledge, customer knowledge management, knowledge management, B2C E-commerce.
3890 Analysis of Factors Used by Farmers to Manage Risk: A Case Study on Italian Farms
Authors: A. Pontrandolfi, G. Enjolras, F. Capitanio
Abstract:
The study analyses the strategies Italian farmers use to cope with the risks facing their production. We specifically explore the potential and the limitations of the economic tools for climatic risk management in agriculture under the Common Agricultural Policy 2014-2020, which foresees contributions to economic tools for risk management, in relation to farms' needs and the exposure and vulnerability of agricultural areas to climatic risk. At the farm level, we consider approaches to hedging risk in terms of the use of technical tools (agricultural practices, pesticides, fertilizers, irrigation) and economic/financial instruments (insurance, etc.). We develop cross-sectional and longitudinal analyses as well as correlation analyses that underline the main differences in the way farms adapt their structure and management towards risk. The results show a preference for technical tools, despite the presence of substantial public aid for economic tools such as insurance. Therefore, there is a strong need for a more effective and integrated risk management policy scheme. Synergies between economic tools and risk reduction actions of a more technical, structural and managerial nature (production diversification, irrigation infrastructure, technological and management innovations, and training-information-consultancy, etc.) are emphasized.
Keywords: Agriculture and climate change, climatic risk management, insurance schemes.
3889 Exploring the Combinatorics of Motif Alignments for Accurately Computing E-values from P-values
Authors: T. Kjosmoen, T. Ryen, T. Eftestøl
Abstract:
In biological and biomedical research, motif finding tools are important for locating regulatory elements in DNA sequences. Many such motif finding tools are available, and they often yield position weight matrices together with significance indicators. These indicators, p-values and E-values, describe the likelihood that a motif alignment is generated by the background process and the expected number of occurrences of the motif in the data set, respectively. The various tools often estimate these indicators differently, making them not directly comparable. One approach for comparing motifs from different tools is to compute the E-value as the product of the p-value and the number of possible alignments in the data set. In this paper we explore the combinatorics of the motif alignment models OOPS, ZOOPS, and ANR, and propose a generic algorithm for computing the number of possible combinations accurately. We also show that using the wrong alignment model can give E-values that diverge significantly from their true values.
Keywords: Motif alignment, combinatorics, p-value, E-value, OOPS, ZOOPS, ANR.
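The abstract defines the E-value as the p-value times the number of possible alignments, and warns that naive counts can be misleading. The sketch below uses exactly such naive per-model counts (one site per sequence for OOPS, zero or one for ZOOPS; ANR is omitted) purely to illustrate the E = p × N relation; it is not the paper's exact combinatorics, and the sequence lengths, motif width and p-value are invented.

```python
from math import prod

def alignment_count(seq_lengths, motif_width, model="OOPS"):
    """Naive count of possible motif alignments.
    OOPS : exactly one site per sequence -> prod(L - w + 1)
    ZOOPS: zero or one site per sequence -> prod(L - w + 2)
    (The ANR model is omitted here; its count grows combinatorially.)"""
    w = motif_width
    if model == "OOPS":
        return prod(L - w + 1 for L in seq_lengths)
    if model == "ZOOPS":
        return prod(L - w + 2 for L in seq_lengths)
    raise ValueError("unsupported model")

def e_value(p_value, seq_lengths, motif_width, model="OOPS"):
    """E-value = p-value * number of possible alignments in the data set."""
    return p_value * alignment_count(seq_lengths, motif_width, model)

lengths = [200, 180, 220]          # hypothetical sequence lengths
print(e_value(1e-8, lengths, motif_width=10, model="OOPS"))
print(e_value(1e-8, lengths, motif_width=10, model="ZOOPS"))
```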
3888 Simulation Tools for Fixed Point DSP Algorithms and Architectures
Authors: K. B. Cullen, G. C. M. Silvestre, N. J. Hurley
Abstract:
This paper presents software tools that convert the C/C++ floating point source code for a DSP algorithm into a fixed-point simulation model that can be used to evaluate the numerical performance of the algorithm on several different fixed point platforms, including microprocessors, DSPs and FPGAs. The tools use a novel system for maintaining binary point information so that the conversion from floating point to fixed point is automated and the resulting fixed point algorithm achieves the maximum possible precision. A configurable architecture is used during the simulation phase so that the algorithm can produce a bit-exact output for several different target devices.
Keywords: DSP devices, DSP algorithm, simulation model, software
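Independently of the authors' tools, the core of a float-to-fixed conversion that preserves maximum precision can be sketched as follows: choose the largest number of fractional bits that still fits the data range into the word length, then quantize. The 16-bit word length and the sample values are assumptions, and boundary rounding is ignored in this simplification.

```python
import math

def to_fixed(values, word_bits=16):
    """Quantize floats to signed fixed point: pick the largest number of
    fractional bits that still represents the data range, then scale and round.
    (Boundary rounding that could overflow by one LSB is ignored in this sketch.)"""
    max_abs = max(abs(v) for v in values)
    magnitude_bits = max(0, math.floor(math.log2(max_abs)) + 1) if max_abs > 0 else 0
    frac_bits = word_bits - magnitude_bits - 1        # one bit reserved for the sign
    scale = 1 << frac_bits
    quantized = [round(v * scale) for v in values]    # the integers actually stored
    return quantized, frac_bits

ints, frac_bits = to_fixed([0.5, -1.25, 3.14159])
print(ints, f"(Q{16 - frac_bits}.{frac_bits} format)")
print([q / (1 << frac_bits) for q in ints])           # reconstructed values
```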