Search results for: paper pencil based testing
44881 Integrating Data Mining with Case-Based Reasoning for Diagnosing Sorghum Anthracnose
Authors: Mariamawit T. Belete
Abstract:
Cereal production and marketing are the means of livelihood for millions of households in Ethiopia. However, cereal production is constrained by technical and socio-economic factors. Among the technical factors, cereal crop diseases are the major contributors to low yield. The aim of this research is to develop an integrated data mining and knowledge-based system for sorghum anthracnose disease diagnosis that assists agriculture experts and development agents in making timely decisions. The anthracnose diagnosis system gathers information from the Melkassa Agricultural Research Center and scores anthracnose on a severity scale. Empirical research is designed for data exploration, modeling, and confirmatory procedures for testing hypotheses and predictions to draw a sound conclusion. WEKA (Waikato Environment for Knowledge Analysis) was employed for the modeling. Knowledge-based systems employ a variety of approaches depending on the knowledge representation method; case-based reasoning (CBR) is one of the most popular. CBR is a problem-solving strategy that uses previous cases to solve new problems. The system utilizes hidden knowledge extracted by employing clustering algorithms, specifically K-means clustering, from a sampled anthracnose dataset. Clustered cases with centroid values are mapped to jCOLIBRI, and the integrator application is created using NetBeans with JDK 8.0.2. The main stages of a case-based reasoning model are retrieval (the similarity-measuring stage), reuse (which allows the domain expert to adapt the retrieved case solution to the current case), revision (to test the solution), and retention (to store the confirmed solution in the case base for future use). Evaluation of the system was done for both system performance and user acceptance. For testing the prototype, seven test cases were used. Experimental results show that the system achieves average precision and recall values of 70% and 83%, respectively. User acceptance testing was also performed with five domain experts, and an average acceptance of 83% was achieved. Although the results of this study are promising, further investigation of hybrid approaches, such as rule-based reasoning and a pictorial retrieval process, is recommended.
Keywords: sorghum anthracnose, data mining, case based reasoning, integration
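The clustering-plus-retrieval pipeline described above can be illustrated with a short sketch. The code below is a minimal illustration, not the authors' WEKA/jCOLIBRI implementation: it clusters a toy anthracnose case base with K-means and retrieves the nearest cluster centroid for a new case; the feature names and values are hypothetical.

```python
# Minimal sketch of K-means clustering followed by nearest-centroid case retrieval.
# Hypothetical features: lesion severity score, humidity [%], temperature [C], plant age [weeks].
import numpy as np
from sklearn.cluster import KMeans

case_base = np.array([
    [1.0, 60.0, 24.0, 6.0],
    [4.5, 85.0, 28.0, 10.0],
    [2.0, 65.0, 25.0, 7.0],
    [5.0, 90.0, 29.0, 12.0],
])

# Cluster the case base; each centroid summarizes a group of similar cases.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(case_base)

def retrieve(new_case):
    """Return the index of the most similar centroid (the 'retrieve' stage of CBR)."""
    distances = np.linalg.norm(kmeans.cluster_centers_ - new_case, axis=1)
    return int(np.argmin(distances))

new_case = np.array([4.0, 88.0, 27.0, 11.0])
print("Most similar cluster:", retrieve(new_case))
```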
Procedia PDF Downloads 81
44880 Testing the Capital Structure Behavior of Malaysian Firms: Shariah vs. Non-Shariah Compliant
Authors: Asyraf Abdul Halim, Mohd Edil Abd Sukor, Obiyathulla Ismath Bacha
Abstract:
This paper investigates the capital structure behavior of Shariah compliant firms of various levels, as well as firms that are consistently Shariah non-compliant, in Malaysia. The paper utilizes a unique dataset of firms with heterogeneous levels of Shariah-compliance status over a 20-year period from 1997 to 2016. The paper focuses on the effects of dynamic forces behind capital structure variation, such as optimal capital structure behavior based on the trade-off, pecking order, market timing, and firm fixed effect models of capital structure. This study documents significant evidence in support of the trade-off theory with a high speed of adjustment (SOA), as well as for time-invariant firm fixed effects, across all Shariah compliance groups.
Keywords: capital structure, market timing, trade-off theory, equity risk premium, Shariah-compliant firms
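The speed of adjustment mentioned above is commonly estimated with a partial-adjustment regression, Lev(i,t) = (1 - λ)Lev(i,t-1) + λβ'X(i,t) + ε, where λ is the SOA and is recovered as one minus the coefficient on lagged leverage. The sketch below is an illustrative specification on a simulated panel, not the authors' model or data; the variable names are assumptions.

```python
# Illustrative partial-adjustment regression: SOA = 1 - coefficient on lagged leverage.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for firm in range(50):
    lev = 0.3
    for year in range(20):
        size = rng.normal(10, 1)
        target = 0.2 + 0.02 * size                               # hypothetical target leverage
        lev = lev + 0.4 * (target - lev) + rng.normal(0, 0.02)   # true SOA = 0.4
        rows.append({"firm": firm, "year": year, "lev": lev, "size": size})

df = pd.DataFrame(rows)
df["lev_lag"] = df.groupby("firm")["lev"].shift(1)
df = df.dropna()

# Pooled OLS with firm fixed effects via dummies.
model = smf.ols("lev ~ lev_lag + size + C(firm)", data=df).fit()
print("Estimated speed of adjustment:", 1 - model.params["lev_lag"])
```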
Procedia PDF Downloads 312
44879 User-Based Cannibalization Mitigation in an Online Marketplace
Authors: Vivian Guo, Yan Qu
Abstract:
Online marketplaces are not only digital places where consumers buy and sell merchandise, but also destinations for brands to connect with real consumers at the moment when customers are in the shopping mindset. For many marketplaces, brands have been important partners through advertising. There is, however, a risk of advertising impacting a consumer’s shopping journey if it hurts the user experience or takes the user away from the site. Both could lead to the loss of transaction revenue for the marketplace. In this paper, we present user-based methods for cannibalization control by selectively turning off ads for users who are likely to be cannibalized by ads, subject to business objectives. We present ways of measuring cannibalization of advertising in the context of an online marketplace and propose novel ways of measuring cannibalization through purchase propensity and uplift modeling. A/B testing has shown that our methods can significantly improve user purchase and engagement metrics while operating within business objectives. To our knowledge, this is the first paper that addresses cannibalization mitigation at the user level in the context of advertising.
Keywords: cannibalization, machine learning, online marketplace, revenue optimization, yield optimization
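One way to operationalize the purchase-propensity and uplift ideas above is a simple two-model ("T-learner") approach: fit one purchase model for users exposed to ads and one for unexposed users, score the difference as estimated uplift, and turn ads off for users whose predicted uplift is negative. The sketch below is a generic illustration on synthetic data, not the authors' production method; all features and effects are assumed.

```python
# Two-model (T-learner) uplift sketch: predicted uplift = P(buy | ad) - P(buy | no ad).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(1)
n = 5000
X = rng.normal(size=(n, 4))                      # hypothetical user features
treated = rng.integers(0, 2, size=n)             # 1 = ad shown
base = 1 / (1 + np.exp(-X[:, 0]))                # baseline purchase propensity
effect = 0.1 * X[:, 1]                           # heterogeneous ad effect (can be negative)
buy = (rng.random(n) < np.clip(base + treated * effect, 0, 1)).astype(int)

model_t = GradientBoostingClassifier().fit(X[treated == 1], buy[treated == 1])
model_c = GradientBoostingClassifier().fit(X[treated == 0], buy[treated == 0])

uplift = model_t.predict_proba(X)[:, 1] - model_c.predict_proba(X)[:, 1]
turn_off_ads = uplift < 0    # users for whom ads are predicted to cannibalize purchases
print("Share of users with ads turned off:", turn_off_ads.mean())
```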
Procedia PDF Downloads 160
44878 Fuzzy Based Stabilizer Control System for Quad-Rotor
Authors: B. G. Sampath, K. C. R. Perera, W. A. S. I. Wijesuriya, V. P. C. Dassanayake
Abstract:
In this paper, the design, development, and testing of a stabilizer control system for a quad-rotor, focused on maneuverability, are presented. The mechanical design is performed along with the design of the control algorithm, which is devised using a fuzzy logic controller. The inputs for the system are the angular positions and angular rates of the quad-rotor relative to the three axes. The output data from an accelerometer and a gyroscope are filtered through a Kalman filter. In the development of the stability control system, the Mamdani fuzzy model is incorporated. The results show that the fuzzy-based stabilizer control system is superior under high dynamic disturbances compared to traditional systems that use PID-based stabilizer control.
Keywords: fuzzy stabilizer, maneuverability, PID, quad-rotor
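The Kalman filtering step mentioned above (fusing accelerometer and gyroscope readings into angle and rate estimates before they enter the fuzzy controller) can be sketched as a one-axis filter. This is a generic textbook-style filter with assumed noise parameters and sample data, not the authors' implementation.

```python
# One-axis Kalman filter: state = [angle, gyro bias]; the gyro rate drives prediction,
# the accelerometer-derived angle is the measurement. All noise values are assumed.
import numpy as np

dt = 0.01                       # sample period [s]
Q = np.diag([1e-4, 1e-6])       # process noise (angle, bias)
R = np.array([[1e-2]])          # accelerometer angle measurement noise
H = np.array([[1.0, 0.0]])

x = np.zeros((2, 1))            # [angle, gyro bias]
P = np.eye(2)

def kalman_step(x, P, gyro_rate, accel_angle):
    # Predict: integrate the bias-corrected gyro rate.
    F = np.array([[1.0, -dt], [0.0, 1.0]])
    B = np.array([[dt], [0.0]])
    x = F @ x + B * gyro_rate
    P = F @ P @ F.T + Q
    # Update with the accelerometer-derived angle.
    y = np.array([[accel_angle]]) - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

for gyro_rate, accel_angle in [(0.5, 0.004), (0.5, 0.009), (0.5, 0.015)]:
    x, P = kalman_step(x, P, gyro_rate, accel_angle)
print("Estimated angle [rad]:", float(x[0, 0]))
```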
Procedia PDF Downloads 321
44877 Modeling and Simulation for 3D Eddy Current Testing in Conducting Materials
Authors: S. Bennoud, M. Zergoug
Abstract:
The numerical simulation of electromagnetic interactions is still a challenging problem, especially in problems that result in fully three-dimensional mathematical models. The goal of this work is to use mathematical modeling to characterize the reliability and capability of the eddy current technique to detect and characterize defects embedded in in-service aeronautical parts. The finite element method is used to describe the eddy current technique in a mathematical model by predicting the eddy current interaction with defects. However, this model is an approximation of the full Maxwell equations. In this study, the analysis of the problem is based on a three-dimensional finite element model that computes directly the electromagnetic field distortions due to defects.
Keywords: eddy current, finite element method, non destructive testing, numerical simulations
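For reference, a commonly used quasi-static formulation behind such finite element eddy current models is the magnetic vector potential equation below (a standard textbook form, not necessarily the exact formulation used by the authors):

```latex
% Quasi-static A-formulation of the eddy current problem (standard textbook form)
\nabla \times \left( \frac{1}{\mu} \nabla \times \mathbf{A} \right)
  + \sigma \, \frac{\partial \mathbf{A}}{\partial t} = \mathbf{J}_s ,
\qquad \mathbf{B} = \nabla \times \mathbf{A}
```

Here A is the magnetic vector potential, μ the permeability, σ the conductivity, and J_s the source current density; a local drop in σ at a defect distorts the induced eddy currents and hence the field measured by the probe.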
Procedia PDF Downloads 443
44876 Temperature Susceptibility of Multigrade Bitumen Asphalt and an Approach to Account for Temperature Variation through Deep Pavements
Authors: Brody R. Clark, Chaminda Gallage, John Yeaman
Abstract:
Multigrade bitumen asphalt is a quality asphalt product that is not widely utilised globally. Multigrade bitumen is believed to be less sensitive to temperature, which gives it an advantage over conventional binders. Previous testing has shown that asphalt temperature changes greatly with depth, but currently the industry standard is to nominate a single temperature for design. For detailed design of asphalt roads, asphalt layers could instead be divided into nominal layer depths, with different modulus and fatigue equations/values used to reflect the temperatures of each respective layer. A collation of previous laboratory testing conducted on multigrade bitumen asphalt beams under a range of temperatures and loading conditions was analysed. The samples tested included 0% or 15% recycled asphalt pavement (RAP) to determine what impact the recycled material has on the fatigue life and stiffness of the pavement. This paper investigated the temperature susceptibility of multigrade bitumen asphalt pavements compared to conventional binders by combining previous testing that included a sweep of fatigue tests, the development of complex modulus master curves for each mix, and a study of how pavement temperature changes through pavement depth. This investigation found that the final design of the pavement is greatly affected by the nominated pavement temperature and the respective material properties. This paper outlines a potential revision to the current design approach for asphalt pavements and proposes that further investigation is needed into pavement temperature and its incorporation into design.
Keywords: asphalt, complex modulus, fatigue life, flexural stiffness, four point bending, multigrade bitumen, recycled asphalt pavement
Procedia PDF Downloads 376
44875 Phenomenological Ductile Fracture Criteria Applied to the Cutting Process
Authors: František Šebek, Petr Kubík, Jindřich Petruška, Jiří Hůlka
Abstract:
The present study is aimed at the cutting process of circular cross-section rods, where fracture is used to separate one rod into two pieces. By incorporating the phenomenological ductile fracture model into the explicit formulation of the finite element method, the process can be analyzed without the need to carry out many real experiments, which could be expensive in the case of repetitive testing under different conditions. In the present paper, the steel AISI 1045 was examined: tensile tests of smooth and notched cylindrical bars were conducted together with biaxial testing of notched tube specimens to calibrate the material constants of selected phenomenological ductile fracture models. These were implemented into Abaqus/Explicit through the user subroutine VUMAT and used for cutting process simulation. As the calibration process is based on variables which cannot be obtained directly from experiments, numerical simulations of the fracture tests are an inevitable part of the calibration. Finally, experiments regarding the cutting process were carried out, and the predictive capability of the selected fracture models is discussed. Concluding remarks then summarize the experience gained with both the calibration and the application of particular ductile fracture criteria.
Keywords: ductile fracture, phenomenological criteria, cutting process, explicit formulation, AISI 1045 steel
Procedia PDF Downloads 457
44874 Rest API Based System-level Test Automation for Mobile Applications
Authors: Jisoo Song
Abstract:
Today’s mobile applications communicate with servers more and more in order to access external services or information. Also, server-side code changes are more frequent than client-side code changes in a mobile application. These frequent changes lead to an increase in testing cost. To reduce costs, UI-based test automation can be one of the solutions. It is a common automation technique in system-level testing. However, it can be unsuitable for mobile applications. When you automate tests based on UI elements for mobile applications, there are some limitations, such as the overhead of script maintenance or the difficulty of finding invisible defects that UI elements cannot represent. To overcome these limitations, we present a new automation technique based on REST APIs. You can automate system-level tests through test scripts that you write. These scripts call a series of REST APIs in a user’s action sequence. This technique does not require testers to know the internal implementation details, only the input and expected output of each REST API. You can easily modify test cases by modifying REST API input values and also find problems that might not be evident at the UI level by validating output values. For example, when an application receives price information from a payment server and the user cannot see it at the UI level, REST API based scripts can check whether the price information is correct. More than 10 mobile applications at our company are being tested automatically based on REST API scripts whenever application source code, mostly server source code, is built. We find defects right away by setting a script as a build job in the CI server. The build job starts when application code builds are completed. This presentation will also include field cases from our company.
Keywords: case studies at SK Planet, introduction of rest API based test automation, limitations of UI based test automation
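A minimal example of the kind of REST API based system-level test script described above is sketched below with Python's requests library in pytest style. The endpoint names, payloads, and expected fields are hypothetical; they only illustrate the pattern of calling APIs in a user's action sequence and validating output values (such as price information) that may not be visible at the UI level.

```python
# Sketch of a REST API based system-level test: log in, add an item to the cart,
# and validate price information returned by a (hypothetical) payment endpoint.
import requests

BASE_URL = "https://api.example.com"   # hypothetical server

def test_checkout_price_is_consistent():
    session = requests.Session()

    # Step 1: user logs in (hypothetical endpoint and credentials).
    resp = session.post(f"{BASE_URL}/login", json={"id": "tester", "pw": "secret"})
    assert resp.status_code == 200

    # Step 2: user adds an item to the cart.
    resp = session.post(f"{BASE_URL}/cart/items", json={"item_id": 42, "qty": 1})
    assert resp.status_code == 201

    # Step 3: validate price data that the UI may not display directly.
    resp = session.get(f"{BASE_URL}/cart/price")
    assert resp.status_code == 200
    body = resp.json()
    assert body["currency"] == "USD"
    assert body["total"] == body["unit_price"] * body["quantity"]
```

Such a script can be registered as a build job in the CI server so that it runs whenever the server code is rebuilt, which is the usage pattern the abstract describes.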
Procedia PDF Downloads 448
44873 Student Records Management System Using Smart Cards and Biometric Technology for Educational Institutions
Authors: Patrick O. Bobbie, Prince S. Attrams
Abstract:
In recent times, the rapid change in new technologies has transformed the way records are handled in educational institutions. There is also a need for reliable and easy access to these records, resulting in increased productivity in organizations. In academic institutions, such benefits help in quality assessments, institutional performance, and assessments of teaching and evaluation methods. Students in educational institutions benefit the most when advanced technologies are deployed in accessing records. This research paper discusses the use of biometric technologies coupled with smartcard technologies to provide a unique way of identifying students and matching their data to financial records to grant them access to restricted areas such as examination halls. The system developed in this paper has an identity verification component as part of its main functionality. A systematic software development cycle of analysis, design, coding, testing, and support was used. The system provides a secure way of verifying a student’s identity and real-time verification of financial records. An advanced prototype version of the system has been developed for testing purposes.
Keywords: biometrics, smartcards, identity-verification, fingerprints
Procedia PDF Downloads 419
44872 Using Historical Data for Stock Prediction
Authors: Sofia Stoica
Abstract:
In this paper, we use historical data to predict the stock price of a tech company. To this end, we use a dataset consisting of the stock prices over the past five years of ten major tech companies: Adobe, Amazon, Apple, Facebook, Google, Microsoft, Netflix, Oracle, Salesforce, and Tesla. We experimented with a variety of models (a linear regression model, K-nearest neighbors (KNN), and a sequential neural network) and algorithms (Multiplicative Weight Update and AdaBoost). We found that the sequential neural network performed the best, with a testing error of 0.18%. Interestingly, the linear model performed second best, with a testing error of 0.73%. These results show that using historical data is enough to obtain high accuracy, and a simple algorithm like linear regression achieves performance similar to more sophisticated models while taking less time and fewer resources to implement.
Keywords: finance, machine learning, opening price, stock market
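A minimal version of the linear-regression baseline described above can be sketched by predicting the next closing price from a window of previous closes. This is an illustrative toy on synthetic data, not the authors' dataset or exact feature set; the 0.18% and 0.73% errors quoted above come from their experiments, not from this sketch.

```python
# Toy stock-price prediction: linear regression on lagged closing prices.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(0)
prices = 100 + np.cumsum(rng.normal(0.1, 1.0, size=1250))   # synthetic ~5-year series

window = 5
X = np.array([prices[i:i + window] for i in range(len(prices) - window)])
y = prices[window:]

split = int(0.8 * len(X))                 # chronological train/test split
model = LinearRegression().fit(X[:split], y[:split])
pred = model.predict(X[split:])
print("Test MAPE: {:.2%}".format(mean_absolute_percentage_error(y[split:], pred)))
```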
Procedia PDF Downloads 189
44871 Crafting Robust Business Model Innovation Path with Generative Artificial Intelligence in Start-up SMEs
Authors: Ignitia Motjolopane
Abstract:
Small and medium enterprises (SMEs) play an important role in economies by contributing to economic growth and employment. In the fourth industrial revolution, the convergence of technologies and the changing nature of work created pressures on economies globally. Generative artificial intelligence (AI) may support SMEs in exploring, exploiting, and transforming business models to align with their growth aspirations. SMEs' growth aspirations fall into four categories: subsistence, income, growth, and speculative. Subsistence-oriented firms focus on meeting basic financial obligations and show less motivation for business model innovation. SMEs focused on income, growth, and speculation are more likely to pursue business model innovation to support growth strategies. SMEs' strategic goals link to distinct business model innovation paths depending on whether SMEs are starting a new business, pursuing growth, or seeking profitability. Integrating generative artificial intelligence in start-up SME business model innovation enhances value creation, user-oriented innovation, and SMEs' ability to adapt to dynamic changes in the business environment. The existing literature may lack comprehensive frameworks and guidelines for effectively integrating generative AI in start-up reiterative business model innovation paths. This paper examines the start-up business model innovation path with generative artificial intelligence. A theoretical approach is used to examine start-up-focused SME reiterative business model innovation paths with generative AI, articulating how generative AI may be used to support SMEs in systematically and cyclically building the business model, covering most or all business model components, and in analysing and testing the business model's viability throughout the process. As such, the paper explores generative AI usage in market exploration. Market exploration poses unique challenges for start-ups compared to established companies due to a lack of extensive customer data, sales history, and market knowledge. Furthermore, the paper examines the use of generative AI in developing and testing viable value propositions and business models. In addition, the paper looks into identifying and selecting partners with generative AI support. Selecting the right partners is crucial for start-ups and may significantly impact success. The paper also examines generative AI usage in choosing the right information technology, the funding process, revenue model determination, and stress testing business models. Stress testing business models validates strong and weak points by applying scenarios and evaluating the robustness of individual business model components and the interrelations between components. Stress testing the business model may therefore address these uncertainties, as misalignment between an organisation and its environment has been recognised as the leading cause of company failure. Generative AI may be used to generate business model stress-testing scenarios. The paper is expected to make theoretical and practical contributions to approaches for crafting a robust business model innovation path with generative artificial intelligence in start-up SMEs.
Keywords: business models, innovation, generative AI, small medium enterprises
Procedia PDF Downloads 70
44870 A Comparative Study between FEM and Meshless Methods
Authors: Jay N. Vyas, Sachin Daxini
Abstract:
Numerical simulation techniques are now widely used in product development and testing instead of expensive, time-consuming, and sometimes dangerous laboratory experiments. Numerous numerical methods are available for performing simulations of physical problems in different engineering fields. Grid-based methods, like the Finite Element Method, are extensively used in performing various kinds of static, dynamic, structural, and non-structural analyses during the product development phase. Drawbacks of grid-based methods, in terms of discontinuous secondary field variables and difficulties in dealing with fracture mechanics and large deformation problems, led to the development of a relatively new class of numerical simulation techniques in the last few years, popularly known as Meshless or Meshfree methods. Meshless Methods are expected to be more adaptive and flexible than the Finite Element Method because domain discretization in Meshless Methods requires only nodes. The present paper introduces Meshless Methods and differentiates them from the Finite Element Method in terms of the following aspects: shape functions used, role of the weight function, techniques to impose essential boundary conditions, integration techniques for the discrete system equations, convergence rate, accuracy of solution, and computational effort. Capabilities, benefits, and limitations of Meshless Methods are discussed and conclusions drawn at the end of the paper.
Keywords: numerical simulation, Grid-based methods, Finite Element Method, Meshless Methods
Procedia PDF Downloads 389
44869 Dynamic Amplification Factors of Some City Bridges
Authors: I. Paeglite, A. Paeglitis
Abstract:
The paper presents a study of dynamic effects obtained from dynamic load testing of city highway bridges in Latvia carried out from 2005 to 2012. Nine pre-stressed concrete bridges and four composite bridges were considered. Eleven of the 13 bridges were designed according to the Eurocodes and two according to the previous structural codes used in Latvia (SNIP 2.05.03-84). The dynamic properties of the bridges were obtained by heavy vehicles passing over the bridge roadway at different driving speeds and with or without an even pavement. The obtained values of the dynamic amplification factor (DAF) and the bridge natural frequency were analyzed and compared to the values of the built-in traffic load models provided in Eurocode 1. The actual DAF values for an even bridge deck are in most cases smaller than the value adopted in Eurocode 1. For uneven pavements, vehicle speed significantly influences the dynamic amplification factor values.
Keywords: bridge, dynamic effects, load testing, dynamic amplification factor
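The dynamic amplification factor discussed above is typically computed as the ratio of the maximum dynamic response to the maximum static (crawl-speed) response at the same point, DAF = S_dyn / S_stat. The sketch below illustrates that calculation on synthetic deflection records; it is not the authors' measured data, and the assumed definition should be checked against the paper.

```python
# Dynamic amplification factor (DAF) from static and dynamic deflection records.
import numpy as np

t = np.linspace(0, 5, 2001)
static_deflection = 4.0 * np.exp(-((t - 2.5) ** 2) / 0.5)          # crawl-speed pass [mm]
dynamic_deflection = static_deflection * (1 + 0.12 * np.sin(2 * np.pi * 3.1 * t))

daf = np.max(np.abs(dynamic_deflection)) / np.max(np.abs(static_deflection))
print(f"DAF = {daf:.3f}")
```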
Procedia PDF Downloads 383
44868 Simulation and Experimentation Investigation of Infrared Non-Destructive Testing on Thermal Insulation Material
Authors: Bi Yan-Qiang, Shang Yonghong, Lin Boying, Ji Xinyan, Li Xiyuan
Abstract:
Heat-resistant materials have important applications in the aerospace field. The reliability of the connection between the heat-resisting material and the body determines the success or failure of the project. In this paper, lock-in infrared thermography non-destructive testing technology is used to assess the stability of the heat-resistant structure. The phase relationship between the temperature and the heat flow is calculated by a numerical method, and the influence of the heating frequency and power is obtained. The correctness of the analysis is verified experimentally. This research provides a basis for setting the heat flux parameters, including frequency and power, improving the efficiency of detection and the reliability of the connection between the heat-resisting material and the body.
Keywords: infrared non-destructive, thermal insulation material, reliability, connection
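The phase relationship exploited in lock-in thermography is commonly extracted by correlating each pixel's temperature signal with the modulation frequency (equivalently, taking the discrete Fourier component at that frequency). The sketch below shows this per-pixel phase extraction on a synthetic signal; it is a generic illustration, not the authors' numerical model, and the chosen frequencies are assumptions.

```python
# Lock-in phase extraction: correlate the temperature signal with sin/cos at the
# modulation frequency and take the phase of the resulting Fourier component.
import numpy as np

fs = 50.0          # sampling rate [Hz]
f_mod = 0.5        # lock-in (modulation) frequency [Hz]
t = np.arange(0, 20, 1 / fs)   # integer number of modulation periods

# Synthetic pixel temperature: modulated component with an unknown phase lag + noise.
true_phase = np.deg2rad(35.0)
signal = 2.0 + 0.4 * np.sin(2 * np.pi * f_mod * t - true_phase) + 0.05 * np.random.randn(t.size)

s = np.sum(signal * np.sin(2 * np.pi * f_mod * t))
c = np.sum(signal * np.cos(2 * np.pi * f_mod * t))
phase = np.arctan2(-c, s)       # phase lag of the pixel relative to the excitation
print("Estimated phase lag [deg]:", np.degrees(phase))
```

A subsurface anomaly (e.g., a debonded connection) changes the local thermal wave propagation and shows up as a phase contrast relative to sound regions, which is why the phase image is the quantity of interest.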
Procedia PDF Downloads 385
44867 A Communication Signal Recognition Algorithm Based on Holder Coefficient Characteristics
Authors: Hui Zhang, Ye Tian, Fang Ye, Ziming Guo
Abstract:
Communication signal modulation recognition technology is one of the key technologies in the field of modern information warfare. At present, automatic modulation recognition methods for communication signals are mainly divided into two major categories: one is the maximum likelihood hypothesis testing method based on decision theory; the other is the statistical pattern recognition method based on feature extraction. The most commonly used is the statistical pattern recognition method, which includes feature extraction and classifier design. With the increasingly complex electromagnetic environment of communications, how to effectively extract the features of various signals at a low signal-to-noise ratio (SNR) is a hot topic for scholars in various countries. To solve this problem, this paper proposes a feature extraction algorithm for communication signals based on an improved Holder cloud feature. An extreme learning machine (ELM), which addresses the real-time requirements of modern warfare, is used to classify the extracted features. The algorithm extracts the digital features of the improved cloud model without deterministic information in a low-SNR environment and uses the improved cloud model to obtain more stable Holder cloud features, so the performance of the algorithm is improved. This algorithm addresses the problem that a simple feature extraction algorithm based on the Holder coefficient feature performs poorly at low SNR, and it also achieves better recognition accuracy. Simulation results show that the approach in this paper still gives good classification results at low SNR; even when the SNR is -15 dB, the recognition accuracy still reaches 76%.
Keywords: communication signal, feature extraction, Holder coefficient, improved cloud model
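For orientation, one common form of the Holder coefficient feature used in modulation recognition is derived from Holder's inequality: H = Σ f·g / ((Σ f^p)^(1/p) · (Σ g^q)^(1/q)) with 1/p + 1/q = 1, where f is the normalized magnitude spectrum of the received signal and g is a reference sequence. The sketch below is only an assumed illustration of that formula; the reference sequence, exponent, and the authors' improved cloud model are not reproduced here.

```python
# One assumed form of the Holder coefficient feature (from Holder's inequality).
import numpy as np

def holder_coefficient(f, g, p=2.0):
    q = p / (p - 1.0)
    num = np.sum(f * g)
    den = (np.sum(f ** p) ** (1 / p)) * (np.sum(g ** q) ** (1 / q))
    return num / den

# Toy example: magnitude spectrum of a noisy two-tone signal vs. a rectangular reference.
fs, n = 1000, 1024
t = np.arange(n) / fs
sig = np.cos(2 * np.pi * 100 * t) + np.cos(2 * np.pi * 220 * t) + 0.5 * np.random.randn(n)
spectrum = np.abs(np.fft.rfft(sig))
spectrum /= spectrum.max()

reference = np.ones_like(spectrum)          # assumed rectangular reference sequence
print("Holder coefficient:", holder_coefficient(spectrum, reference))
```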
Procedia PDF Downloads 155
44866 Analysis of the Annual Proficiency Testing Procedure for Intermediate Reference Laboratories Conducted by the National Reference Laboratory from 2013 to 2017
Authors: Reena K., Mamatha H. G., Somshekarayya, P. Kumar
Abstract:
Objectives: The annual proficiency testing of intermediate reference laboratories is conducted by the National Reference Laboratory (NRL) to assess the efficiency of the laboratories in correctly identifying Mycobacterium tuberculosis and determining its drug susceptibility pattern. The proficiency testing results from 2013 to 2017 were analyzed to determine which laboratories were consistent in reporting quality results and which had difficulty in doing so. Methods: A panel of twenty cultures was sent out to each of these laboratories. The laboratories were expected to grow the cultures in their own laboratories, set up drug susceptibility testing by all the methods they were certified for, and report the results within the stipulated time period. The turnaround time for reporting results, specificity, sensitivity, positive and negative predictive values, and the efficiency of the laboratory in identifying the cultures were analyzed. Results: Most of the laboratories reported their results within the stipulated time period. However, there were enormous delays in reporting results from a few of the laboratories. This was mainly due to improper functioning of the biosafety level III laboratory. Only 40% of the laboratories had 100% efficiency in solid culture using Lowenstein-Jensen medium. This was expected, as solid culture and drug susceptibility testing are not used for diagnosing drug resistance. Rapid molecular methods such as the line probe assay and GeneXpert are used to determine drug resistance. Automated liquid culture systems such as the Mycobacterial Growth Indicator Tube are used to determine the prognosis of the patient while on treatment. It was observed that 90% of the laboratories achieved 100% efficiency in the liquid culture method. Almost all laboratories achieved 100% efficiency in the line probe assay method, which is the method of choice for detecting drug-resistant tuberculosis. Conclusion: Since the liquid culture and line probe assay technologies are routinely used for the detection of drug-resistant tuberculosis, the laboratories exhibited a higher level of efficiency compared to solid culture and drug susceptibility testing, which are rarely used. The infrastructure of the laboratory should be maintained properly so that samples can be processed safely and results can be reported on time.
Keywords: annual proficiency testing, drug susceptibility testing, intermediate reference laboratory, national reference laboratory
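The performance indicators analyzed above (sensitivity, specificity, positive and negative predictive values, and efficiency) can be computed directly from a laboratory's panel results. The sketch below shows the standard formulas on hypothetical counts; the numbers are not taken from the study.

```python
# Standard panel-result metrics from true/false positive and negative counts.
def panel_metrics(tp, fp, tn, fn):
    return {
        "sensitivity": tp / (tp + fn),                   # true positives detected
        "specificity": tn / (tn + fp),                   # true negatives detected
        "ppv": tp / (tp + fp),                           # positive predictive value
        "npv": tn / (tn + fn),                           # negative predictive value
        "efficiency": (tp + tn) / (tp + fp + tn + fn),   # overall agreement with the panel
    }

# Hypothetical example: a 20-culture panel with 12 resistant and 8 susceptible isolates.
print(panel_metrics(tp=11, fp=1, tn=7, fn=1))
```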
Procedia PDF Downloads 181
44865 Conceptual Design of Experimental Helium Cooling Loop for Indian TBM R&D Experiments
Authors: B. K. Yadav, A. Gandhi, A. K. Verma, T. S. Rao, A. Saraswat, E. R. Kumar, M. Sarkar, K. N. Vyas
Abstract:
This paper deals with the conceptual design of the Experimental Helium Cooling Loop (EHCL) for the Indian Test Blanket Module (TBM) and its related thermal hydraulic experiments. The Indian TBM team is developing the Lead Lithium cooled Ceramic Breeder (IN-LLCB) TBM to be tested in ITER. The TBM box structure is cooled by high-pressure (8 MPa) and high-temperature (300-500 °C) helium gas. The first wall of the TBM has a complex channel geometry with several parallel channels carrying helium gas for efficient heat extraction. Several mock-ups of these channels need to be tested before finalizing the TBM first wall design and fabrication. Besides the individual testing of such breeding blanket mock-ups, the testing of the Pb-Li to helium heat exchanger, operational experience with the helium loop, and an understanding of the behaviour of high-pressure and high-temperature system components are essential for the final development of the Helium Cooling System for the LLCB TBM in ITER. The main requirements and characteristics of the EHCL and its conceptual design are presented in this paper.
Keywords: DEMO, EHCL, ITER, LLCB TBM
Procedia PDF Downloads 383
44864 Non-Destructive Testing of Carbon Fiber Reinforced Plastic by Infrared Thermography Methods
Authors: W. Swiderski
Abstract:
Composite materials are one answer to the growing demand for materials with better construction and service parameters. Composite materials also permit the conscious shaping of desirable properties to an extent beyond the reach of metals, ceramics, or polymers. In recent years, composite materials have been used widely in aerospace, energy, transportation, medicine, etc. Fiber-reinforced composites, including carbon fiber, glass fiber, and aramid fiber, have become major structural materials. The typical defect during manufacture and operation is delamination damage of layered composites. When delamination damage of the composites spreads, it may lead to composite fracture. One of the many methods used in the non-destructive testing of composites is active infrared thermography. In active thermography, it is necessary to deliver energy to the examined sample in order to obtain significant temperature differences indicating the presence of subsurface anomalies. To detect possible defects in composite materials, different methods of thermal stimulation can be applied to the tested material; these include heating lamps, lasers, eddy currents, microwaves, or ultrasounds. The use of a suitable source of thermal stimulation on the test material can have a decisive influence on whether defects are detected. Samples of multilayer carbon composites were prepared with deliberately introduced defects for comparative purposes. Very thin defects of different sizes and shapes, made of Teflon or copper and having a thickness of 0.1 mm, were screened. Non-destructive testing was carried out using the following sources of thermal stimulation: heating lamp, flash lamp, ultrasound, and eddy currents. The results are reported in the paper.
Keywords: non-destructive testing, IR thermography, composite material, thermal stimulation
Procedia PDF Downloads 259
44863 Three Dimensional Analysis of Cubesat Thermal Vacuum Test
Authors: Maged Assem Soliman Mossallam
Abstract:
The target of thermal vacuum testing is to qualify the space system and ensure its operability under the harsh space environment. The functionality of the cubesat was checked at extreme orbit conditions. The test was performed for operational and non-operational modes. Analysis is done to simulate the cubesat thermal cycling inside the thermal vacuum chamber. The Comsol Multiphysics finite element package is used to solve the three-dimensional problem for the cubesat inside the TVAC. The three-dimensional CAD model was built using the Autodesk Inventor program. The boundary conditions were applied from the actual shroud temperature. The input heat load variation with time is considered to solve the transient three-dimensional problem. Results show that the simulated temperature profiles are within an acceptable range of the real testing data.
Keywords: cubesat, thermal vacuum test, testing simulation, finite element analysis
Procedia PDF Downloads 151
44862 Tree-Based Inference for Regionalization: A Comparative Study of Global Topological Perturbation Methods
Authors: Orhun Aydin, Mark V. Janikas, Rodrigo Alves, Renato Assuncao
Abstract:
In this paper, a tree-based perturbation methodology for regionalization inference is presented. Regionalization is a constrained optimization problem that aims to create groups with similar attributes while satisfying spatial contiguity constraints. As in any constrained optimization problem, the spatial constraint may hinder convergence to a global minimum, resulting in spatially contiguous members of a group with dissimilar attributes. This paper presents a general methodology for rigorously perturbing spatial constraints through the use of random spanning trees. The general framework presented can be used to quantify the effect of the spatial constraints on the overall regionalization result. We compare several types of stochastic spanning trees used in inference problems such as fuzzy regionalization and determining the number of regions. The performance of stochastic spanning trees is juxtaposed against the traditional permutation-based hypothesis testing frequently used in spatial statistics. Inference results for fuzzy regionalization and determining the number of regions are presented on the Local Area Personal Incomes for Texas Counties provided by the Bureau of Economic Analysis.
Keywords: regionalization, constrained clustering, probabilistic inference, fuzzy clustering
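A simple way to generate random spanning trees for perturbing the spatial constraint is to assign random weights to the edges of the contiguity graph and take a minimum spanning tree; note that this is not uniform over all spanning trees (Wilson's algorithm would be), but it illustrates the idea. The sketch below is a generic illustration on a grid-shaped contiguity graph, not the authors' code.

```python
# Random spanning tree of a contiguity graph via random edge weights + minimum spanning tree.
import random
import networkx as nx

contiguity = nx.grid_2d_graph(5, 5)          # stand-in for a county adjacency graph

def random_spanning_tree(graph, seed=None):
    rng = random.Random(seed)
    for _, _, data in graph.edges(data=True):
        data["weight"] = rng.random()        # fresh random weights each draw
    return nx.minimum_spanning_tree(graph, weight="weight")

tree = random_spanning_tree(contiguity, seed=42)
print(tree.number_of_nodes(), "nodes,", tree.number_of_edges(), "edges")
```

Repeating the draw and re-running the regionalization on each tree gives the distribution over partitions that the perturbation-based inference relies on.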
Procedia PDF Downloads 228
44861 Faculty Attendance Management System (FAMS)
Authors: G. C. Almiranez, J. Mercado, L. U. Aumentado, J. M. Mahaguay, J. P. Cruz, M. L. Saballe
Abstract:
This research project focused on the development of an application that aids university administrators in establishing an efficient and effective system for managing faculty attendance and discouraging unnecessary absences. The Faculty Attendance Management System (FAMS) is a web-based and mobile application which has proven to be efficient and effective in handling and recording data and in generating the updated reports and analytics needed in managing faculty attendance. FAMS facilitates not only a convenient and faster way of gathering and recording data, but also provides data analytics, an immediate feedback mechanism, and analysis. The software database architecture uses MySQL for the web-based application and SQLite for the mobile application. The system includes modules that capture the daily attendance of faculty members, generate faculty attendance reports and analytics, notify faculty members, the chairperson, and the dean regarding absences, and provide immediate communication concerning the absences incurred. Quantitative and qualitative evaluation showed that the system satisfactorily meets the stakeholders' requirements. Functionality, usability, reliability, performance, and security all turned out to be above average. System testing, integration testing, and user acceptance testing were conducted. Results showed that the system performed very satisfactorily and functions as designed. The performance of the system is also affected by the Internet infrastructure or connectivity of the university. The faculty analytics generated by the system may be used not only by deans and chairpersons in their evaluation of faculty performance but also by individual faculty to increase awareness of their attendance in class. Hence, the system facilitates effective communication between system stakeholders through the FAMS feedback mechanism and up-to-date posting of information.
Keywords: faculty attendance management system, MySQL, SQLite, FAMS, analytics
Procedia PDF Downloads 436
44860 Experimental Exploration of Recycled Materials for Potential Application in Interior Design
Authors: E. P. Bhowmik, R. Singh
Abstract:
Certain materials casually thrown away as by-product household waste, such as used tea leaves, used coffee remnants, eggshells, peanut husks, coconut coir, unwanted paper, and pencil shavings, have hidden potential as recyclable raw ingredients. This paper aims to explore and experiment with the sustainable potential of such disposed wastes, obtained from domestic and commercial sources, that could otherwise contribute to the field of interior design if mass-collected and repurposed. Research has been conducted on available recorded methods of mass collection, storage, and processing of such materials by certain brands, designers, and researchers, as well as on the various applications and angles possible with regard to re-usage. A questionnaire survey was carried out to understand the willingness of different demographics to support mass collection and their openness to such unconventional materials for interiors. An experiment was also conducted where the selected waste ingredients were used to create small samples that could be used as decorative panels. Comparisons were made for properties like color, smell, texture, relative durability, and weight, and accordingly, applications were suggested. The experiment therefore helped to propose recycled household waste as a potential surface finish for floors, walls, and ceilings, and even as a founding material for furniture and decor accessories such as pottery and lamp shades, for non-structural application in both residential and commercial interiors. Common by-product wastes often end up in landfills, as laymen unaware of their sustainable possibilities dispose of them. However, processing these waste materials and repurposing them by incorporating them into interiors would serve as a sustainable alternative to ethical dilemmas in the construction of interior design/architecture elements.
Keywords: interior materials, mass-collection, sustainable, waste recycle
Procedia PDF Downloads 104
44859 Controlled Shock Response Spectrum Test on Spacecraft Subsystem Using Electrodynamic Shaker
Authors: M. Madheswaran, A. R. Prashant, S. Ramakrishna, V. Ramesh Naidu, P. Govindan, P. Aravindakshan
Abstract:
Shock response spectrum (SRS) tests are among the tests conducted on some critical systems of a spacecraft as part of environmental testing. SRS tests are conducted to simulate the pyro shocks that occur during launch phases as well as during the deployment of spacecraft appendages. Some of the methods to carry out SRS tests are the pyrotechnic method, the impact hammer method, the drop shock method, and the use of electrodynamic shakers. The pyrotechnic, impact hammer, and drop shock methods are open-loop tests, whereas SRS testing using an electrodynamic shaker is a controlled, closed-loop test. SRS testing using an electrodynamic shaker offers various advantages, such as a simple test setup, better controllability, and repeatability. However, it is important to devise a proper test methodology so that the safety of the electrodynamic shaker and that of the test specimen are not compromised. This paper discusses the challenges involved in conducting SRS tests, shaker validation, and the necessary precautions to be considered. The approach involved in choosing various test parameters, such as the synthesis waveform and spectrum convergence level, is discussed. A case study of an SRS test conducted on an optical payload of an Indian geostationary spacecraft is presented.
Keywords: maxi-max spectrum, SRS (shock response spectrum), SDOF (single degree of freedom), wavelet synthesis
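The maxi-max SRS referred to above is computed by passing the base acceleration through a bank of single-degree-of-freedom systems and recording the peak absolute response at each natural frequency. The sketch below illustrates that calculation for a synthetic half-sine pulse using a standard SDOF base-excitation transfer function; it is not the authors' control or wavelet-synthesis software, and the pulse and Q value are assumptions.

```python
# Maxi-max shock response spectrum (SRS): peak absolute SDOF acceleration response
# over a bank of natural frequencies, for a synthetic half-sine base acceleration.
import numpy as np
from scipy import signal

fs = 20000.0
t = np.arange(0, 0.05, 1 / fs)
pulse = np.where(t < 0.002, 100.0 * np.sin(np.pi * t / 0.002), 0.0)   # 100 g, 2 ms half-sine

Q = 10.0                          # conventional SRS quality factor
zeta = 1.0 / (2.0 * Q)
freqs = np.logspace(1, 3.3, 60)   # 10 Hz to ~2 kHz

srs = []
for fn in freqs:
    wn = 2 * np.pi * fn
    # Absolute-acceleration response of a base-excited SDOF system.
    system = signal.TransferFunction([2 * zeta * wn, wn ** 2], [1, 2 * zeta * wn, wn ** 2])
    _, response, _ = signal.lsim(system, pulse, t)
    srs.append(np.max(np.abs(response)))        # maxi-max value at this frequency

print("Peak SRS value [g]:", max(srs))
```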
Procedia PDF Downloads 359
44858 Cancellation of Transducer Effects from Frequency Response Functions: Experimental Case Study on the Steel Plate
Authors: P. Zamani, A. Taleshi Anbouhi, M. R. Ashory, S. Mohajerzadeh, M. M. Khatibi
Abstract:
Modal analysis is a developing science in the experimental evaluation of the dynamic properties of structures. Mechanical devices such as accelerometers are one of the sources of reduced quality in measured modal testing parameters. In this paper, eliminating the accelerometer’s mass effect on the frequency response of the structure is studied. A strategy based on sensitivity analysis is used for eliminating the mass effect. In this method, the amount of mass change and the location at which to measure the structure’s response are chosen so as to give the least error in the frequency correction. Experimental modal testing is carried out on a steel plate, and the effect of the accelerometer’s mass is removed using this strategy. Finally, good agreement is achieved between numerical and experimental results.
Keywords: accelerometer mass, frequency response function, modal analysis, sensitivity analysis
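For context, the classical driving-point correction for a transducer of mass m_a removes the accelerometer mass from a measured accelerance FRF as A_true(ω) = A_meas(ω) / (1 - m_a·A_meas(ω)). This is a standard textbook result, not the sensitivity-based strategy developed in the paper; the sketch below applies it to a synthetic single-mode FRF where the exact answer is known.

```python
# Classical driving-point mass cancellation for an accelerance FRF:
# A_true = A_meas / (1 - m_a * A_meas). Synthetic single-DOF example.
import numpy as np

m, k, c = 2.0, 8.0e5, 20.0           # structure mass [kg], stiffness [N/m], damping [N.s/m]
m_a = 0.05                           # accelerometer mass [kg]

w = 2 * np.pi * np.linspace(1, 200, 2000)
# "Measured" accelerance: the accelerometer mass loads the structure.
A_meas = -w**2 / (k - (m + m_a) * w**2 + 1j * c * w)

# Remove the transducer mass effect.
A_corr = A_meas / (1 - m_a * A_meas)

# The corrected FRF should match the accelerance of the unloaded structure.
A_true = -w**2 / (k - m * w**2 + 1j * c * w)
print("Max relative error after correction:",
      np.max(np.abs(A_corr - A_true) / np.abs(A_true)))
```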
Procedia PDF Downloads 446
44857 Testing the Impact of Landmarks on Navigation through the Use of Mobile-Based Games
Authors: Demet Yesiltepe, Ruth Dalton, Ayse Ozbil
Abstract:
The aim of this paper is to understand the effect of landmarks on spatial navigation. For this study, a mobile-based virtual game, 'Sea Hero Quest' (SHQ), was used. At the beginning of the game, participants were asked to look at maps which included the specific locations of players and checkpoints. After the map disappeared, participants were asked to navigate a boat and find the checkpoints in a pre-given order. By analyzing these data, we aim to better understand the effect of an important component of cities, namely landmarks, on spatial navigation. Game levels were analyzed spatially, and axial-based integration, choice, and connectivity values of the levels were calculated to make comparisons. To make this kind of comparison, we focused on levels which include both local and global landmarks and levels which include only local landmarks. The most significant contribution of this study to the urban design and planning fields is that it provides mounting evidence about the utility of landmarks and their roles in cities, given that the game was played by more than 2.5 million people. Moreover, by using these results, it may be possible to encourage cities with more global and local landmarks to have more identifiable/readable areas.
Keywords: landmarks, mobile-based games, spatial navigation, virtual environment
Procedia PDF Downloads 368
44856 Application of Computational Flow Dynamics (CFD) Analysis for Surge Inception and Propagation for Low Head Hydropower Projects
Authors: M. Mohsin Munir, Taimoor Ahmad, Javed Munir, Usman Rashid
Abstract:
Determination of the maximum elevation of a flowing fluid due to sudden rejection of load in a hydropower facility is of great interest to hydraulic engineers in ensuring the safety of the hydraulic structures. Several mathematical models exist that employ one-dimensional modeling for the determination of surge, but none of these perfectly simulates real-time circumstances. This paper envisages the investigation of surge inception and propagation for a low head hydropower project using Computational Fluid Dynamics (CFD) analysis in the FLOW-3D software package. The fluid dynamics model analyses surge by employing the Reynolds-Averaged Navier-Stokes Equations (RANSE). The CFD model is designed for a case study at the Taunsa hydropower project in Pakistan. Various scenarios were run through the model, keeping in view the upstream boundary conditions. The prototype results were then compared with the results of physical model testing for the same scenarios. The results of the numerical model showed close agreement with the physical model testing, offer insight into phenomena which are not apparent in the physical model, and the approach shall be adopted in the future for similar low head projects, limiting the delays and costs incurred in physical model testing.
Keywords: surge, FLOW-3D, numerical model, Taunsa, RANSE
Procedia PDF Downloads 359
44855 Correlation Between Hydrogen Charging and Charpy Impact of 4340 Steel
Authors: J. Alcisto, M. Papakyriakou, J. Guerra, A. Dominguez, M. Miller, J. Foyos, E. Jones, N. Ula, M. Hahn, L. Zeng, Y. Li, O. S. Es-Said
Abstract:
Current methods of testing for hydrogen charging are slow and time consuming. The objective of this paper was to determine whether hydrogen charging can be detected quantitatively through the use of Charpy impact (CI) testing. CI is a much faster and simpler process than current methods for detecting hydrogen charging. Steel plates were electro-discharge machined (EDM) into ninety-six 4340 steel CI samples and forty-eight tensile bars. All the samples were heat treated at 900°C to form austenite and then rapidly quenched in water to form martensite. The samples were tempered to eight different target strengths at the corresponding target temperatures (145, 160, 170, 180, 190, 205, 220, and 250 KSI, thousands of pounds per square inch) / (1100, 1013, 956, 898, 840, 754, 667, 494 degrees Celsius). After a tedious process of grinding and machining V-notches into the Charpy samples, they were divided into four groups. One group was kept as the as-received baseline for comparison, while the other three groups were sent to Alcoa (Fasteners) Inc. in Torrance to be cadmium coated. The three groups were coated with three thicknesses (2, 3, and 5 mils); that is, the samples were charged with ascending hydrogen levels. The samples were CI tested and tensile tested, and the data were tabulated and compared to the baseline group of uncharged samples of the same material. The results of this study were successful and indicated that CI testing was able to quantitatively detect hydrogen charging.
Keywords: Charpy impact toughness, hydrogen charging, 4340 steel, Electro Discharge Machined (EDM)
Procedia PDF Downloads 298
44854 AI-Driven Strategies for Sustainable Electronics Repair: A Case Study in Energy Efficiency
Authors: Badiy Elmabrouk, Abdelhamid Boujarif, Zhiguo Zeng, Stephane Borrel, Robert Heidsieck
Abstract:
In an era where sustainability is paramount, this paper introduces a machine learning-driven testing protocol to accurately predict diode failures, merging reliability engineering with failure physics to enhance the efficiency of repair operations. Our approach refines the burn-in process, significantly curtailing its duration, which not only conserves energy but also elevates productivity and mitigates component wear. A case study from GE HealthCare’s repair center demonstrates the method’s effectiveness, recording accurate prediction of diode failures and a substantial decrease in energy consumption that translates to an annual reduction of 6.5 tons of CO2 emissions. This advancement sets a benchmark for environmentally conscious practices in the electronics repair sector.
Keywords: maintenance, burn-in, failure physics, reliability testing
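The failure-prediction step described above can be illustrated with a generic classifier that flags diodes likely to fail so that only those units receive a full burn-in. The features, data, and model below are hypothetical and do not reflect GE HealthCare's actual protocol or measurements; the sketch only shows the pattern of combining early test measurements with a supervised model.

```python
# Illustrative diode failure prediction: train a classifier on early test measurements
# and route only units predicted to be at risk through the full burn-in cycle.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(7)
n = 2000
# Hypothetical features: forward voltage drop, reverse leakage current, thermal drift.
X = rng.normal(size=(n, 3))
fail = (0.8 * X[:, 1] + 0.6 * X[:, 2] + rng.normal(0, 0.5, n)) > 1.2   # synthetic labels

X_train, X_test, y_train, y_test = train_test_split(X, fail, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))

# Units predicted healthy could go through a shortened burn-in, saving energy.
needs_full_burn_in = clf.predict(X_test)
print("Fraction routed to full burn-in:", needs_full_burn_in.mean())
```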
Procedia PDF Downloads 68
44853 Data Recording for Remote Monitoring of Autonomous Vehicles
Authors: Rong-Terng Juang
Abstract:
Autonomous vehicles offer the possibility of significant benefits to social welfare. However, fully automated cars might not arrive in the near future. To speed the adoption of self-driving technologies, many governments worldwide are passing laws requiring data recorders for the testing of autonomous vehicles. Currently, a self-driving vehicle (e.g., a shuttle bus) has to be monitored from a remote control center. When an autonomous vehicle encounters an unexpected driving environment, such as road construction or an obstruction, it should request assistance from a remote operator. Nevertheless, large amounts of data, including images, radar and lidar data, etc., have to be transmitted from the vehicle to the remote center. Therefore, this paper proposes a data compression method for in-vehicle networks for the remote monitoring of autonomous vehicles. Firstly, the time-series data are rearranged into a multi-dimensional signal space. Upon arrival, for controller area network (CAN) data, the new data are mapped onto a time-data two-dimensional space associated with the specific CAN identity. Secondly, the data are sampled based on differential sampling. Finally, the whole set of data is encoded using existing algorithms such as Huffman, arithmetic, and codebook encoding methods. To evaluate system performance, the proposed method was deployed on an in-house built autonomous vehicle. The testing results show that the amount of data can be reduced to as little as 1/7 of the raw data.
Keywords: autonomous vehicle, data compression, remote monitoring, controller area networks (CAN), Lidar
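The differential-sampling and entropy-coding steps described above can be sketched for a single CAN signal: consecutive samples are delta-encoded (small differences dominate), and the resulting symbols are Huffman-coded. This is a generic illustration of the idea, not the authors' in-vehicle implementation; the 16-bit raw sample size is an assumption.

```python
# Delta encoding followed by Huffman coding of a single CAN signal (generic sketch).
import heapq
from collections import Counter
import numpy as np

rng = np.random.default_rng(3)
signal = np.cumsum(rng.integers(-2, 3, size=1000)) + 500     # slowly varying CAN value
deltas = np.diff(signal, prepend=signal[0])                  # differential sampling

def huffman_code_lengths(symbols):
    """Return {symbol: code length in bits} for a Huffman code over the symbols."""
    heap = [[count, i, {sym: 0}] for i, (sym, count) in enumerate(Counter(symbols).items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**lo[2], **hi[2]}.items()}   # deepen both subtrees
        heapq.heappush(heap, [lo[0] + hi[0], tie, merged])
        tie += 1
    return heap[0][2]

lengths = huffman_code_lengths(deltas.tolist())
compressed_bits = sum(lengths[s] for s in deltas.tolist())
raw_bits = 16 * len(signal)                                  # assume 16-bit raw samples
print("Compression ratio:", raw_bits / compressed_bits)
```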
Procedia PDF Downloads 163
44852 Metropolis-Hastings Sampling Approach for High Dimensional Testing Methods of Autonomous Vehicles
Authors: Nacer Eddine Chelbi, Ayet Bagane, Annie Saleh, Claude Sauvageau, Denis Gingras
Abstract:
As recently stated by the National Highway Traffic Safety Administration (NHTSA), to demonstrate the expected performance of a highly automated vehicle system, test approaches should include a combination of simulation, test track, and on-road testing. In this paper, we propose a new validation method for autonomous vehicles involving on-road tests (Field Operational Tests), test track (Test Matrix), and simulation (Worst Case Scenarios). We concentrate our discussion on the simulation aspects; in particular, we extend recent work based on Importance Sampling by using a Metropolis-Hastings algorithm (MHS) to sample data collected from the Safety Pilot Model Deployment (SPMD) in lane-change scenarios. Our proposed MH sampling method will be compared to the Importance Sampling method, which does not perform well in high-dimensional problems. The importance of this study is to obtain a sampler that can be applied to high-dimensional simulation problems in order to reduce and optimize the number of test scenarios necessary for the validation and certification of autonomous vehicles.
Keywords: automated driving, autonomous emergency braking (AEB), autonomous vehicles, certification, evaluation, importance sampling, metropolis-hastings sampling, tests
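A minimal Metropolis-Hastings sampler of the kind proposed above can be written in a few lines: given an (unnormalized) target density over scenario parameters, for example one concentrated on short time-to-collision lane-change events, a random-walk proposal x' is accepted with probability min(1, p(x')/p(x)). The target density, proposal width, and two-parameter scenario space below are illustrative assumptions, not the SPMD-derived distributions used in the paper.

```python
# Random-walk Metropolis-Hastings over a 2-D lane-change scenario space
# (relative speed [m/s], initial gap [m]); the target density is purely illustrative.
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    """Unnormalized log-density favouring critical scenarios (small gap, high closing speed)."""
    rel_speed, gap = x
    if gap <= 0.5 or rel_speed <= 0:
        return -np.inf
    ttc = gap / rel_speed                       # time-to-collision surrogate
    return -0.5 * ((ttc - 1.5) / 0.5) ** 2 - 0.01 * gap

def metropolis_hastings(n_samples, x0, step=np.array([0.5, 2.0])):
    x = np.array(x0, dtype=float)
    samples, accepted = [], 0
    for _ in range(n_samples):
        proposal = x + step * rng.normal(size=x.shape)
        log_ratio = log_target(proposal) - log_target(x)
        if np.log(rng.random()) < log_ratio:    # accept with probability min(1, ratio)
            x, accepted = proposal, accepted + 1
        samples.append(x.copy())
    return np.array(samples), accepted / n_samples

samples, rate = metropolis_hastings(20000, x0=[5.0, 10.0])
print("Acceptance rate:", rate)
print("Mean sampled scenario (rel. speed, gap):", samples[5000:].mean(axis=0))
```

The retained samples (after discarding an initial burn-in segment) can then be replayed in simulation as the reduced, risk-weighted set of test scenarios.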
Procedia PDF Downloads 288