Search results for: statistical tool
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8650

8440 Feasibility Study of Friction Stir Welding Application for Kevlar Material

Authors: Ahmet Taşan, Süha Tirkeş, Yavuz Öztürk, Zafer Bingül

Abstract:

Friction stir welding (FSW) is a solid-state joining process, which eliminates problems associated with material melting and solidification, such as cracks, residual stresses and distortions generated during conventional welding. Among the most important advantages of FSW are easy automation, less distortion, lower residual stress and good mechanical properties in the joining region. FSW is a recent approach to metal joining and, although originally intended for aluminum alloys, it is being investigated for a variety of metallic materials. The basic concept of FSW is a rotating tool, made of non-consumable material, specially designed with a geometry consisting of a pin and a recess (shoulder). This tool, spinning on its axis, is inserted at the adjoining edges of the two sheets or plates to be joined and then travels along the joint line. The tool rotation axis defines an angle of inclination with respect to the components to be welded. This angle serves to receive the material to be processed at the tool base and to promote the gradual forging effect imposed by the shoulder during the passage of the tool. This prevents plastic flow of material at the sides of the tool, ensuring weld closure behind the pin. In this study, two 4 mm Kevlar® plates, produced from Kevlar® fabrics, are analyzed with COMSOL Multiphysics in order to investigate their weldability via FSW. Thereafter, experimental investigations are carried out on an appropriate workbench in order to compare the results with those of the analysis.
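The heat generation named in the keywords can be roughed out analytically. As a minimal sketch (not the paper's COMSOL model): for a shoulder of radius R pressed with uniform contact pressure p, the friction torque is M = (2/3)·mu·p·pi·R^3, and the heat input is Q = omega·M. All numerical values below are illustrative assumptions, not the study's parameters.

```python
import math

def fsw_heat_w(mu, pressure_pa, radius_m, rpm):
    """Shoulder-only frictional heating estimate: with uniform contact
    pressure p, friction torque M = (2/3)*mu*p*pi*R^3, heat Q = omega*M."""
    omega = 2 * math.pi * rpm / 60.0            # spindle speed in rad/s
    torque = (2.0 / 3.0) * mu * pressure_pa * math.pi * radius_m ** 3
    return omega * torque

# Illustrative numbers: mu = 0.4, 30 MPa contact pressure,
# 9 mm shoulder radius, 1000 rpm -> roughly 1.9 kW of frictional heat.
print(round(fsw_heat_w(0.4, 30e6, 0.009, 1000)))
```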

Keywords: analytical modeling, composite materials welding, friction stir welding, heat generation

Procedia PDF Downloads 159
8439 Ontology as Knowledge Capture Tool in Organizations: A Literature Review

Authors: Maria Margaretha, Dana Indra Sensuse, Lukman

Abstract:

Knowledge capture is a step in the knowledge life cycle through which an organization acquires knowledge. Tacit and explicit knowledge need to be organized in a structured way so that the organization can easily choose which knowledge to use. There are many challenges to capturing knowledge in an organization: the researcher must know which knowledge has been validated by an expert, how to elicit tacit knowledge from experts and make it explicit, and so on. In addition, technology can be a reliable tool to help the researcher capture knowledge. Several papers describe how ontology in knowledge management can be used in proposed frameworks to capture and reuse knowledge. An organization has to manage its knowledge; how it captures and shares knowledge will decide its position in the business arena. This paper describes, through a literature review, how ontology as a tool can help an organization capture its knowledge.

Keywords: knowledge capture, ontology, technology, organization

Procedia PDF Downloads 606
8438 Employer Learning, Statistical Discrimination and University Prestige

Authors: Paola Bordon, Breno Braga

Abstract:

This paper investigates whether firms use university prestige to statistically discriminate among college graduates. The test is based on the employer learning literature which suggests that if firms use a characteristic for statistical discrimination, this variable should become less important for earnings as a worker gains labor market experience. In this framework, we use a regression discontinuity design to estimate a 19% wage premium for recent graduates of two of the most selective universities in Chile. However, we find that this premium decreases by 3 percentage points per year of labor market experience. These results suggest that employers use college selectivity as a signal of workers' quality when they leave school. However, as workers reveal their productivity throughout their careers, they become rewarded based on their true quality rather than the prestige of their college.
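The employer-learning test described above reduces to an interaction term: regress log wages on prestige, experience, and prestige × experience, and check whether the interaction coefficient is negative. A minimal sketch on synthetic data (the 19% premium and 3 p.p. decay are taken from the abstract; everything else is simulated):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000
prestige = rng.integers(0, 2, n)                 # selective-university dummy
exper = rng.uniform(0, 10, n)                    # years of experience
# Simulated log wages: 19% premium at entry, fading 3 p.p. per year.
log_wage = (1.0 + (0.19 - 0.03 * exper) * prestige
            + 0.02 * exper + rng.normal(0, 0.05, n))

# OLS with a prestige x experience interaction: a negative interaction
# coefficient is the employer-learning signature.
X = np.column_stack([np.ones(n), prestige, exper, prestige * exper])
beta, *_ = np.linalg.lstsq(X, log_wage, rcond=None)
print(beta.round(3))
```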

Keywords: employer learning, statistical discrimination, college returns, college selectivity

Procedia PDF Downloads 581
8437 vADL.net: A Software Architecture Tool with Support for All Architectural Concepts

Authors: Adel Smeda, Badr Najep

Abstract:

Software architecture is a method of describing the architecture of a software system at a high level of abstraction. It represents a common abstraction of a system that stakeholders can use as a basis for mutual understanding, negotiation, consensus, and communication. It also manifests the earliest design decisions about a system, and these early bindings carry weight far out of proportion to their individual gravity with respect to the system's remaining development, its deployment, and its maintenance life; it is therefore the earliest point at which design decisions governing the system to be built can be analyzed. In this paper, we present a tool to model the architecture of software systems. It represents the first method by which system defects can be detected and provides a clear representation of a system’s components and interactions at a high level of abstraction. It can be distinguished from other tools by its support for all software architecture elements. The tool is built using VB.net 2010. We used this tool to describe two well-known systems, i.e. Capitalize and Client/Server, and the descriptions we obtained support all architectural elements of the two systems.

Keywords: software architecture, architecture description languages, modeling

Procedia PDF Downloads 467
8436 Statistical Manufacturing Cell/Process Qualification Sample Size Optimization

Authors: Angad Arora

Abstract:

In production operations/manufacturing, a cell or line is typically a group of similar machines (computer numerical control (CNC), advanced cutting, 3D printing, or special-purpose machines). To qualify a typical manufacturing line/cell/new process, we ideally need a sample of parts that can flow through the process, after which we make a judgment on the health of the line/cell. However, with huge volumes and mass-production scope, such as in the mobile phone industry, for example, the actual cells or lines can number in the thousands, and qualifying each of them with statistical confidence means using very large samples, which adds to product/manufacturing cost and generates huge waste if the parts are not intended to be shipped to customers. To solve this, we propose a two-step statistical approach. We start with a small sample size and then objectively evaluate whether the process needs additional samples or not. For example, if a process is producing bad parts and we see those samples early, then there is a high chance that the process will not meet the desired yield, and there is no point in adding more samples. We used this hypothesis and developed a two-step binomial testing approach. Further, we show through results that we can achieve an 18-25% reduction in samples while keeping the same statistical confidence.
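The two-step idea can be sketched in a few lines (the thresholds and sample sizes below are illustrative, not the paper's actual plan): stop early when the first small sample already fails, and draw the second sample only when it passes, which is where the expected-sample-size saving comes from.

```python
from scipy.stats import binom

def two_stage_decision(d1, n1, r1, d2=None, n2=0, c=0):
    """Stage 1: reject early if defects d1 exceed the early-reject limit r1.
    Stage 2 (only if stage 1 passed): accept if pooled defects <= c."""
    if d1 > r1:
        return "reject", n1                      # early stop saves n2 parts
    if d2 is None:
        return "continue", n1                    # second sample is needed
    return ("accept" if d1 + d2 <= c else "reject"), n1 + n2

def expected_samples(p, n1, r1, n2):
    """Expected parts consumed: the second sample is only drawn when the
    first stage passes, which happens with probability P(X <= r1)."""
    return n1 + binom.cdf(r1, n1, p) * n2
```

A healthy process (low defect rate p) usually proceeds to stage 2, while a bad process is rejected after only n1 parts; averaging over cells is the source of the reported sample reduction.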

Keywords: statistics, data science, manufacturing process qualification, production planning

Procedia PDF Downloads 98
8435 An Approach Based on Statistics and Multi-Resolution Representation to Classify Mammograms

Authors: Nebi Gedik

Abstract:

One of the significant and persistent public health problems in the world is breast cancer. Early detection is very important to fight the disease, and mammography has been one of the most common and reliable methods to detect it in the early stages. However, interpretation is a difficult task, and computer-aided diagnosis (CAD) systems are needed to assist radiologists in providing both accurate and uniform evaluation of masses in mammograms. In this study, a multiresolution statistical method that classifies digitized mammograms as normal or abnormal is used to construct a CAD system. The mammogram images are represented by the wave atom transform, and this representation is partitioned into groups of coefficients that are treated independently. The CAD system is designed by calculating statistical features from each group of coefficients. The classification is performed using a support vector machine (SVM).
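The pipeline described, statistical features per coefficient group fed to an SVM, can be sketched as follows. Since the wave atom transform is not available in common libraries, random coefficient bands stand in for it here; only the feature/classifier structure mirrors the abstract.

```python
import numpy as np
from sklearn.svm import SVC

def band_features(groups):
    """Mean, standard deviation and energy of each coefficient group."""
    feats = []
    for g in groups:
        g = np.ravel(g)
        feats += [g.mean(), g.std(), float(np.sum(g ** 2))]
    return np.array(feats)

rng = np.random.default_rng(0)
labels = np.array([0, 1] * 20)                   # 40 toy "mammograms"
# Random bands stand in for wave atom coefficient groups; class-1 bands
# are mean-shifted so the two classes are separable.
X = np.array([band_features([rng.normal(loc=y, size=64) for _ in range(4)])
              for y in labels])

clf = SVC(kernel="rbf").fit(X, labels)
print(clf.score(X, labels))
```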

Keywords: wave atom transform, statistical features, multi-resolution representation, mammogram

Procedia PDF Downloads 223
8434 The Metacognition Levels of Students: A Research School of Physical Education and Sports at Anadolu University

Authors: Dilek Yalız Solmaz

Abstract:

Metacognition is an important factor for educating conscious individuals who are aware of their cognitive processes. In this respect, the purposes of this article are to find out the perceived metacognition level of Physical Education and Sports School students at Anadolu University and to identify whether metacognition levels display significant differences in terms of various variables. 416 Anadolu University Physical Education and Sports School students formed the research universe. The Meta-Cognitions Questionnaire, developed by Cartwright-Hatton and Wells and later shortened to the 30-item form (MCQ-30), was used. The MCQ-30, which was adapted into Turkish by Tosun and Irak, is a four-point agreement scale. In the data analysis, arithmetic mean, standard deviation, t-test and ANOVA were used. There is no statistically significant difference between the mean scores of female and male students for uncontrollableness and danger, cognitive awareness, cognitive confidence and positive beliefs, but there is a significant difference for the need to control thinking. There is no significant difference according to students' departments for any of the five dimensions. According to grade level, there is no significant difference for positive beliefs, cognitive confidence and need to control thinking, but there is a significant difference for uncontrollableness and danger and for cognitive awareness.

Keywords: meta cognition, physical education, sports school students, thinking

Procedia PDF Downloads 383
8433 Exploring the Spatial Characteristics of Mortality Map: A Statistical Area Perspective

Authors: Jung-Hong Hong, Jing-Cen Yang, Cai-Yu Ou

Abstract:

The analysis of geographic inequality heavily relies on the use of location-enabled statistical data and quantitative measures to present the spatial patterns of the selected phenomena and analyze their differences. To protect the privacy of individual instances and link to administrative units, point-based datasets are spatially aggregated to area-based statistical datasets, where only the overall status for the selected levels of spatial units is used for decision making. The partition of the spatial units thus has dominant influence on the outcomes of the analyzed results, well known as the Modifiable Areal Unit Problem (MAUP). A new spatial reference framework, the Taiwan Geographical Statistical Classification (TGSC), was recently introduced in Taiwan based on the spatial partition principle of homogeneous population and household counts. Compared to the traditional township units, TGSC provides additional levels of spatial units with finer granularity for presenting spatial phenomena and enables domain experts to select the appropriate dissemination level for publishing statistical data. This paper compares the results of using TGSC and township units respectively on mortality data and examines the spatial characteristics of the outcomes. For the mortality data of Taitung County between January 1st, 2008 and December 31st, 2010, the all-cause age-standardized death rate (ASDR) at the township level ranges from 571 to 1757 per 100,000 persons, whereas the 2nd dissemination area level (TGSC) shows greater variation, ranging from 0 to 2222 per 100,000. The finer granularity of the TGSC spatial units clearly provides better outcomes for identifying and evaluating geographic inequality and can be further analyzed with statistical measures from other perspectives (e.g., population, area, environment).
The management and analysis of the statistical data referring to the TGSC in this research is strongly supported by the use of Geographic Information System (GIS) technology. An integrated workflow is developed that consists of the processing of death certificates, the geocoding of street addresses, the quality assurance of geocoded results, the automatic calculation of statistical measures, the standardized encoding of measures and the geo-visualization of statistical outcomes. This paper also introduces a set of auxiliary measures from a geographic distribution perspective to further examine the hidden spatial characteristics of mortality data and justify the analyzed results. With a common statistical area framework like TGSC, the preliminary results demonstrate promising potential for developing a web-based statistical service that can effectively access domain statistical data and present the analyzed outcomes in meaningful ways to avoid wrong decision making.
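The age-standardized death rate (ASDR) used above weights each age group's crude rate by a standard population share (direct standardization). A toy computation with made-up figures, not Taitung data:

```python
# Direct age standardization: weight each age group's crude death rate
# by a standard population share. All figures are illustrative.
deaths     = {"0-39": 20, "40-64": 120, "65+": 400}
population = {"0-39": 40000, "40-64": 30000, "65+": 10000}
std_share  = {"0-39": 0.55, "40-64": 0.30, "65+": 0.15}

asdr = sum(std_share[a] * deaths[a] / population[a] for a in deaths) * 100000
print(round(asdr, 1))   # per 100,000 persons
```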

Keywords: mortality map, spatial patterns, statistical area, variation

Procedia PDF Downloads 260
8432 Direct Translation vs. Pivot Language Translation for Persian-Spanish Low-Resourced Statistical Machine Translation System

Authors: Benyamin Ahmadnia, Javier Serrano

Abstract:

In this paper we compare two different approaches for translating from Persian to Spanish, a language pair with a scarce parallel corpus. The first approach involves direct transfer using the statistical machine translation system available for this language pair. The second approach involves translation through English as a pivot language, which has more translation resources and more advanced translation systems available. The results show that it is possible to achieve better translation quality using English as a pivot language: the pivot approach outperforms direct translation from Persian to Spanish. Our best result is the pivot system, which scores higher than direct translation by 1.12 BLEU points.
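The BLEU comparison between direct and pivot outputs can be reproduced in a few lines; the toy sentences below are invented, and smoothing is added because the corpus is tiny.

```python
from nltk.translate.bleu_score import corpus_bleu, SmoothingFunction

# Invented toy references and system outputs (pre-tokenized).
refs = [[["the", "house", "is", "small"]],
        [["he", "reads", "a", "book"]]]
direct = [["the", "house", "small"], ["he", "reads", "book"]]
pivot = [["the", "house", "is", "small"], ["he", "reads", "a", "book"]]

smooth = SmoothingFunction().method1        # tiny corpora need smoothing
bleu_direct = corpus_bleu(refs, direct, smoothing_function=smooth)
bleu_pivot = corpus_bleu(refs, pivot, smoothing_function=smooth)
print(round(100 * (bleu_pivot - bleu_direct), 2))   # BLEU-point gap
```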

Keywords: statistical machine translation, direct translation approach, pivot language translation approach, parallel corpus

Procedia PDF Downloads 488
8431 Poetry as Valuable Tool for Tackling Climate Change and Environmental Pollution

Authors: Benjamin Anabaraonye

Abstract:

Our environment is our entitlement, and it is our duty to guard it for the safety of our society. It is, therefore, in our best interest to explore the tools required to tackle environmental pollution, a major cause of climate change. Our study has found poetry to be a valuable tool for tackling climate change and environmental pollution. This study explores the science of poetry and how important it is for scientists and engineers to develop their creativity in order to obtain the skills needed to tackle these global challenges. Poetry has proved a great tool for climate change education, which in turn brings about climate change adaptation and mitigation. This paper is, therefore, an urgent clarion call to rise to our responsibility for a sustainable future.

Keywords: climate change, education, environment, poetry

Procedia PDF Downloads 206
8430 Theology of Science and Technology as a Tool for Peace Education

Authors: Jonas Chikelue Ogbuefi

Abstract:

Science and technology have a major impact on societal peace; they offer support to teaching and learning, cut costs, and offer solutions to the current agitations and militancy in Nigeria today. Christianity, for instance, not only changed and formed the western world over the past 2,022 years but still has a substantial role to play in society through liquid ecclesiology. This paper interrogated the impact of the theology of science and technology as a tool for peace sustainability through peace education in Nigeria. The method adopted is a historical and descriptive method of analysis. It was discovered that a large number of Nigerian citizens lack almost all the basic things needed for a decent standard of living, such as shelter, meaningful employment, and clothing, which is the root cause of all agitations in Nigeria. Based on the above findings, the paper contends that the government alone cannot restore peace in Nigeria; its inability to do so calls for all religious actors to be involved. The main thrust and recommendation of this paper is to challenge religious actors to implement the theology of science and technology as a tool for peace restoration and to network with both the government and the private sector to make funds available to budding and existing entrepreneurs, using science and technology as a tool for peace and economic sustainability. This paper thus views the theology of science and technology as a tool for peace and economic sustainability in Nigeria.

Keywords: theology, science, technology, peace education

Procedia PDF Downloads 84
8429 Proposal of a Damage Inspection Tool After Earthquakes: Case of Algerian Buildings

Authors: Akkouche Karim, Nekmouche Aghiles, Bouzid Leyla

Abstract:

This study focuses on the development of a multifunctional expert system (ES) called the post-seismic damage inspection tool (PSDIT), a powerful tool which allows the evaluation, processing and archiving of the data collected after earthquakes. PSDIT can be operated by two user types: an ordinary user (engineer, expert or architect) for visual damage inspection, and an administrative user for updating the knowledge and/or adding or removing ordinary users. Knowledge acquisition is driven by a hierarchical knowledge model; information from investigation reports and feedback from expert/engineer questionnaires form part of it.

Keywords: buildings, earthquake, seismic damage, damage assessment, expert system

Procedia PDF Downloads 88
8428 Analysis of the Engineering Judgement Influence on the Selection of Geotechnical Parameters Characteristic Values

Authors: K. Ivandic, F. Dodigovic, D. Stuhec, S. Strelec

Abstract:

A characteristic value of a given geotechnical parameter results from an engineering assessment. Its selection has to be based on technical principles and standards of engineering practice. It has been shown that the results of engineering assessments by different authors for the same problem and input data are significantly dispersed. A survey was conducted in which participants had to estimate the force that causes a 10 cm displacement at the top of an axially compressed in-situ pile. Fifty experts from all over the world took part in it. The lowest estimated force value was 42% and the highest was 133% of the force measured in the corresponding static pile load test. Such extreme values result in significantly different technical solutions to the same engineering task. When selecting a characteristic value of a geotechnical parameter, the influence of the engineering assessment can be reduced by using statistical methods. An informative annex of Eurocode 1 prescribes the method of selecting the characteristic values of material properties. This is followed by Eurocode 7 with certain specifics linked to selecting characteristic values of geotechnical parameters. The paper shows the procedure of selecting characteristic values of a geotechnical parameter by using a statistical method with different initial conditions. The aim of the paper is to quantify the engineering assessment in the example of determining a characteristic value of a specific geotechnical parameter. It is assumed that this assessment is a random variable whose statistical features will be determined. For this purpose, a survey was conducted among relevant experts in the field of geotechnical engineering. Finally, the results of the survey and the application of the statistical method were compared.
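One common statistical route to a characteristic value, in the spirit of the Eurocode approach for a 5% fractile with unknown variance, is X_k = m − k_n·s with a Student-t based k_n. This is a sketch of that generic formula, not the paper's exact procedure, and the measurements below are hypothetical.

```python
import math
from statistics import mean, stdev
from scipy.stats import t

def characteristic_value(samples, confidence=0.95):
    """5%-fractile estimate with unknown variance: X_k = m - k_n * s,
    with k_n = t_(0.95, n-1) * sqrt(1/n + 1) (single-value fractile)."""
    n = len(samples)
    m, s = mean(samples), stdev(samples)
    k_n = t.ppf(confidence, df=n - 1) * math.sqrt(1.0 / n + 1.0)
    return m - k_n * s

# Five hypothetical friction-angle measurements (degrees).
print(round(characteristic_value([31.0, 33.5, 29.8, 32.2, 30.5]), 2))
```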

Keywords: characteristic values, engineering judgement, Eurocode 7, statistical methods

Procedia PDF Downloads 296
8427 Tool Wear Monitoring of High Speed Milling Based on Vibratory Signal Processing

Authors: Hadjadj Abdechafik, Kious Mecheri, Ameur Aissa

Abstract:

The objective of this study is to develop a method of processing the vibratory signals generated during horizontal high speed milling without any coolant, in order to establish a monitoring system able to improve machining performance. Many tests were carried out on a horizontal high speed centre (PCI Météor 10) under given cutting conditions, using a milling cutter with a single insert and measuring its frontal wear from its new state, considered as the reference state, to a worn state considered unsuitable for further use. The results obtained show that the first harmonic follows well the evolution of frontal wear. In addition, a wavelet transform is used for signal processing and is found to be useful for observing the evolution of the wavelet approximations through the cutting tool's life. The power and Root Mean Square (RMS) values of the wavelet-transformed signal gave the best results and can be used for tool wear estimation. All these features can constitute suitable indicators for effective detection of tool wear and can then be used as input parameters of an online monitoring system. We also noted the remarkable influence of the machining cycle on the quality of measurements through the introduction of a bias into the signal; this phenomenon appears particularly in horizontal milling and is ignored in the majority of studies.
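Two of the indicators mentioned, the signal RMS and the first-harmonic amplitude, can be extracted with a plain FFT. The wavelet decomposition step is omitted here (PyWavelets could supply it), and the spindle frequency is assumed known; this is a generic sketch, not the study's exact processing chain.

```python
import numpy as np

def wear_indicators(signal, fs, spindle_hz):
    """RMS of the raw vibration signal and the amplitude of its first
    harmonic (at the spindle frequency), estimated via the FFT."""
    n = len(signal)
    rms = float(np.sqrt(np.mean(signal ** 2)))
    amplitude = 2.0 * np.abs(np.fft.rfft(signal)) / n   # one-sided spectrum
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    first_harmonic = float(amplitude[np.argmin(np.abs(freqs - spindle_hz))])
    return rms, first_harmonic
```

Tracking these two values over the tool's life gives the wear trend a monitoring system can threshold on.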

Keywords: flank wear, vibration, milling, signal processing, monitoring

Procedia PDF Downloads 598
8426 Evaluation of Green Logistics Performance: An Application of Analytic Hierarchy Process Method for Ranking Environmental Indicators

Authors: Eduarda Dutra De Souza, Gabriela Hammes, Marina Bouzon, Carlos M. Taboada Rodriguez

Abstract:

The search for ways to minimize harmful impacts on the environment has become the focus of global society, affecting mainly how organizations are managed. Thus, companies have sought to transform their activities into environmentally friendly initiatives by applying green practices throughout their supply chains. In the logistics domain, the implementation of environmentally sound practices is still in its infancy in emerging countries such as Brazil. Given the need to reduce these environmental damages, this study aims to evaluate the performance of green logistics (GL) in the plastics industry sector in order to help improve environmental performance within organizations and reduce the impact caused by their activities. The performance tool was based on theoretical research and the input of experts in the field. The Analytic Hierarchy Process (AHP) was used to prioritize green practices and assign weights to the indicators contained in the proposed tool. The tool also allows the results to be combined into a single indicator. The developed tool was applied in an industry from the plastic packaging sector; however, it may be applied in different industry sectors and is adaptable to companies of different sizes. Besides its contributions to the literature, this work also presents future paths of research in the field of green logistics.
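The AHP weighting step works by taking the principal eigenvector of a pairwise comparison matrix and checking a consistency ratio. The 3x3 matrix below is illustrative, not the study's actual expert judgments.

```python
import numpy as np

# Illustrative 3x3 Saaty-scale pairwise comparison of three indicators.
A = np.array([[1.0, 3.0, 5.0],
              [1.0 / 3.0, 1.0, 3.0],
              [1.0 / 5.0, 1.0 / 3.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                  # priority weights, sum to 1

lam_max = eigvals.real[k]
ci = (lam_max - 3) / (3 - 1)              # consistency index
cr = ci / 0.58                            # random index RI = 0.58 for n = 3
print(weights.round(3), round(cr, 3))     # CR < 0.1 means acceptable
```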

Keywords: AHP, green logistics, green supply chain, performance evaluation

Procedia PDF Downloads 160
8425 Use of Focus Group Interviews to Design a Health Impact Measurement Tool: A Volunteering Case Study

Authors: Valentine Seymour

Abstract:

Environmental volunteering organisations use questionnaires to explore the relationship between environmental volunteers and their health. To the author's best knowledge, however, no one has explored volunteers' own perception of health, which could be taken into account when designing a health impact measurement tool and thereby improve communication. This paper examines environmental volunteers' perceptions of health, knowledge which can be used to design a health impact measurement tool. This study uses focus group interviews, content analysis, and a general inductive approach to explore the health perceptions of volunteers who engage in environmental volunteering activities with the UK charity The Conservation Volunteers. Findings showed that the volunteer groups studied were relatively similar in how they defined the term health, with their overall conceptual model closely resembling the World Health Organization's 1948 definition. This suggests that future health impact measurement tools in the environmental volunteering sector could base their design on the World Health Organization's definition.

Keywords: health perception, impact measurement, mental models, tool development

Procedia PDF Downloads 154
8424 Crashworthiness Optimization of an Automotive Front Bumper in Composite Material

Authors: S. Boria

Abstract:

In recent years, the crashworthiness of an automotive body structure can be improved from the beginning of the design stage thanks to the development of specific optimization tools. It is well known that finite element codes can help the designer investigate the crash performance of structures under dynamic impact. Therefore, by coupling nonlinear mathematical programming procedures and statistical techniques with FE simulations, it is possible to optimize the design with a reduced number of analytical evaluations. In engineering applications, many optimization methods that are based on statistical techniques and utilize estimated models, called meta-models, are quickly spreading. A meta-model is an approximation of a detailed simulation model based on a dataset of inputs identified by the design of experiments (DOE); the number of simulations needed to build it depends on the number of variables. Among the various types of meta-modeling techniques, the Kriging method appears excellent in accuracy, robustness and efficiency compared to the others when applied to crashworthiness optimization. Therefore such a meta-model was used in this work in order to improve the structural optimization of a composite-material bumper for a racing car subjected to frontal impact. The specific energy absorption represents the objective function to maximize, and the geometrical parameters subjected to design constraints are the design variables. The LS-DYNA code was interfaced with the LS-OPT tool in order to find the optimized solution through a domain reduction strategy. With the use of the Kriging meta-model, the crashworthiness characteristics of the composite bumper were improved.
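The meta-model loop can be sketched with a Gaussian process surrogate (Kriging's statistical twin): sample a DOE, fit the surrogate, and optimize the cheap prediction instead of the simulator. An analytic toy function stands in for the LS-DYNA crash model here; everything numeric is illustrative.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def sea(x):
    """Toy stand-in for specific energy absorption vs. one geometry
    parameter (a real study would run an LS-DYNA simulation here)."""
    return np.sin(3 * x) + x

X_doe = np.linspace(0.0, 2.0, 8).reshape(-1, 1)      # DOE sample points
y_doe = sea(X_doe).ravel()                           # "simulation" results

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5)).fit(X_doe, y_doe)

# Maximize the cheap surrogate on a dense grid instead of the simulator.
grid = np.linspace(0.0, 2.0, 401).reshape(-1, 1)
best_x = float(grid[np.argmax(gp.predict(grid))][0])
print(best_x)
```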

Keywords: composite material, crashworthiness, finite element analysis, optimization

Procedia PDF Downloads 256
8423 Experimental Study and Neural Network Modeling in Prediction of Surface Roughness on Dry Turning Using Two Different Cutting Tool Nose Radii

Authors: Deba Kumar Sarma, Sanjib Kr. Rajbongshi

Abstract:

Surface finish is an important product quality in machining. First, experiments were carried out to investigate the effect of the cutting tool nose radius (1 mm and 0.65 mm) on surface finish, together with the process parameters of cutting speed, feed and depth of cut. A two-level, four-parameter full factorial design was used to cover all possible cutting conditions. A commercial mild steel bar and high speed steel (HSS) were used as the work-piece and cutting tool materials, respectively. In order to obtain the functional relationship between the process parameters and surface roughness, a neural network was used, which was found to be capable of predicting surface roughness within a reasonable degree of accuracy. It was observed that a tool nose radius of 1 mm provides a better surface finish than 0.65 mm and that feed rate has a significant influence on surface finish.
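The two-level, four-parameter full factorial design amounts to enumerating 2^4 = 16 parameter combinations; a sketch with illustrative level values (not the paper's actual settings):

```python
from itertools import product

# Two-level full factorial design over four cutting parameters
# (level values are illustrative, not the paper's settings).
levels = {
    "speed_m_min": (30, 60),
    "feed_mm_rev": (0.05, 0.10),
    "depth_mm": (0.5, 1.0),
    "nose_radius_mm": (0.65, 1.0),
}
runs = [dict(zip(levels, combo)) for combo in product(*levels.values())]
print(len(runs))   # 2^4 = 16 experimental runs
```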

Keywords: full factorial design, neural network, nose radius, surface finish

Procedia PDF Downloads 368
8422 Sensor Validation Using Bottleneck Neural Network and Variable Reconstruction

Authors: Somia Bouzid, Messaoud Ramdani

Abstract:

The success of any diagnosis strategy critically depends on the sensors measuring process variables. This paper presents a sensor fault detection and diagnosis method based on a Bottleneck Neural Network (BNN). The BNN approach is used as a statistical process control tool for drinking water distribution (DWD) systems to detect and isolate sensor faults. The variable reconstruction approach is very useful for sensor fault isolation; the method is validated in simulation on a nonlinear system, an actual drinking water distribution system. Several results are presented.
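A bottleneck (auto-associative) network reconstructs its own inputs through a narrow hidden layer, so a faulty sensor shows up as a large reconstruction residual. A minimal sketch with two simulated correlated sensors; the real DWD system and the variable reconstruction step are not modeled here.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
x1 = rng.uniform(0.0, 1.0, 500)
# Two correlated "sensors"; the network learns their relationship.
X = np.column_stack([x1, 2.0 * x1 + rng.normal(0, 0.01, 500)])

# Auto-associative net with a 1-unit bottleneck (NLPCA-style): it can
# only reproduce inputs that lie on the learned 1-D manifold.
bnn = MLPRegressor(hidden_layer_sizes=(8, 1, 8), activation="tanh",
                   max_iter=5000, random_state=0).fit(X, X)

normal_res = np.abs(bnn.predict(X) - X).mean()
faulty = X.copy()
faulty[:, 1] += 0.5                       # simulated bias fault on sensor 2
fault_res = np.abs(bnn.predict(faulty) - faulty).mean()
print(normal_res < fault_res)             # the fault raises the residual
```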

Keywords: fault detection, localization, PCA, NLPCA, auto-associative neural network

Procedia PDF Downloads 389
8421 A Comparative Analysis of a Custom Optimization Experiment with Confidence Intervals in AnyLogic and OptQuest

Authors: Felipe Haro, Soheila Antar

Abstract:

This paper introduces a custom optimization experiment developed in AnyLogic, based on genetic algorithms, designed to ensure reliable optimization results by incorporating Monte Carlo simulations and achieving a specified confidence level. To validate the custom experiment, we compared its performance with AnyLogic's built-in OptQuest optimization method across three distinct problems. Statistical analyses, including Welch's t-test, were conducted to assess the differences in performance. The results demonstrate that while the custom experiment shows advantages in certain scenarios, both methods perform comparably in others, confirming the custom approach as a reliable and effective tool for optimization under uncertainty.
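Welch's t-test, used above to compare the two optimizers, drops the equal-variance assumption of the ordinary t-test; a sketch on simulated replication results (the numbers are stand-ins, not the paper's data):

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(42)
# Stand-in objective values from 30 stochastic replications of each
# optimizer (not the paper's actual data).
custom_ga = rng.normal(loc=100.0, scale=5.0, size=30)
optquest = rng.normal(loc=100.5, scale=7.0, size=30)

# Welch's test: equal_var=False drops the equal-variance assumption,
# appropriate when the optimizers have different run-to-run spread.
stat, p = ttest_ind(custom_ga, optquest, equal_var=False)
print("significant" if p < 0.05 else "comparable")
```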

Keywords: optimization, confidence intervals, Monte Carlo simulation, OptQuest, AnyLogic

Procedia PDF Downloads 20
8420 Radar Signal Detection Using Neural Networks in Log-Normal Clutter for Multiple Targets Situations

Authors: Boudemagh Naime

Abstract:

Automatic radar detection requires methods of adapting to variations in the background clutter in order to control the false alarm rate. The problem becomes more complicated in non-Gaussian environments. In fact, the conventional approach in real-time applications requires complex statistical modeling and many computational operations. To overcome these constraints, we propose another approach based on an artificial neural network (ANN-CMLD-CFAR) using a Back Propagation (BP) training algorithm. The considered environment follows a log-normal distribution in the presence of multiple Rayleigh targets. To evaluate the performance of the considered detector, several situations, such as the scale parameter and the number of interfering targets, have been investigated. The simulation results show that the ANN-CMLD-CFAR processor outperforms the conventional statistical one.
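For context, the adaptive-threshold idea underlying the detectors compared above can be sketched as a basic cell-averaging CFAR; the log-normal clutter and injected target below are simulated stand-ins, and the paper's CMLD variant uses censored ordered cells rather than a plain mean.

```python
import numpy as np

def ca_cfar(power, guard=2, train=8, scale=5.0):
    """Cell-averaging CFAR: compare each cell against a scaled mean of
    its training cells (guard cells around the cell under test excluded)."""
    n = len(power)
    hits = np.zeros(n, dtype=bool)
    for i in range(train + guard, n - train - guard):
        left = power[i - guard - train:i - guard]
        right = power[i + guard + 1:i + guard + 1 + train]
        noise = np.mean(np.concatenate([left, right]))
        hits[i] = power[i] > scale * noise
    return hits

rng = np.random.default_rng(1)
clutter = rng.lognormal(mean=0.0, sigma=0.5, size=200)   # log-normal clutter
clutter[100] += 50.0                                     # injected target
print(bool(ca_cfar(clutter)[100]))                       # target detected?
```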

Keywords: radar detection, ANN-CMLD-CFAR, log-normal clutter, statistical modelling

Procedia PDF Downloads 365
8419 Enhancing Single Channel Minimum Quantity Lubrication through Bypass Controlled Design for Deep Hole Drilling with Small Diameter Tool

Authors: Yongrong Li, Ralf Domroes

Abstract:

Due to significant energy savings, enablement of higher machining speeds and environmentally friendly features, Minimum Quantity Lubrication (MQL) has been used efficiently for many machining processes. However, in the deep hole drilling field (small tool diameter D < 5 mm) with long tools (length L > 25xD), a single channel MQL system has always been a bottleneck. Single channel MQL, based on the Venturi principle, suffers from an insufficient oil quantity caused by the drop in pressure difference during the deep hole drilling process. In this paper, a system concept based on a bypass design is explored for its ability to dynamically reach the required pressure difference between the air inlet and the inside of the aerosol generator, so that the oil volume demanded by deep hole drilling can be generated and delivered to the tool tips. The system concept has been investigated in static and dynamic laboratory testing. In the static test, the oil volume with and without bypass control was measured, showing a potential oil quantity increase of up to 1000%. A spray pattern test demonstrated the differences in aerosol particle size, aerosol distribution and reaction time between the single channel and the bypass-controlled single channel MQL systems. A dynamic trial machining test of deep hole drilling (drill tool D = 4.5 mm, L = 40xD) was carried out with the proposed system on a difficult-to-machine material, AlSi7Mg. The tool wear over 100 meters of drilling was tracked and analyzed. The results show that single channel MQL with bypass control can overcome the limitation and enhance deep hole drilling with a small tool. The optimized combination of inlet air pressure and bypass control results in a high quality oil delivery to the tool tips with a uniform and continuous aerosol flow.

Keywords: deep hole drilling, green production, Minimum Quantity Lubrication (MQL), near dry machining

Procedia PDF Downloads 206
8418 Statistical Randomness Testing of Some Second Round Candidate Algorithms of CAESAR Competition

Authors: Fatih Sulak, Betül A. Özdemir, Beyza Bozdemir

Abstract:

In order to advance symmetric key research, several competitions have been organized by bodies such as the National Institute of Standards and Technology (NIST) and the International Association for Cryptologic Research (IACR). In recent years, the importance of authenticated encryption has rapidly increased because of the need to provide integrity, confidentiality, and authenticity simultaneously. Therefore, in January 2013, IACR announced the Competition for Authenticated Encryption: Security, Applicability, and Robustness (CAESAR Competition), which will select secure and efficient algorithms for authenticated encryption. Cryptographic algorithms are expected to behave like random mappings; hence, it is important to apply statistical randomness tests to their outputs. In this work, the statistical randomness tests in the NIST Test Suite, together with other recently designed randomness tests, are applied to six second-round algorithms of the CAESAR Competition. It is observed that AEGIS achieves randomness after 3 rounds, the Ascon permutation function after 1 round, the Joltik encryption function after 9 rounds, the Morus state update function after 3 rounds, Pi-cipher after 1 round, and Tiaoxin after 1 round.
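As a flavor of what such a test suite checks, here is a minimal sketch of the first test in NIST SP 800-22, the frequency (monobit) test, which compares the balance of ones and zeros against what a random sequence would show. The input sequences below are toy examples, not CAESAR candidate outputs:

```python
import math

def monobit_frequency_test(bits):
    """Frequency (monobit) test from NIST SP 800-22: returns the p-value
    for the null hypothesis that the bits are random (pass if p >= 0.01)."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)  # map 0/1 to -1/+1 and sum
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))

balanced = [0, 1] * 500   # perfectly balanced toy sequence: passes
constant = [1] * 1000     # clearly non-random toy sequence: fails
print(monobit_frequency_test(balanced))  # 1.0
print(monobit_frequency_test(constant))  # ~0
```

The full suite applies many such tests (runs, block frequency, linear complexity, etc.) to large samples of algorithm output, and a reduced-round primitive "achieves randomness" once no test rejects it.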

Keywords: authenticated encryption, CAESAR competition, NIST test suite, statistical randomness tests

Procedia PDF Downloads 316
8417 Experimental Investigation on Over-Cut in Ultrasonic Machining of WC-Co Composite

Authors: Ravinder Kataria, Jatinder Kumar, B. S. Pabla

Abstract:

Ultrasonic machining is one of the most widely used non-traditional machining processes for materials that are relatively brittle, hard, and fragile, such as advanced ceramics, refractories, crystals, and quartz. The present article investigates the impact of different experimental conditions (power rating, cobalt content, tool material, work piece thickness, tool geometry, and abrasive grit size) on over-cut in ultrasonic drilling of WC-Co composite material. Taguchi's L-36 orthogonal array was employed for conducting the experiments. Significant factors were identified using the analysis of variance (ANOVA) test. The experimental results revealed that abrasive grit size and tool material are the most significant factors for over-cut.
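The ANOVA screening described above boils down, for each factor, to comparing between-level variation with within-level variation. A minimal sketch of the one-way F statistic, with purely hypothetical over-cut readings (invented for illustration, not the paper's data) grouped by abrasive grit size:

```python
import math

def one_way_anova_f(groups):
    """One-way ANOVA F statistic: between-group mean square over
    within-group mean square, for a list of observation lists."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g)
                    for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical over-cut readings for three abrasive grit sizes.
overcut_by_grit = [[1, 2, 3], [2, 3, 4], [7, 8, 9]]
f_stat = one_way_anova_f(overcut_by_grit)
print(f"F = {f_stat:.2f}")
```

A factor is declared significant when its F value exceeds the critical value of the F distribution for the chosen confidence level; a full Taguchi analysis additionally apportions the total variation into percentage contributions per factor.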

Keywords: ANOVA, abrasive grit size, Taguchi, WC-Co, ultrasonic machining

Procedia PDF Downloads 398
8416 Statistical Characteristics of Distribution of Radiation-Induced Defects under Random Generation

Authors: P. Selyshchev

Abstract:

We consider fluctuations of the defect density, taking the interaction between defects into account. A stochastic field of displacement generation rates gives rise to a random defect distribution. We determine the statistical characteristics (mean and dispersion) of the random field of point defect distribution as functions of the defect generation parameters, the temperature, and the properties of the irradiated crystal.
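The paper's treatment is analytical, but the headline effect, that a stochastic generation rate inflates the dispersion of the defect count above its mean, can be checked with a toy Monte Carlo sketch (all parameter values are illustrative and are not taken from the paper):

```python
import math
import random
import statistics

def simulate_defect_counts(rate_mean, rate_sd, n_sites, seed=0):
    """Toy Monte Carlo: each site sees a Gaussian-fluctuating generation
    rate (truncated at 0); its defect count is Poisson with that rate."""
    rng = random.Random(seed)
    counts = []
    for _ in range(n_sites):
        rate = max(0.0, rng.gauss(rate_mean, rate_sd))
        # Poisson sampling via Knuth's algorithm (fine for small rates)
        threshold, k, p = math.exp(-rate), 0, 1.0
        while True:
            p *= rng.random()
            if p <= threshold:
                break
            k += 1
        counts.append(k)
    return counts

counts = simulate_defect_counts(rate_mean=5.0, rate_sd=1.0, n_sites=20000)
# Law of total variance: dispersion ~ rate mean + rate variance (~6 here),
# i.e. generation-rate fluctuations push the field above Poisson dispersion.
print("mean:", statistics.mean(counts))
print("dispersion:", statistics.pvariance(counts))
```

This sketch ignores defect interaction, which is exactly what the paper adds; interaction (e.g., recombination) would further reshape the mean and dispersion.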

Keywords: irradiation, primary defects, interaction, fluctuations

Procedia PDF Downloads 343
8415 Advances in Artificial Intelligence Using Speech Recognition

Authors: Khaled M. Alhawiti

Abstract:

This study presents a retrospective survey of speech recognition systems and artificial intelligence. Speech recognition has become one of the most widely used technologies, as it offers a great opportunity to interact and communicate with automated machines. In particular, speech recognition helps its users perform their daily routine tasks in a more convenient and effective manner. This research illustrates recent technological advancements associated with artificial intelligence. Recent research has identified the decoding of speech as the foremost challenge in speech recognition. To address this challenge, researchers have developed different statistical models. Some of the most prominent statistical models include the acoustic model (AM), the language model (LM), the lexicon model, and hidden Markov models (HMM). This research will help in understanding all of these statistical models of speech recognition. Researchers have also formulated different decoding methods, which are utilized for realistic decoding tasks and constrained artificial languages. These decoding methods include pattern recognition, acoustic-phonetic, and artificial intelligence approaches. Artificial intelligence has been recognized as the most efficient and reliable of the methods used in speech recognition.
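As a concrete illustration of how an HMM-based statistical model scores speech, here is a minimal sketch of the forward algorithm, which computes the likelihood of an observation sequence under a hidden Markov model. The two states and the quantized observations "a"/"b" are toy stand-ins for phoneme states and acoustic feature symbols:

```python
def forward(obs, states, start_p, trans_p, emit_p):
    """Forward algorithm: total likelihood of an observation sequence
    under a hidden Markov model, summed over all state paths."""
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: emit_p[s][o] * sum(alpha[r] * trans_p[r][s] for r in states)
                 for s in states}
    return sum(alpha.values())

# Toy model: two "phoneme" states emitting quantized acoustic symbols a/b.
states = ["s1", "s2"]
start_p = {"s1": 0.6, "s2": 0.4}
trans_p = {"s1": {"s1": 0.7, "s2": 0.3}, "s2": {"s1": 0.4, "s2": 0.6}}
emit_p = {"s1": {"a": 0.9, "b": 0.1}, "s2": {"a": 0.2, "b": 0.8}}

print(forward(("a", "b"), states, start_p, trans_p, emit_p))  # ~0.209
```

In a real recognizer the acoustic model supplies the emission probabilities, the language model scores word sequences, and the decoder searches for the word sequence maximizing the combined likelihood.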

Keywords: speech recognition, acoustic phonetic, artificial intelligence, hidden Markov models (HMM), statistical models of speech recognition, human machine performance

Procedia PDF Downloads 478
8414 Self-Determination among Individuals with Intellectual Disability: An Experiment

Authors: Wasim Ahmad, Bir Singh Chavan, Nazli Ahmad

Abstract:

Objectives: The present investigation attempts to determine the efficacy of training special educators to promote self-determination among individuals with intellectual disability. Methods: The study equipped the special educators with the skills and knowledge needed to train individuals with intellectual disability to practice self-determination. Subjects: Special educators (N=25) were selected for training on self-determination among individuals with intellectual disability. After receiving the training, they in turn trained selected individuals with intellectual disability (N=50). Tool: The Self-Determination Scale for Adults with Mild Mental Retardation (SDSAMR), developed by Keshwal and Thressiakutty (2010), was used. It is a reliable and valid tool used by many researchers, with 36 items distributed across five domains: personal management, community participation, recreation and leisure time, choice making, and problem solving. Analysis: The collected data were analyzed using statistical techniques such as the t-test, ANCOVA, and the post hoc Tukey test. Results: The findings reveal a significant difference at the 1% level between the pre- and post-test mean scores (t = 15.56) on self-determination concepts among the special educators, indicating that the training enhanced their command of the concept of self-determination among individuals with intellectual disability. The study also reveals that the training on transition planning received by the special educators was effective, because they were able to put the concept into practice by training individuals with intellectual disability to be self-determined: there was a significant difference at the 1% level between the pre- and post-test mean scores (t = 16.61) of self-determination among individuals with intellectual disability.
Conclusion: To conclude, the training had a remarkable impact on the performance of individuals with intellectual disability on self-determination.
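The reported t values compare pre- and post-test scores on the same subjects, i.e., a paired-samples design. A minimal sketch of the paired t statistic (the scores below are invented for illustration and are not the study's data):

```python
import math
import statistics

def paired_t(pre, post):
    """Paired-samples t statistic: mean of the per-subject differences
    divided by the standard error of those differences."""
    diffs = [after - before for before, after in zip(pre, post)]
    n = len(diffs)
    return statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

# Invented pre/post scores for four subjects (not the study's data).
pre_scores = [10, 10, 10, 10]
post_scores = [13, 15, 13, 15]
print(f"t = {paired_t(pre_scores, post_scores):.2f}")
```

The resulting t is then compared against the critical value of the t distribution with n - 1 degrees of freedom at the chosen significance level (1% in the study).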

Keywords: experiment, individuals with intellectual disability, self-determination, special educators

Procedia PDF Downloads 335
8413 A Tool for Assessing Performance and Structural Quality of Business Process

Authors: Mariem Kchaou, Wiem Khlif, Faiez Gargouri

Abstract:

Modeling business processes is an essential task when evaluating, improving, or documenting existing business processes. To be useful in such tasks, a business process model (BPM) must have high structural quality and high performance. Evaluating the performance of a business process model is a necessary step toward reducing time and cost, while assessing its structural quality aims to improve the understandability and modifiability of the BPMN model. To achieve these objectives, a set of structural and performance measures has been proposed. Given the diversity of these measures, we propose a framework that integrates both structural and performance aspects in order to classify them. Our measure classification is based on business process model perspectives (e.g., informational, functional, organizational, behavioral, and temporal) and on the elements (activity, event, actor, etc.) involved in computing the measures. We then implement this framework in a tool for assessing the structural quality and the performance of a business process. The tool helps designers select an appropriate subset of measures associated with the corresponding perspective, and calculate and interpret their values in order to improve the structural quality and the performance of the model.
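The classification framework described above, with measures indexed by kind (structural vs. performance), perspective, and the BPMN elements involved, can be sketched as a small data model. The measure names below are illustrative examples, not the paper's actual catalog:

```python
from dataclasses import dataclass

PERSPECTIVES = {"informational", "functional", "organizational",
                "behavioral", "temporal"}

@dataclass(frozen=True)
class Measure:
    name: str
    kind: str              # "structural" or "performance"
    perspective: str       # one of PERSPECTIVES
    elements: tuple        # BPMN elements involved in computing the measure

def select(measures, kind=None, perspective=None):
    """Return the subset of measures matching the requested criteria."""
    return [m for m in measures
            if (kind is None or m.kind == kind)
            and (perspective is None or m.perspective == perspective)]

# Illustrative catalog (hypothetical measure names).
catalog = [
    Measure("number_of_activities", "structural", "functional", ("activity",)),
    Measure("control_flow_complexity", "structural", "behavioral", ("gateway",)),
    Measure("average_cycle_time", "performance", "temporal", ("activity", "event")),
    Measure("actor_workload", "performance", "organizational", ("actor", "activity")),
]

print([m.name for m in select(catalog, kind="performance")])
```

Such a model lets the tool answer exactly the queries the abstract describes: give the designer every performance measure for the temporal perspective, or every structural measure touching activities.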

Keywords: performance, structural quality, perspectives, tool, classification framework, measures

Procedia PDF Downloads 157
8412 Tool Condition Monitoring of Ceramic Inserted Tools in High Speed Machining through Image Processing

Authors: Javier A. Dominguez Caballero, Graeme A. Manson, Matthew B. Marshall

Abstract:

Cutting tools with ceramic inserts are often used for machining many types of superalloy, mainly due to their high strength and thermal resistance. Nevertheless, during the cutting process, the plastic flow wear generated in these inserts promotes the initiation and propagation of cracks under high temperature and high mechanical stress, which leads to highly variable failure of the cutting tool. This article explores the relationship between the continuous wear that ceramic SiAlON inserts (solid solutions based on the Si3N4 structure) experience during a high-speed machining process and the evolution of the sparks created during the same process. The sparks were analysed through pictures of the cutting process recorded with an SLR camera. Features relating to the intensity and area of the cutting sparks were extracted from the individual pictures using image processing techniques, and were then related to the ceramic insert's crater wear area.
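The intensity and area features described above can be sketched, under the assumption of simple global thresholding, on a grayscale frame represented as a 2-D list (a real pipeline would operate on camera images, e.g., via an image-processing library):

```python
def spark_features(image, threshold=200):
    """Spark area (count of bright pixels) and mean spark intensity for a
    grayscale frame given as a 2-D list of 0-255 pixel values."""
    bright = [p for row in image for p in row if p >= threshold]
    area = len(bright)
    mean_intensity = sum(bright) / area if area else 0.0
    return area, mean_intensity

# Tiny synthetic frame: dark background with one bright spark region.
frame = [
    [10, 12, 11, 10],
    [10, 240, 250, 12],
    [11, 235, 245, 10],
    [10, 12, 10, 11],
]
area, intensity = spark_features(frame)
print(area, intensity)  # 4 242.5
```

Tracking how these per-frame features evolve over successive cuts is what allows them to be correlated with the insert's crater wear area.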

Keywords: ceramic cutting tools, high speed machining, image processing, tool condition monitoring, tool wear

Procedia PDF Downloads 299
8411 The Impact of Innovations in Human Resource Practices, Innovation Capabilities and Competitive Advantage on Company Performance

Authors: Bita Kharazi

Abstract:

The purpose of this research was to investigate the impact of innovations in human resource practices, innovation capabilities, and competitive advantage on company performance. The research was applied in terms of purpose and, in terms of method, was descriptive correlational research. The statistical population consisted of all employees of the Zar Industrial and Research Group. Convenience sampling was used, and Cochran's formula was used to determine the statistical sample size. A standard questionnaire was used to collect the data, and SPSS software with simultaneous regression tests was used to analyze them. The findings show that innovation in human resource practices, innovation capability, and competitive advantage have a significant impact on the company's performance.
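The abstract does not state the parameters used with Cochran's formula; the sketch below uses the conventional defaults (95% confidence, maximum variability p = 0.5, 5% margin of error), with an optional finite-population correction:

```python
import math

def cochran_sample_size(z=1.96, p=0.5, e=0.05, population=None):
    """Cochran's formula n0 = z^2 * p * (1 - p) / e^2, with the optional
    finite-population correction n = n0 / (1 + (n0 - 1) / N)."""
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)
    if population is None:
        return math.ceil(n0)
    return math.ceil(n0 / (1 + (n0 - 1) / population))

print(cochran_sample_size())                 # 385 for an unknown population
print(cochran_sample_size(population=500))   # 218 for a finite N = 500
```

With the defaults, the formula gives the familiar sample size of 385; when the size of the statistical population is known, the finite-population correction reduces the required sample accordingly.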

Keywords: human resource management, innovation, competitive advantage, company performance

Procedia PDF Downloads 23