Search results for: computer aided design and computer aided manufacturing (CAD/CAM)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15481

14911 The Hypoglycemic Grab Bag (HOGG): Preparing Hypo-Screen-Bags to Streamline the Time-Consuming Process of Administering Glucose Systemic Correction

Authors: Mai Ali

Abstract:

Background: Preparing hypo-screen bags in advance streamlines the time-consuming process of administering glucose systemic correction, and Hypo-Screen Grab Bags are widely adopted in UK hospitals. Aim: The aim of the study is to improve hypoglycemia screening efficiency and equipment accessibility by giving grab-bag restocking staff streamlined access to items. Methodology: The study centered on neonatal wards at LGI & St. James Neonatal Unit and related units. A web-based survey was conducted to evaluate local practices, gathering 21 responses from relevant general staff. The survey outcomes were: (1) an evident demand for accessible grab bags to smooth the process, and (2) potential to enhance efficiency through improved preparation of hypo-screen grab bags. Intervention: A Hypo-Screen Grab Bag was designed, including checklists for stocked items and required samples. Medical staff oversee restocking after use. Conclusion: The study successfully improved hypoglycemia screening efficiency and aided junior staff with accessible supplies and a user-friendly checklist.

Keywords: neonatal hypoglycemia, grab bag, hypo-screening, junior staff

Procedia PDF Downloads 44
14910 Depressive Symptoms of U.S. Collegiate Athletes: Risk Factors and Implementations for Mental Health Well-Being for Athletes

Authors: David R. LaVetter, Justin B. Homatas, Claudia Benavides Espinoza

Abstract:

An increased awareness of depression rates among collegiate athletes has prompted educational institutions to evaluate their mental health resources for athletes. This paper adds to our knowledge of this growing problem among collegiate athletes. National athletic associations and educational institutions are more knowledgeable about the mental health crisis facing hundreds of thousands of athletes each year, and some have implemented resources to improve mental health. However, college athletes continue to experience depressive symptoms at increasing rates. In this paper, depression rates for collegiate athletes were found to be significantly greater than those of the general adult population. This paper used the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) method to examine the literature’s findings on depression rates among collegiate athletes. In particular, this study answers questions related to risk factors for college athletes’ depressive symptoms. Risk factors unique to this population are also discussed. Prevalence rates by participant gender and sport are provided. Implementation measures in current practice at educational institutions in the U.S. are discussed to help alleviate depression rates among college athletes.

Keywords: college athletes, depression, risk factors, mental health

Procedia PDF Downloads 53
14909 Gender Differences in Adolescent Avatars: Gender Consistency and Masculinity-Femininity of Nicknames and Characters

Authors: Monika Paleczna, Małgorzata Holda

Abstract:

Choosing an avatar's gender in a computer game is one of the key elements in the process of creating an online identity. The selection of a male or female avatar can define the entirety of subsequent decisions regarding both appearance and behavior. However, when the most popular games available for the Nintendo console in 1998 were analyzed, it turned out that 41% of computer games did not have female characters. Nowadays, players create their avatars based mainly on binary gender classification, with male and female characters to choose from. The main aim of the poster is to explore gender differences in adolescent avatars. 130 adolescents aged 15-17 participated in the study. They created their avatars and then played a computer game. The creation of the avatar was based on the choice of gender, then physical and mental characteristics. Data on gender consistency (consistency between participant’s sex and gender selected for the avatar) and masculinity-femininity of avatar nicknames and appearance will be presented. The masculinity-femininity of avatar nicknames and appearance was assessed by expert raters on a very masculine to very feminine scale. Additionally, data on the relationships of the perceived levels of masculinity-femininity with hostility-friendliness and the intelligence of avatars will be shown. The dimensions of hostility-friendliness and intelligence were also assessed by expert raters on scales ranging from very hostile to very friendly and from very low intelligence to very high intelligence.

Keywords: gender, avatar, adolescence, computer games

Procedia PDF Downloads 201
14908 Highly Glazed Office Spaces: Simulated Visual Comfort vs Real User Experiences

Authors: Zahra Hamedani, Ebrahim Solgi, Henry Skates, Gillian Isoardi

Abstract:

Daylighting plays a pivotal role in promoting productivity and user satisfaction in office spaces. There is an ongoing trend in designing office buildings with a high proportion of glazing which relatively increases the risk of high visual discomfort. Providing a more realistic lighting analysis can be of high value at the early stages of building design when necessary changes can be made at a very low cost. This holistic approach can be achieved by incorporating subjective evaluation and user behaviour in computer simulation and provide a comprehensive lighting analysis. In this research, a detailed computer simulation model has been made using Radiance and Daysim. Afterwards, this model was validated by measurements and user feedback. The case study building is the school of science at Griffith University, Gold Coast, Queensland, which features highly glazed office spaces. In this paper, the visual comfort predicted by the model is compared with a preliminary survey of the building users to evaluate how user behaviour such as desk position, orientation selection, and user movement caused by daylight changes and other visual variations can inform perceptions of visual comfort. This work supports preliminary design analysis of visual comfort incorporating the effects of gaze shift patterns and views with the goal of designing effective layout for office spaces.

Keywords: lighting simulation, office buildings, user behaviour, validation, visual comfort

Procedia PDF Downloads 197
14907 An Evaluation of Drivers in Implementing Sustainable Manufacturing in India: Using DEMATEL Approach

Authors: D. Garg, S. Luthra, A. Haleem

Abstract:

Due to growing concern about environmental and social consequences throughout the world, a need has been felt to incorporate sustainability concepts in conventional manufacturing. This paper is an attempt to identify and evaluate drivers in implementing sustainable manufacturing in the Indian context. Nine possible drivers for successful implementation of sustainable manufacturing have been identified from an extensive review. Further, the Decision Making Trial and Evaluation Laboratory (DEMATEL) approach has been utilized to evaluate and categorize these identified drivers into cause and effect groups. Five drivers (Societal Pressure and Public Concerns; Regulations and Government Policies; Top Management Involvement, Commitment and Support; Effective Strategies and Activities towards Socially Responsible Manufacturing; and Market Trends) have been categorized into the cause group, and four drivers (Holistic View in Manufacturing Systems; Supplier Participation; Building Sustainable Culture in Organization; and Corporate Image and Benefits) have been categorized into the effect group. “Societal Pressure and Public Concerns” has been found to be the most critical driver and “Corporate Image and Benefits” the least critical, or most easily influenced, driver for implementing sustainable manufacturing in the Indian context. This paper may help practitioners better understand these drivers and their priorities towards effective implementation of sustainable manufacturing.
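
The cause/effect split described above follows the standard DEMATEL computation. As a rough illustration (the 4x4 direct-influence matrix below is invented for the example, not the paper's survey data), the total-relation matrix and the D−R values that assign each driver to the cause or effect group can be computed as follows:

```python
import numpy as np

# Illustrative direct-influence matrix A (hypothetical expert scores, not the study's data)
A = np.array([
    [0, 3, 2, 1],
    [2, 0, 3, 2],
    [1, 1, 0, 3],
    [1, 2, 1, 0],
], dtype=float)

# Common normalization by the largest row sum, then total-relation matrix T = N (I - N)^-1
N = A / A.sum(axis=1).max()
T = N @ np.linalg.inv(np.eye(len(A)) - N)

D = T.sum(axis=1)   # influence given by each driver
R = T.sum(axis=0)   # influence received by each driver
for i, (prominence, relation) in enumerate(zip(D + R, D - R)):
    group = "cause" if relation > 0 else "effect"
    print(f"driver {i}: D+R={prominence:.2f}, D-R={relation:.2f} -> {group} group")
```

Drivers with positive D−R fall into the cause group and negative D−R into the effect group, which is how the nine drivers above were partitioned into five causes and four effects.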

Keywords: drivers, decision making trial and evaluation laboratory (DEMATEL), India, sustainable manufacturing

Procedia PDF Downloads 375
14906 Transformational Justice for Employees' Job Satisfaction

Authors: Hassan Barau Singhry

Abstract:

Purpose: Leadership, or the absence of it, is an important behaviour affecting employees’ job satisfaction. Although there are many models of leadership, one that stands out in a period of change is transformational behaviour. The aim of this study is to investigate the role of organizational justice in the relationship between transformational leadership and employee job satisfaction. The study is based on the assumption that change begins with leaders, and leaders should be fair and just. Methodology: A cross-sectional survey through a structured questionnaire was employed to collect the data for this study. The population was drawn from the three tiers of government in Nigeria: local, state, and federal. The sampling method used in this research is stratified random sampling. 418 middle managers of public organizations responded to the questionnaire. Multiple regression aided by structural equation modeling was employed to test four hypothesized relationships. Findings: The regression results support the mediating role of organizational justice, namely distributive, procedural, interpersonal, and informational justice, in the link between transformational leadership and job satisfaction. Originality/value: This study adds to the literature of human resource management by empirically validating and integrating transformational leadership behaviour with the four dimensions of organizational justice theory. The study is expected to be beneficial to top and middle-level administrators as well as to theory building and testing.

Keywords: distributive justice, job satisfaction, organizational justice, procedural justice, transformational leadership

Procedia PDF Downloads 155
14905 A Design for Customer Preferences Model by Cluster Analysis of Geometric Features and Customer Preferences

Authors: Yuan-Jye Tseng, Ching-Yen Chen

Abstract:

In the design cycle, a main design task is to determine the external shape of the product. The external shape of a product is one of the key factors that can affect the customers’ preferences linked to the motivation to buy the product, especially in the case of a consumer electronic product such as a mobile phone. The relationship between the external shape and the customer preferences needs to be studied to enhance the customer’s purchase desire and action. In this research, a design for customer preferences model is developed for investigating the relationships between the external shape and the customer preferences of a product. In the first stage, the names of the geometric features are collected and evaluated from the data of the specified internet web pages using the developed text miner. The key geometric features can be determined if the number of occurrences on the web pages is relatively high. For each key geometric feature, the numerical values are explored using the text miner to collect the internet data from the web pages. In the second stage, a cluster analysis model is developed to evaluate the numerical values of the key geometric features to divide the external shapes into several groups. Several design suggestion cases can be proposed, for example, a large model, a mid-size model, and a mini model, for designing a mobile phone. A customer preference index is developed by evaluating the numerical data of each of the key geometric features of the design suggestion cases. The design suggestion case with the top ranking of the customer preference index can be selected as the final design of the product. In this paper, an example product of a notebook computer is illustrated. The results show that the external shape of a product can be used to drive customer preferences. The presented design for customer preferences model is useful for determining a suitable external shape of the product to increase customer preferences.
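
As a rough sketch of the second-stage cluster analysis (the geometric feature values below are hypothetical, not data mined from web pages as in the study), k-means can split candidate external shapes into groups such as large, mid-size, and mini:

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical key geometric features of candidate designs: [length_mm, width_mm, thickness_mm]
shapes = np.array([
    [160, 78, 8.9], [158, 76, 8.4], [146, 71, 7.9],
    [148, 70, 8.1], [131, 64, 7.6], [128, 63, 7.4],
])

# Three clusters, mirroring the design-suggestion cases (large, mid-size, mini)
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(shapes)
print("cluster labels:", km.labels_)
print("cluster centres:\n", km.cluster_centers_)
```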

Keywords: cluster analysis, customer preferences, design evaluation, design for customer preferences, product design

Procedia PDF Downloads 174
14904 Advantages of Computer Navigation in Knee Arthroplasty

Authors: Mohammad Ali Al Qatawneh, Bespalchuk Pavel Ivanovich

Abstract:

Computer navigation has been introduced in total knee arthroplasty to improve the accuracy of the procedure. Computer navigation improves the accuracy of bone resection in the coronal and sagittal planes. It was also noted that it normalizes the rotational alignment of the femoral component and fully assesses and balances the deformation of soft tissues in the coronal plane. The work is devoted to the advantages of using computer navigation technology in total knee arthroplasty in 62 patients (11 men and 51 women) suffering from gonarthrosis, aged 51 to 83 years, operated on using a computer navigation system and followed up to 3 years from the moment of surgery. During the examination, the deformity variant was determined, and radiometric parameters of the knee joints were measured using the Knee Society Score (KSS), Functional Knee Society Score (FKSS), and Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) scales. Also, functional stress tests were performed to assess the stability of the knee joint in the frontal plane and functional indicators of the range of motion. After surgery, improvement was observed on all scales: firstly, the WOMAC values decreased 5.90-fold, with the median value falling to 11 points (p < 0.001); secondly, the KSS increased 3.91-fold and reached 86 points (p < 0.001); and thirdly, the FKSS increased 2.08-fold and reached 94 points (p < 0.001). After TKA, axis deviation of the lower limbs of more than 3 degrees was observed in 4 patients (6.5%), frontal instability of the knee joint in only 2 cases (3.2%), and sagittal instability of the knee joint in 9.6% of cases. The range of motion increased 1.25-fold; the volume of movement averaged 125 degrees (p < 0.001). Computer navigation increases the accuracy of the spatial orientation of the endoprosthesis components in all planes, reduces the variability of the axis of the lower limbs to within ±3°, makes it possible to achieve the best results of surgical interventions, and can be used to solve most basic tasks, achieving excellent and good outcomes of operations in 100% of cases according to the WOMAC scale. With diaphyseal deformities of the femur and/or tibia, as well as with obstruction of their medullary canal, the use of computer navigation is the method of choice. The use of computer navigation prevents the occurrence of flexion contracture and hyperextension of the knee joint during the distal sawing of the femur. Using the navigation system achieves high-precision implantation of the endoprosthesis; in addition, it achieves an adequate balance of the ligaments, which contributes to the stability of the joint, reduces pain, and allows for the achievement of a good functional result of the treatment.

Keywords: knee joint, arthroplasty, computer navigation, advantages

Procedia PDF Downloads 75
14903 An Improved Data Aided Channel Estimation Technique Using Genetic Algorithm for Massive Multi-Input Multiple-Output

Authors: M. Kislu Noman, Syed Mohammed Shamsul Islam, Shahriar Hassan, Raihana Pervin

Abstract:

With the increasing number of wireless devices and high-bandwidth operations, wireless networking and communications are becoming overcrowded. To cope with this congestion, massive MIMO is designed to work with hundreds of low-cost serving antennas at a time while improving spectral efficiency. TDD has been used to enable beamforming, which is a major part of massive MIMO, by transmitting and receiving pilot sequences. All these benefits are only possible if the channel state information, i.e., the channel estimate, is obtained properly. The common methods used so far to estimate the channel matrix are LS, MMSE, and a linear version of MMSE also proposed in many research works. We have optimized these methods using a genetic algorithm to minimize the mean squared error and find the best channel matrix from the existing algorithms with less computational complexity. Our simulation results have shown that the use of the GA worked well on the existing algorithms in a Rayleigh slow-fading channel in the presence of additive white Gaussian noise. We found that the GA-optimized LS is better than the existing algorithms, as the GA provides an optimal result within a few iterations in terms of MSE with respect to SNR and computational complexity.
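
The GA refinement itself is not reproduced here; the following is only a minimal numerical sketch of the pilot-based LS and MMSE baselines that the abstract builds on, with arbitrary system dimensions, a synthetic Rayleigh channel, and the MSE metric used above:

```python
import numpy as np

rng = np.random.default_rng(0)
Nr, Nt, Np, snr_db = 8, 4, 16, 10          # receive/transmit antennas, pilot length, SNR (all illustrative)
sigma2 = 10 ** (-snr_db / 10)

H = (rng.standard_normal((Nr, Nt)) + 1j * rng.standard_normal((Nr, Nt))) / np.sqrt(2)  # Rayleigh channel
X = (rng.standard_normal((Nt, Np)) + 1j * rng.standard_normal((Nt, Np))) / np.sqrt(2)  # pilot matrix
N = np.sqrt(sigma2 / 2) * (rng.standard_normal((Nr, Np)) + 1j * rng.standard_normal((Nr, Np)))
Y = H @ X + N                               # received pilot observations

# Least-squares estimate: H_ls = Y X^H (X X^H)^{-1}
H_ls = Y @ X.conj().T @ np.linalg.inv(X @ X.conj().T)
# MMSE-style estimate, assuming i.i.d. unit-variance channel entries: regularize the pilot Gram matrix
H_mmse = Y @ X.conj().T @ np.linalg.inv(X @ X.conj().T + sigma2 * np.eye(Nt))

mse = lambda Hhat: np.mean(np.abs(Hhat - H) ** 2)
print(f"LS MSE   = {mse(H_ls):.4f}")
print(f"MMSE MSE = {mse(H_mmse):.4f}")
```

A GA along the lines of the paper would then search for the estimate (or estimator parameters) that minimizes this MSE across iterations.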

Keywords: channel estimation, LMMSE, LS, MIMO, MMSE

Procedia PDF Downloads 182
14902 A Construct to Perform in Situ Deformation Measurement of Material Extrusion-Fabricated Structures

Authors: Daniel Nelson, Valeria La Saponara

Abstract:

Material extrusion is an additive manufacturing modality that continues to show great promise in the ability to create low-cost, highly intricate, and exceedingly useful structural elements. As more capable and versatile filament materials are devised, and the resolution of manufacturing systems continues to increase, the need to understand and predict manufacturing-induced warping will gain ever greater importance. The following study presents an in situ remote sensing and data analysis construct that allows for the in situ mapping and quantification of surface displacements induced by residual stresses on a specified test structure. This proof-of-concept experimental process shows that it is possible to provide designers and manufacturers with insight into the manufacturing parameters that lead to the manifestation of these deformations and a greater understanding of the behavior of these warping events over the course of the manufacturing process.

Keywords: additive manufacturing, deformation, digital image correlation, fused filament fabrication, residual stress, warping

Procedia PDF Downloads 70
14901 Phenomena-Based Approach for Automated Generation of Process Options and Process Models

Authors: Parminder Kaur Heer, Alexei Lapkin

Abstract:

Due to global challenges of increased competition and demand for more sustainable products/processes, there is a rising pressure on the industry to develop innovative processes. Through Process Intensification (PI), existing and new processes may be able to attain higher efficiency. However, very few PI options are generally considered. This is because processes are typically analysed at a unit operation level, thus limiting the search space for potential process options. PI performed at more detailed levels of a process can increase the size of the search space. The different levels at which PI can be achieved are the unit operation, functional, and phenomena levels. Physical/chemical phenomena form the lowest level of aggregation and thus are expected to give the highest impact, because all the intensification options can be described by their enhancement. The objective of the current work is thus the generation of numerous process alternatives based on phenomena and the development of their corresponding computer aided models. The methodology comprises: a) automated generation of process options, and b) automated generation of process models. The process under investigation is decomposed into functions, viz. reaction, separation, etc., and these functions are further broken down into the phenomena required to perform them. E.g., separation may be performed via vapour-liquid or liquid-liquid equilibrium. A list of phenomena for the process is formed, and new phenomena, which can overcome the difficulties/drawbacks of the current process or can enhance the effectiveness of the process, are added to the list. For instance, a catalyst separation issue can be handled by using solid catalysts; the corresponding phenomena are identified and added. The phenomena are then combined to generate all possible combinations. However, not all combinations make sense and, hence, screening is carried out to discard the combinations that are meaningless. For example, phase change phenomena need the co-presence of the energy transfer phenomena. Feasible combinations of phenomena are then assigned to the functions they execute. A combination may accomplish a single or multiple functions, i.e. it might perform reaction or reaction with separation. The combinations are then allotted to the functions needed for the process. This creates a series of options for carrying out each function. Combination of these options for different functions in the process leads to the generation of a superstructure of process options. These process options, which are formed by a list of phenomena for each function, are passed to the model generation algorithm in the form of binaries (1, 0). The algorithm gathers the active phenomena and couples them to generate the model. A series of models is generated for the functions, which are combined to get the process model. The most promising process options are then chosen subject to a performance criterion, for example product purity, or via a multi-objective Pareto optimisation. The methodology was applied to a two-step process, and the best route was determined based on the higher product yield. The current methodology can identify, produce and evaluate process intensification options from which the optimal process can be determined. It can be applied to any chemical/biochemical process because of its generic nature.
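
A minimal sketch of the combination-and-screening step described above, using a toy phenomena list and a single screening rule taken from the text (the actual rule base and superstructure generation in the work are far richer):

```python
from itertools import combinations

# Toy phenomena list; each process option is encoded as a binary vector (1, 0) over this list
phenomena = ["mixing", "reaction", "phase_change", "energy_transfer", "vapour_liquid_eq"]

def is_feasible(subset):
    # Screening rule from the text: phase change needs the co-presence of energy transfer
    return not ("phase_change" in subset and "energy_transfer" not in subset)

options = []
for r in range(1, len(phenomena) + 1):
    for subset in combinations(phenomena, r):
        if is_feasible(subset):
            options.append([1 if p in subset else 0 for p in phenomena])  # binary encoding

print(f"{len(options)} feasible phenomena combinations out of {2**len(phenomena) - 1}")
print("example option (binaries):", options[0])
```

Each surviving binary vector corresponds to one candidate combination that is then assigned to the function(s) it can execute and passed to the model generation algorithm.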

Keywords: phenomena, process intensification, process models, process options

Procedia PDF Downloads 223
14900 Bridging the Digital Divide in India: Issues and Challenges

Authors: Parveen Kumar

Abstract:

To cope with the rapid change of technology and to control the ephemeral rate of information generation, librarians, along with their professional colleagues, need to equip themselves as per the requirements of the electronic information society. E-learning is purely based on computer and communication technologies and is known by terminologies such as computer-based learning. It is the delivery of content via all electronic media, including the Internet, intranets, extranets, television broadcast, CD-ROM documents, etc. E-learning poses a lot of issues in the transformation of literature or knowledge from the conventional medium to ICT-based formats and web-based services.

Keywords: e-learning, digital libraries, online learning, electronic information society

Procedia PDF Downloads 497
14899 Meta Model for Optimum Design Objective Function of Steel Frames Subjected to Seismic Loads

Authors: Salah R. Al Zaidee, Ali S. Mahdi

Abstract:

Except for simple problems of statically determinate structures, optimum design problems in structural engineering have implicit objective functions where structural analysis and design are essential within each searching loop. With these implicit functions, the structural engineer is usually forced to write his/her own computer code for analysis, design, and searching for the optimum design among many feasible candidates, and cannot take advantage of available software for structural analysis, design, and searching for the optimum solution. The meta-model is a regression model used to transform an implicit objective function into an explicit one and, in turn, to decouple the structural analysis and design processes from the optimum searching process. With the meta-model, well-known software for structural analysis and design can be used in sequence with optimum searching software. In this paper, the meta-model has been used to develop an explicit objective function for plane steel frames subjected to dead, live, and seismic forces. Frame topology is assumed to be predefined based on architectural and functional requirements. Column and beam sections and different connection details are the main design variables in this study. Columns and beams are grouped to reduce the number of design variables and to make the problem similar to that adopted in engineering practice. Data for the implicit objective function have been generated based on analysis and assessment of many design proposals with CSI SAP software. These data have been used later in SPSS software to develop a pure quadratic nonlinear regression model for the explicit objective function. Good correlations, with a coefficient R² in the range from 0.88 to 0.99, have been noted between the original implicit functions and the corresponding explicit functions generated with the meta-model.
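
As a rough sketch of the meta-model fit (with synthetic data standing in for the SAP-generated design proposals; the actual study used SPSS), a pure quadratic regression, i.e., intercept, linear terms, and squared terms without cross products, can be fitted by least squares and checked with R²:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic design variables (e.g., grouped section indices) and an implicit cost-like response
X = rng.uniform(1, 10, size=(60, 3))
y = 5 + 2 * X[:, 0] + 0.8 * X[:, 1] ** 2 + 1.5 * X[:, 2] + rng.normal(0, 1.0, 60)

# Pure quadratic design matrix: [1, x1..x3, x1^2..x3^2] (no interaction terms)
Q = np.column_stack([np.ones(len(X)), X, X ** 2])
beta, *_ = np.linalg.lstsq(Q, y, rcond=None)

y_hat = Q @ beta
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print("coefficients:", np.round(beta, 3))
print(f"R^2 = {r2:.3f}")
```

The fitted explicit function can then be handed to any optimum-searching routine without re-running the structural analysis inside the search loop.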

Keywords: meta-model, objective function, steel frames, seismic analysis, design

Procedia PDF Downloads 229
14898 The Influence of the Intellectual Capital on the Firms’ Market Value: A Study of Listed Firms in the Tehran Stock Exchange (TSE)

Authors: Bita Mashayekhi, Seyed Meisam Tabatabaie Nasab

Abstract:

Intellectual capital is one of the most valuable and important parts of the intangible assets of enterprises, especially in knowledge-based enterprises. With respect to the increasing gap between the market value and the book value of companies, intellectual capital is one of the components that can be placed in this gap. This paper uses the value added efficiency of three components, capital employed, human capital and structural capital, to measure the intellectual capital efficiency of Iranian industry groups listed on the Tehran Stock Exchange (TSE), using an 8-year data set from 2005 to 2012. In order to analyze the effect of intellectual capital on the market-to-book value ratio of the companies, the data set was divided into 10 industries, Banking, Pharmaceutical, Metals & Mineral Nonmetallic, Food, Computer, Building, Investments, Chemical, Cement and Automotive, and the panel data method was applied to estimate a pooled OLS model. The results exhibited that the value added of capital employed has a positive significant relation with increasing market value in the Banking, Metals & Mineral Nonmetallic, Food, Computer, Chemical and Cement industries, and also showed that the value added efficiency of structural capital has a positive significant relation with increasing market value in the Banking, Pharmaceutical and Computer industry groups. The results for the value added of human capital showed a negative relation with the Banking and Pharmaceutical industry groups and a positive relation with the Computer and Automotive industry groups. Among the studied industries, the computer industry showed the widest gap between the market value and book value in its intellectual capital.

Keywords: capital employed, human capital, intellectual capital, market-to-book value, structural capital, value added efficiency

Procedia PDF Downloads 367
14897 Bank Failures: A Question of Leadership

Authors: Alison L. Miles

Abstract:

Almost all major financial institutions in the world suffered losses due to the financial crisis of 2007, but the extent varied widely. The causes of the crash of 2007 are well documented and predominantly focus on the role and complexity of the financial markets. The dominant theme of the literature suggests the causes of the crash were a combination of globalization, financial sector innovation, moribund regulation and short termism. While these arguments are undoubtedly true, they do not tell the whole story. A key weakness in the current analysis is the lack of consideration of those leading the banks before and during times of crisis. The purpose of this study is to examine the possible link between the leadership styles and characteristics of the CEO, CFO and chairman and the financial institutions that failed or needed recapitalization. As such, it contributes to the literature and debate on international financial crises and systemic risk, and also to the debate on risk management and regulatory reform in the banking sector. In order to first test the proposition (P1) that there are prevalent leadership characteristics or traits in financial institutions, an initial study was conducted using a sample of the 65 largest global banks and financial institutions according to the Banker Top 1000 Banks 2014. Secondary data from publicly available and official documents, annual reports, treasury and parliamentary reports, together with a selection of press articles and analyst meeting transcripts, were collected longitudinally for the period 1998 to 2013. A computer-aided keyword search was used in order to identify the leadership styles and characteristics of the chairman, CEO and CFO. The results were then compared with the leadership models to form a picture of leadership in the sector during the research period. As this resulted in separate results that needed combining, the SPSS data editor was used to aggregate the results across the studies using the variables ‘leadership style’ and ‘company financial performance’ together with the size of the company. In order to test the proposition (P2) that there was a prevalent leadership style in the banks that failed and the proposition (P3) that this was different from that in the banks that did not, further quantitative analysis was carried out on the leadership styles of the chair, CEO and CFO of banks that needed recapitalization, were taken over, or required government bail-out assistance during 2007-8. These included: Lehman Bros, Merrill Lynch, Royal Bank of Scotland, HBOS, Barclays, Northern Rock, Fortis and Allied Irish. The findings show that although regulatory reform has been a key mechanism of control of behavior in the banking sector, consideration of the leadership characteristics of those running the board is a key factor. They add weight to the argument that if each crisis is met with the same pattern of popular fury with the financier, increased regulation, and a return to business as usual, the cycle of failure will always be repeated; they also show that, through a different lens, new paradigms can be formed and future clashes avoided.

Keywords: banking, financial crisis, leadership, risk

Procedia PDF Downloads 309
14896 Control of Belts for Classification of Geometric Figures by Artificial Vision

Authors: Juan Sebastian Huertas Piedrahita, Jaime Arturo Lopez Duque, Eduardo Luis Perez Londoño, Julián S. Rodríguez

Abstract:

The process of generating computer vision is called artificial vision. The artificial vision is a branch of artificial intelligence that allows the obtaining, processing, and analysis of any type of information especially the ones obtained through digital images. Actually the artificial vision is used in manufacturing areas for quality control and production, as these processes can be realized through counting algorithms, positioning, and recognition of objects that can be measured by a single camera (or more). On the other hand, the companies use assembly lines formed by conveyor systems with actuators on them for moving pieces from one location to another in their production. These devices must be previously programmed for their good performance and must have a programmed logic routine. Nowadays the production is the main target of every industry, quality, and the fast elaboration of the different stages and processes in the chain of production of any product or service being offered. The principal base of this project is to program a computer that recognizes geometric figures (circle, square, and triangle) through a camera, each one with a different color and link it with a group of conveyor systems to organize the mentioned figures in cubicles, which differ from one another also by having different colors. This project bases on artificial vision, therefore the methodology needed to develop this project must be strict, this one is detailed below: 1. Methodology: 1.1 The software used in this project is QT Creator which is linked with Open CV libraries. Together, these tools perform to realize the respective program to identify colors and forms directly from the camera to the computer. 1.2 Imagery acquisition: To start using the libraries of Open CV is necessary to acquire images, which can be captured by a computer’s web camera or a different specialized camera. 1.3 The recognition of RGB colors is realized by code, crossing the matrices of the captured images and comparing pixels, identifying the primary colors which are red, green, and blue. 1.4 To detect forms it is necessary to realize the segmentation of the images, so the first step is converting the image from RGB to grayscale, to work with the dark tones of the image, then the image is binarized which means having the figure of the image in a white tone with a black background. Finally, we find the contours of the figure in the image to detect the quantity of edges to identify which figure it is. 1.5 After the color and figure have been identified, the program links with the conveyor systems, which through the actuators will classify the figures in their respective cubicles. Conclusions: The Open CV library is a useful tool for projects in which an interface between a computer and the environment is required since the camera obtains external characteristics and realizes any process. With the program for this project any type of assembly line can be optimized because images from the environment can be obtained and the process would be more accurate.
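
The project itself is written in Qt/C++ with OpenCV; the following is an equivalent Python sketch of steps 1.3–1.4 only (grayscale, binarization, contour/edge counting, and a crude dominant-colour check), with the image path as a placeholder and OpenCV 4.x assumed:

```python
import cv2
import numpy as np

image = cv2.imread("frame.png")                      # placeholder path for one captured frame

# Shape detection: grayscale -> binarize -> contours -> count polygon edges
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
_, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

for cnt in contours:
    if cv2.contourArea(cnt) < 500:                   # ignore small noise blobs
        continue
    approx = cv2.approxPolyDP(cnt, 0.04 * cv2.arcLength(cnt, True), True)
    shape = {3: "triangle", 4: "square"}.get(len(approx), "circle")

    # Colour detection: mean BGR inside the contour, compared against the primaries
    mask = np.zeros(gray.shape, dtype="uint8")
    cv2.drawContours(mask, [cnt], -1, 255, -1)
    b, g, r, _ = cv2.mean(image, mask=mask)
    colour = max({"blue": b, "green": g, "red": r}, key={"blue": b, "green": g, "red": r}.get)
    print(shape, colour)                             # this label would drive the conveyor/actuator logic
```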

Keywords: artificial intelligence, artificial vision, binarized, grayscale, images, RGB

Procedia PDF Downloads 367
14895 Multi-Label Approach to Facilitate Test Automation Based on Historical Data

Authors: Warda Khan, Remo Lachmann, Adarsh S. Garakahally

Abstract:

The increasing complexity of software and its applicability in a wide range of industries, e.g., automotive, call for enhanced quality assurance techniques. Test automation is one option to tackle the prevailing challenges by supporting test engineers with fast, parallel, and repetitive test executions. A high degree of test automation allows for a shift from mundane (manual) testing tasks to a more analytical assessment of the software under test. However, a high initial investment of test resources is required to establish test automation, which is, in most cases, at odds with the time constraints provided for quality assurance of complex software systems. Hence, a computer-aided creation of automated test cases is crucial to increase the benefit of test automation. This paper proposes the application of machine learning for the generation of automated test cases. It is based on supervised learning to analyze test specifications and existing test implementations. The analysis facilitates the identification of patterns between test steps and their implementation with test automation components. For the test case generation, this approach exploits historical data of test automation projects. The identified patterns are the foundation to predict the implementation of unknown test case specifications. Based on this support, a test engineer solely has to review and parameterize the test automation components instead of writing them manually, resulting in a significant time reduction for establishing test automation. Compared to other generation approaches, this ML-based solution can handle different writing styles, authors, application domains, and even languages. Furthermore, test automation tools require expert knowledge by means of programming skills, whereas this approach only requires historical data to generate test cases. The proposed solution is evaluated using various multi-label evaluation criteria (EC) and two small-sized real-world systems. The most prominent EC is ‘Subset Accuracy’. The promising results show an accuracy of at least 86% for test cases where a 1:1 relationship (multi-class) between test step specification and test automation component exists. For complex multi-label problems, i.e., one test step can be implemented by several components, the prediction accuracy is still at 60%. This is better than the current state-of-the-art results. The prediction quality is expected to increase for larger systems with corresponding historical data. Consequently, this technique facilitates the time reduction for establishing test automation and is thereby independent of the application domain and project. As a work in progress, the next steps are to investigate incremental and active learning as additions to increase the usability of this approach, e.g., in case labelled historical data is scarce.
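
A rough sketch of the multi-label setup described above, with tiny invented test-step texts and component labels (the real project uses project-specific historical data) and subset accuracy as the evaluation criterion:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MultiLabelBinarizer

# Invented historical data: test step specification -> automation component(s)
steps = [
    "switch on the ignition", "send CAN message to door ECU",
    "check warning lamp is lit", "switch off the ignition and log the result",
]
labels = [["PowerControl"], ["CanBus"], ["LampCheck"], ["PowerControl", "Logger"]]

mlb = MultiLabelBinarizer()
Y = mlb.fit_transform(labels)

# One binary classifier per component over TF-IDF features of the step text
clf = make_pipeline(TfidfVectorizer(), OneVsRestClassifier(LogisticRegression(max_iter=1000)))
clf.fit(steps, Y)

pred = clf.predict(["switch on the ignition and log the result"])
print("predicted components:", mlb.inverse_transform(pred))
# Subset accuracy = fraction of cases whose full label set is predicted exactly
print("subset accuracy on training data:", accuracy_score(Y, clf.predict(steps)))
```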

Keywords: machine learning, multi-class, multi-label, supervised learning, test automation

Procedia PDF Downloads 116
14894 Exchange Rate Fluctuations and Economic Performance of Manufacturing Sector: Evidence from Nigeria

Authors: Ifeoma Patricia Osamor, Ayotunde Qudus Saka, Godwin Omoregbee, Hikmat Oreoluwalomo Omolaja

Abstract:

A persistent fall in the value of Nigeria's currency compared to other foreign currencies, constant fluctuations in the exchange rate, and an increase in the price of goods and services necessitated the examination of the effects of exchange rate fluctuations on the economic performance of the manufacturing sector in Nigeria. An ex-post facto research design was adopted. Manufacturing gross domestic product (MGDP) was used as a proxy for performance; the Naira/Dollar exchange rate (NDE), the Naira/Pound exchange rate (NPE), and foreign exchange supply (FES) were used for exchange rate fluctuations; and the inflation rate (INF) was a control variable. Data were collected from the CBN Statistical Bulletin (2020) and the World Development Indicators of the World Bank, and the data collected were analysed using descriptive analysis, unit root tests, the bounds cointegration test, and ARDL. Findings showed that changes in the Naira/Dollar exchange rate (NDE) and the Naira/Pound Sterling exchange rate have a negative but significant impact on the economic performance of the manufacturing sector, while foreign exchange supply has an insignificant positive effect on the economic performance of manufacturing. The study concludes that exchange rate fluctuations negatively impact the performance of the manufacturing sector in Nigeria and, therefore, recommends that government should encourage export diversification through agriculture, agro-investment, and agro-allied industries that would boost exports in order to improve the value of the Naira, thereby stabilizing the exchange rate.

Keywords: exchange rate, economic performance, gross domestic product, inflation rate, foreign exchange supply

Procedia PDF Downloads 180
14893 Optimization of High Flux Density Design for Permanent Magnet Motor

Authors: Dong-Woo Kang

Abstract:

This paper presents an optimal magnet shape for a spoke-shaped interior permanent magnet synchronous motor using ferrite magnets. Generally, a permanent magnet motor using ferrite magnets has lower output power and efficiency than a rare-earth magnet motor, because the ferrite magnet has lower magnetic energy than the rare-earth magnet. Nevertheless, the ferrite magnet motor is used in many industrial products owing to its cost effectiveness. In this paper, the authors propose a high power density design of the ferrite permanent magnet synchronous motor. Furthermore, because the motor design has to take the manufacturing process into account, the design is simulated by using the finite element method to analyze the demagnetization, the magnetization, and the structural stiffness. In particular, the magnet shape and dimensions are decided so as to satisfy these properties. Finally, the authors design an optimal motor for the target application. The final design is manufactured and evaluated through experiments.

Keywords: demagnetization, design optimization, magnetic analysis, permanent magnet motors

Procedia PDF Downloads 365
14892 Systematic Mapping Study of Digitization and Analysis of Manufacturing Data

Authors: R. Clancy, M. Ahern, D. O’Sullivan, K. Bruton

Abstract:

The manufacturing industry is currently undergoing a digital transformation as part of the mega-trend Industry 4.0. As part of this phase of the industrial revolution, traditional manufacturing processes are being combined with digital technologies to achieve smarter and more efficient production. To successfully digitally transform a manufacturing facility, the processes must first be digitized. This is the conversion of information from an analogue format to a digital format. The objective of this study was to explore the research area of digitizing manufacturing data as part of the worldwide paradigm, Industry 4.0. The formal methodology of a systematic mapping study was utilized to capture a representative sample of the research area and assess its current state. Specific research questions were defined to assess the key benefits and limitations associated with the digitization of manufacturing data. Research papers were classified according to the type of research and type of contribution to the research area. Upon analyzing 54 papers identified in this area, it was noted that 23 of the papers originated in Germany. This is an unsurprising finding, as Industry 4.0 is originally a German strategy, with strong supporting policy instruments utilized in Germany for its implementation. It was also found that the Fraunhofer Institute for Mechatronic Systems Design, in collaboration with the University of Paderborn in Germany, was the most frequent contributing institution of the research papers, with three papers published. The literature suggested future research directions and highlighted one specific gap in the area. There exists an unresolved gap between the data science experts and the manufacturing process experts in the industry. The data analytics expertise is not useful unless the manufacturing process information is utilized. A legitimate understanding of the data is crucial to perform accurate analytics and gain true, valuable insights into the manufacturing process. There lies a gap between the manufacturing operations and the information technology/data analytics departments within enterprises, which was borne out by the results of many of the case studies reviewed as part of this work. To test the concept of this gap existing, the researcher initiated an industrial case study in which they embedded themselves between the subject matter expert of the manufacturing process and the data scientist. Of the papers resulting from the systematic mapping study, 12 contributed a framework, another 12 were based on a case study, and 11 focused on theory. However, there were only three papers that contributed a methodology. This provides further evidence for the need for an industry-focused methodology for digitizing and analyzing manufacturing data, which will be developed in future research.

Keywords: analytics, digitization, industry 4.0, manufacturing

Procedia PDF Downloads 99
14891 Limestone Briquette Production and Characterization

Authors: André C. Silva, Mariana R. Barros, Elenice M. S. Silva, Douglas. Y. Marinho, Diego F. Lopes, Débora N. Sousa, Raphael S. Tomáz

Abstract:

Modern agriculture requires productivity, efficiency and quality. Therefore, there is a need for agricultural limestone application that provides adequate amounts of calcium and magnesium carbonates in order to correct soil acidity. During limestone processing, fine particles (with average size under 400#) are generated. These particles do not have economic value in the agricultural and metallurgical sectors due to their size. When limestone is used for agricultural purposes, these fine particles can be easily transported by wind, generating air pollution. Therefore, briquetting, a mineral processing technique, was used to mitigate this problem, resulting in an agglomerated product suitable for agricultural use. Briquetting uses compressive pressure to agglomerate fine particles. It can be aided by agglutination agents, allowing adjustments in the shape, size and mechanical parameters of the mass. Briquettes can generate extra profits for the mineral industry, presenting as a distinct product for agriculture, and can reduce the environmental liabilities of fine particle storage or disposal. The produced limestone briquettes were subjected to shatter and water action resistance tests. The results show that after six minutes completely submerged in water, the briquettes were fully disintegrated, a highly favorable result considering their use for soil acidity correction.

Keywords: agglomeration, briquetting, limestone, soil acidity correction

Procedia PDF Downloads 378
14890 A Cost Effective Approach to Develop Mid-Size Enterprise Software Adopted the Waterfall Model

Authors: Mohammad Nehal Hasnine, Md Kamrul Hasan Chayon, Md Mobasswer Rahman

Abstract:

Organizational tendencies towards computer-based information processing have been observed noticeably in third-world countries. Many enterprises are taking major initiatives towards a computerized working environment because of the massive benefits of computer-based information processing. However, designing and developing information resource management software for small and mid-size enterprises under budget constraints and strict deadlines is always challenging for software engineers. Therefore, we introduced an approach to design mid-size enterprise software by using the Waterfall model, one of the SDLC (Software Development Life Cycle) models, in a cost-effective way. To fulfill the research objectives, in this study, we developed mid-sized enterprise software named “BSK Management System” that assists enterprise software clients with information resource management and performs complex organizational tasks. The Waterfall model phases have been applied to ensure that all functions, user requirements, strategic goals, and objectives are met. In addition, a Rich Picture, Structured English, and a Data Dictionary have been implemented and investigated properly in an engineering manner. Furthermore, an assessment survey with 20 participants was conducted to investigate the usability and performance of the proposed software. The survey results indicated that our system featured simple interfaces, easy operation and maintenance, quick processing, and reliable and accurate transactions.

Keywords: end-user application development, enterprise software design, information resource management, usability

Procedia PDF Downloads 424
14889 Optimization of Surface Roughness in Additive Manufacturing Processes via Taguchi Methodology

Authors: Anjian Chen, Joseph C. Chen

Abstract:

This paper studies a case where the targeted surface roughness of the fused deposition modeling (FDM) additive manufacturing process is improved. The process is designed to reduce or eliminate defects and improve the process capability indices Cp and Cpk for an FDM additive manufacturing process. The baseline Cp is 0.274 and the baseline Cpk is 0.654. This research utilizes the Taguchi methodology to eliminate defects and improve the process. The Taguchi method is used to optimize the additive manufacturing process and the printing parameters that affect the targeted surface roughness of FDM additive manufacturing. The Taguchi L9 orthogonal array is used to organize the effects of the parameters (four controllable parameters and one non-controllable parameter) on the FDM additive manufacturing process. The four controllable parameters are nozzle temperature [°C], layer thickness [mm], nozzle speed [mm/s], and extruder speed [%]. The non-controllable parameter is the environmental temperature [°C]. After the optimization of the parameters, a confirmation print was produced to prove that the results can reduce the amount of defects and improve the process capability index Cp from 0.274 to 1.605 and the Cpk from 0.654 to 1.233 for the FDM additive manufacturing process. The final results confirmed that the Taguchi methodology is sufficient to improve the surface roughness of the FDM additive manufacturing process.
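
As a quick sketch of how the capability indices quoted above are computed (the formulas are standard; the specification limits and roughness measurements below are invented, not the study's data):

```python
import statistics

# Hypothetical surface roughness Ra measurements (um) and specification limits
ra = [6.1, 5.8, 6.4, 6.0, 5.9, 6.2, 6.3, 5.7, 6.0, 6.1]
usl, lsl = 8.0, 4.0

mu = statistics.mean(ra)
sigma = statistics.stdev(ra)

cp = (usl - lsl) / (6 * sigma)                  # potential capability (spread only)
cpk = min(usl - mu, mu - lsl) / (3 * sigma)     # capability accounting for centring
print(f"Cp = {cp:.3f}, Cpk = {cpk:.3f}")
```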

Keywords: additive manufacturing, fused deposition modeling, surface roughness, six-sigma, Taguchi method, 3D printing

Procedia PDF Downloads 373
14888 An Analysis of the Effectiveness of Computer-Assisted Instruction on Student Achievement in Differing Science Content Areas

Authors: Edwin Christmann, John Hicks

Abstract:

This meta-analysis compared the mathematics achievement of students who received either traditional instruction or traditional instruction supplemented with computer-assisted instruction (CAI). From the 27 conclusions, an overall mean effect size of 0.236 was calculated, indicating that, on average, students receiving traditional instruction supplemented with CAI attained higher mathematics achievement than did 59.48 percent of those receiving traditional instruction per se.
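
The percentile figure quoted above follows the usual normal-curve interpretation of a mean effect size; as a one-line check (assuming the effect size is rounded to about 0.24 before conversion, since the conversion step is not spelled out in the abstract):

```python
from math import erf, sqrt

def normal_cdf(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Percentile of the average CAI-supplemented student relative to the comparison group
print(f"{normal_cdf(0.236):.4f}")   # ~0.5933
print(f"{normal_cdf(0.24):.4f}")    # ~0.5948, matching the 59.48 percent reported
```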

Keywords: CAI, science, meta-analysis, traditional

Procedia PDF Downloads 160
14887 Binary Programming for Manufacturing Material and Manufacturing Process Selection Using Genetic Algorithms

Authors: Saleem Z. Ramadan

Abstract:

The material selection problem is concerned with the determination of the right material for a certain product to optimize certain performance indices in that product, such as mass, energy density, and power-to-weight ratio. This paper is concerned with optimizing the selection of the manufacturing process along with the material used in the product under performance indices and availability constraints. In this paper, the material selection problem is formulated using binary programming and solved by a genetic algorithm. The objective function of the model is to minimize the total manufacturing cost under performance indices and material and manufacturing process availability constraints.
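
A minimal sketch of the binary-programming-plus-GA idea, with toy cost and performance data and a simple penalty for violating a performance constraint (this is not the paper's exact formulation or encoding):

```python
import random

random.seed(0)

# Toy candidate (material, process) pairs: cost and a performance index for each
cost = [4.0, 6.5, 5.0, 7.2, 3.8, 6.0]
perf = [55, 80, 62, 90, 40, 75]
MIN_PERF, POP, GENS = 70, 30, 60

def fitness(bits):
    # Exactly one pair must be selected; otherwise heavily penalize the chromosome
    if sum(bits) != 1:
        return 1e6
    i = bits.index(1)
    penalty = 100.0 * max(0, MIN_PERF - perf[i])   # performance constraint as a penalty term
    return cost[i] + penalty                       # minimize total manufacturing cost

def mutate(bits, rate=0.1):
    return [b ^ 1 if random.random() < rate else b for b in bits]

pop = [[random.randint(0, 1) for _ in cost] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness)
    parents = pop[:10]                             # truncation selection
    children = []
    while len(children) < POP - len(parents):
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, len(cost))       # one-point crossover
        children.append(mutate(a[:cut] + b[cut:]))
    pop = parents + children

best = min(pop, key=fitness)
print("best selection:", best, "objective value:", fitness(best))
```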

Keywords: optimization, material selection, process selection, genetic algorithm

Procedia PDF Downloads 405
14886 Public Economic Efficiency and Case-Based Reasoning: A Theoretical Framework to Police Performance

Authors: Javier Parra-Domínguez, Juan Manuel Corchado

Abstract:

At present, public efficiency is a concept that intends to maximize the return on public investment, focusing on minimizing the use of resources and maximizing the outputs. The concept takes into account statistical criteria drawn up according to techniques such as DEA (Data Envelopment Analysis). The purpose of the current work is to consider, more precisely, the theoretical application of CBR (Case-Based Reasoning), from economics and computer science, as a preliminary step to improving the efficiency of law enforcement agencies (public sector). With the aim of increasing the efficiency of the public sector, we have entered a phase whose main objective is the implementation of new technologies. Our main conclusion is that the application of computer techniques, such as CBR, has become key to the efficiency of the public sector, which continues to require economic valuation based on methodologies such as DEA. As a theoretical result and conclusion, the incorporation of CBR systems will reduce the number of inputs and increase, theoretically, the number of outputs generated based on previous computer knowledge.
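
A minimal illustration of the retrieve-and-reuse step at the heart of CBR (the case base, attributes, and distance function below are invented for the example; the paper remains at the theoretical level):

```python
import math

# Hypothetical case base of past deployments: inputs -> solution (officers deployed)
case_base = [
    {"population": 50_000, "incidents": 320, "officers": 60},
    {"population": 120_000, "incidents": 910, "officers": 140},
    {"population": 80_000, "incidents": 540, "officers": 95},
]

def distance(case, query):
    # Simple scaled Euclidean distance over the input attributes
    return math.hypot((case["population"] - query["population"]) / 100_000,
                      (case["incidents"] - query["incidents"]) / 1_000)

query = {"population": 75_000, "incidents": 500}
nearest = min(case_base, key=lambda c: distance(c, query))
print("retrieved case:", nearest)
print("reused solution (officers to deploy):", nearest["officers"])
```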

Keywords: case-based reasoning, knowledge, police, public efficiency

Procedia PDF Downloads 118
14885 Design for Filter and Transitions to Substrate Integrated Waveguide at Ka Band

Authors: Damou Mehdi, Nouri Keltouma, Fahem Mohammed

Abstract:

In this paper, the concept of substrate integrated waveguide (SIW) technology is used to design a filter for 30 GHz communication systems. The SIW is created in an RT/Duroid 5880 substrate having relative permittivity ε_r = 2.2 and loss tangent tan δ = 0.0009. Four vias are placed at the center of the filter. The SIW structures are modeled and optimized in the software HFSS (High Frequency Structure Simulator). A transition is designed for a Ka-band transceiver module with a 28.5 GHz center frequency, and then the results are verified using another simulator, CST Microwave Studio (Computer Simulation Technology). The return losses are less than -18 dB and -13 dB, respectively, and the insertion losses are -1.2 dB and -1.4 dB, respectively.

Keywords: transition, microstrip, substrate integrated waveguide, filter, via

Procedia PDF Downloads 640
14884 Design and Development of Constant Stress Composite Cantilever Beam

Authors: Vinod B. Suryawanshi, Ajit D. Kelkar

Abstract:

Glass fiber reinforced composite materials, due to their unique properties such as high mechanical strength-to-weight ratio, corrosion resistance, and impact resistance, have huge potential as structural materials in automotive, construction and transportation applications. However, these properties often come at higher cost owing to complex design methods, difficult manufacturing processes and raw material cost. In this paper, a cost-effective design and manufacturing approach for a composite cantilever beam structure is presented. A constant stress (variable cross section) beam concept has been used to design and optimize the shape of the composite cantilever beam and thus obtain a reduction in the material used. The variable cross section beam was fabricated from glass epoxy prepregs using a cost-effective out-of-autoclave process. The drop-ply technique has been successfully used to obtain the variation in the cross section along the span of the beam. In order to test the beam and validate the design, the beam was subjected to different end loads. Strain gauges were mounted along the length of the beam to obtain strains in the beam at different sections and loads. The strain values were used to calculate the flexural strength and bending stresses in the beam. The stresses obtained through strain measurements from the experiment were found to be uniform along the span of the beam, which validates the design. Finally, a finite element model for the constant stress beam was developed using commercial finite element simulation software. It was observed that the simulation results agreed very well with the experimental results.
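
For context, the constant-stress idea can be written out for the simplest case of a rectangular section with a tip load P, constant thickness h, and allowable stress σ_allow (this is textbook beam theory, not the paper's specific laminate analysis):

```latex
M(x) = P\,(L - x), \qquad
\sigma_{\max}(x) = \frac{6\,M(x)}{b(x)\,h^{2}} = \sigma_{\mathrm{allow}}
\quad\Longrightarrow\quad
b(x) = \frac{6\,P\,(L - x)}{\sigma_{\mathrm{allow}}\,h^{2}}
```

Holding the maximum bending stress constant along the span therefore yields a width that tapers linearly toward the free end, which is why the drop-ply layup removes material toward the tip.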

Keywords: beams, composites, constant cross-section, structures

Procedia PDF Downloads 335
14883 Deep Learning Approach to Trademark Design Code Identification

Authors: Girish J. Showkatramani, Arthi M. Krishna, Sashi Nareddi, Naresh Nula, Aaron Pepe, Glen Brown, Greg Gabel, Chris Doninger

Abstract:

Trademark examination and approval is a complex process that involves analysis and review of the design components of the marks, such as the visual representation, as well as the textual data associated with the marks, such as the marks' description. Currently, the process of identifying marks with similar visual representations is done manually at the United States Patent and Trademark Office (USPTO) and takes a considerable amount of time. Moreover, the accuracy of these searches depends heavily on the experts determining the trademark design codes used to catalog the visual design codes in the mark. In this study, we explore several methods to automate trademark design code classification. Based on recent successes of convolutional neural networks in image classification, we have used several different convolutional neural networks such as Google’s Inception v3, Inception-ResNet-v2, and Xception. The study also looks into other techniques to augment the results from CNNs, such as using the Open Source Computer Vision Library (OpenCV) to pre-process the images. This paper reports the results of the various models trained on a year of annotated trademark images.
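
A skeletal transfer-learning setup in Keras along the lines described above, with a multi-label head because a mark can carry several design codes; the number of codes, image size, and training datasets here are placeholders, and the actual USPTO pipeline is not reproduced:

```python
import tensorflow as tf

NUM_DESIGN_CODES = 1400        # placeholder for the number of trademark design codes
IMG_SIZE = (299, 299)

# Pre-trained Inception v3 backbone with a new multi-label classification head
base = tf.keras.applications.InceptionV3(weights="imagenet", include_top=False,
                                          input_shape=IMG_SIZE + (3,))
base.trainable = False         # first train only the head; the backbone can be fine-tuned later

x = tf.keras.layers.GlobalAveragePooling2D()(base.output)
x = tf.keras.layers.Dropout(0.3)(x)
out = tf.keras.layers.Dense(NUM_DESIGN_CODES, activation="sigmoid")(x)  # one sigmoid per design code

model = tf.keras.Model(base.input, out)
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC(multi_label=True)])
model.summary()
# model.fit(train_ds, validation_data=val_ds, epochs=10)   # datasets of (image, code-vector) pairs
```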

Keywords: trademark design code, convolutional neural networks, trademark image classification, trademark image search, Inception-ResNet-v2

Procedia PDF Downloads 218
14882 Expert System: Debugging Using MD5 Process Firewall

Authors: C. U. Om Kumar, S. Kishore, A. Geetha

Abstract:

An operating system (OS) is software that manages computer hardware and software resources by providing services to computer programs. One of the important user expectations of the operating system is to provide the practice of defending information from unauthorized access, disclosure, modification, inspection, recording or destruction. An operating system is always vulnerable to attacks by malware such as computer viruses, worms, Trojan horses, backdoors, ransomware, spyware, adware, scareware and more. Anti-virus software was therefore created to ensure security against the prominent computer viruses by applying a dictionary-based approach. Anti-virus programs are not always guaranteed to provide security against the new viruses proliferating every day. To clarify this issue and to secure the computer system, our proposed expert system concentrates on authorizing processes, as wanted or unwanted by the administrator, for execution. The expert system maintains a database which consists of the hash codes of the processes that are to be allowed. These hash codes are generated using the MD5 message-digest algorithm, which is a widely used cryptographic hash function. The administrator approves the wanted processes that are to be executed on the clients in a local area network by implementing a client-server architecture, and only the processes that match the processes in the database table will be executed, by which many malicious processes are restricted from infecting the operating system. The add-on advantage of this proposed expert system is that it limits CPU usage and minimizes resource utilization. Thus, data and information security are ensured by our system, along with increased performance of the operating system.
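
A minimal sketch of the hash-whitelist check at the heart of the proposed system (the process-table hook, client-server exchange, and administrator workflow are omitted; the approved hash and file path below are placeholders):

```python
import hashlib

def md5_of_file(path, chunk_size=8192):
    """Compute the MD5 digest of an executable image file."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Whitelist of administrator-approved process hashes (would live in the server-side database)
approved_hashes = {
    "5d41402abc4b2a76b9719d911017c592",   # placeholder hash of an approved executable
}

def is_execution_allowed(executable_path):
    return md5_of_file(executable_path) in approved_hashes

# Example: only launch a process if its image file matches an approved hash
# if is_execution_allowed("/usr/bin/some_tool"): subprocess.run(["/usr/bin/some_tool"])
```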

Keywords: virus, worm, Trojan horse, backdoors, ransomware, spyware, adware, scareware, sticky software, process table, MD5, CPU usage, resource utilization

Procedia PDF Downloads 410