Search results for: precision manufacturing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2797

1297 Use of Real Time Ultrasound for the Prediction of Carcass Composition in Serrana Goats

Authors: Antonio Monteiro, Jorge Azevedo, Severiano Silva, Alfredo Teixeira

Abstract:

The objective of this study was to compare carcass and in vivo real-time ultrasound (RTU) measurements and their capacity to predict the composition of Serrana goats up to 40% of maturity. Twenty-one females (11.1 ± 3.97 kg) and twenty-one males (15.6 ± 5.38 kg) were used to take in vivo measurements with a 5 MHz probe (ALOKA 500V scanner) at the 9th-10th and 10th-11th thoracic vertebrae (uT910 and uT1011, respectively), at the 1st-2nd, 3rd-4th, and 4th-5th lumbar vertebrae (uL12, uL34 and uL45, respectively), and at the 3rd-4th sternebrae (EEST). RTU images were recorded of the Longissimus thoracis et lumborum (LTL) muscle depth (EM), width (LM), perimeter (PM) and area (AM), of the subcutaneous fat thickness (SFD) above the LTL, and of the depth of tissues of the sternum (EEST) between the 3rd-4th sternebrae. All RTU images were analyzed using the ImageJ software. After slaughter, the carcasses were stored at 4 ºC for 24 h, then split, and the left half was entirely dissected into muscle, dissected fat (subcutaneous plus intermuscular fat) and bone. Prior to dissection, measurements equivalent to those obtained in vivo with RTU were recorded. Correlation and regression analyses were performed using Statistica 5. Carcass composition was predicted by a stepwise regression procedure from live weight and the RTU measurements, with and without transformation of the variables to the same dimension. The RTU and carcass measurements, except for the SFD measurements, were highly correlated (r > 0.60, P < 0.001). The RTU measurements together with live weight predicted carcass muscle (R2 = 0.99, P < 0.001), subcutaneous fat (R2 = 0.41, P < 0.001), intermuscular fat (R2 = 0.84, P < 0.001), dissected fat (R2 = 0.71, P < 0.001) and bone (R2 = 0.94, P < 0.001).
Transforming the variables slightly increased precision but also increased the number of variables in the models, with the exception of the subcutaneous fat prediction. In vivo RTU measurements, five of them combined with live weight, can therefore be applied to predict kid goat carcass composition.
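As a hedged illustration of the regression step described above, the sketch below fits a multiple linear regression of a carcass component on live weight and two RTU-style predictors using numpy least squares. All variable names and numbers are synthetic assumptions for illustration, not the study's data, and plain least squares stands in for the stepwise selection procedure.

```python
import numpy as np

# Synthetic stand-ins for the study's variables: live weight plus two
# RTU-style measurements (EM = LTL depth, SFD = subcutaneous fat depth).
rng = np.random.default_rng(0)
n = 42  # 21 females + 21 males, as in the study
live_weight = rng.uniform(8, 22, n)             # kg
EM = 0.9 * live_weight + rng.normal(0, 0.5, n)  # depth proxy, correlated with weight
SFD = rng.uniform(0.5, 3.0, n)                  # fat depth proxy
muscle = 0.35 * live_weight + 0.1 * EM + rng.normal(0, 0.2, n)

# Design matrix with an intercept column; ordinary least squares fit.
X = np.column_stack([np.ones(n), live_weight, EM, SFD])
coef, *_ = np.linalg.lstsq(X, muscle, rcond=None)

# Coefficient of determination (R2) of the fitted model.
pred = X @ coef
r2 = 1 - np.sum((muscle - pred) ** 2) / np.sum((muscle - muscle.mean()) ** 2)
print(round(r2, 3))
```

With a strong simulated signal, the R2 lands close to 1, mirroring the high R2 values the abstract reports for muscle and bone.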

Keywords: carcass, goats, real time, ultrasound

Procedia PDF Downloads 250
1296 Comparison of Tensile Strength and Folding Endurance of (FDM Process) 3D Printed ABS and PLA Materials

Authors: R. Devicharan

Abstract:

Within a short span, 3D printing is expected to play a vital role in our lives. The possibilities for creativity and speed in manufacturing through the various 3D printing processes are virtually infinite. This study examines FDM (Fused Deposition Modelling), one of the predominant 3D printing technologies, and focuses on the physical properties of the printed objects, which determine their applications. Specifically, this paper studies the tensile strength and folding endurance of objects printed by the FDM method in ABS (Acrylonitrile Butadiene Styrene) and PLA (Polylactic Acid) plastics. The experiments were performed in a controlled environment with fixed machine settings. Appropriate tables and graphs are plotted, and analysis techniques are used to verify and validate the experimental results.

Keywords: FDM process, 3D printing, ABS for 3D printing, PLA for 3D printing, rapid prototyping

Procedia PDF Downloads 592
1295 A Compact Extended Laser Diode Cavity Centered at 780 nm for Use in High-Resolution Laser Spectroscopy

Authors: J. Alvarez, J. Pimienta, R. Sarmiento

Abstract:

Diode lasers operating in free-running mode exhibit frequency shifts and line broadening caused by external factors such as temperature, current or mechanical vibrations, which makes them unsuitable for applications such as spectroscopy, metrology and atom cooling. Several configurations can reduce the spectral width of a laser; one of the most effective is to extend the optical resonator of the laser diode and apply optical feedback, either with a partially reflective mirror or with a diffraction grating. The grating configuration not only narrows the laser line but also allows the working wavelength to be coarsely tuned over a wide range, typically ~10 nm, by slightly varying the grating angle. Two arrangements are commonly used for this purpose: the Littrow configuration and the Littman-Metcalf configuration. In this paper, we present the design, construction and characterization of a compact extended laser cavity in the Littrow configuration. The cavity was machined on an aluminum block using computer numerical control (CNC) and has a mass of only 380 g. The design was tested on laser diodes at different wavelengths (650 nm, 780 nm and 795 nm) but can be equally effective at other wavelengths. This report details the results obtained with the extended cavity operating at 780 nm, with an output power of around 35 mW and a linewidth of less than 1 MHz. The cavity was used to observe the spectrum of the corresponding rubidium D2 line. By modulating the current and using phase-detection techniques, a dispersion signal with an excellent signal-to-noise ratio was generated, which allowed the laser to be stabilized to a transition of the hyperfine structure of rubidium with a proportional-integral (PI) controller built from precision operational amplifiers.

Keywords: Littrow, Littman-Metcalf, line width, laser stabilization, hyperfine structure

Procedia PDF Downloads 217
1294 Immature Palm Tree Detection Using Morphological Filter for Palm Counting with High Resolution Satellite Image

Authors: Nur Nadhirah Rusyda Rosnan, Nursuhaili Najwa Masrol, Nurul Fatiha MD Nor, Mohammad Zafrullah Mohammad Salim, Sim Choon Cheak

Abstract:

Accurate inventories of planted oil palm areas are crucial for plantation management, as they impact the overall economy and production of oil. One technological advancement in the oil palm industry is semi-automated palm counting, which is replacing conventional manual counting via digitized aerial imagery. Most semi-automated palm counting methods developed to date have been limited to mature palms, whose canopy size is well represented in satellite imagery; immature palms are often left out because their canopies are barely visible in satellite images. In this paper, an approach using a morphological filter on high-resolution satellite imagery is proposed to detect, and thereby count, immature oil palm trees. The method begins by applying an erosion filter with an appropriate window size of 3 m to the high-resolution satellite image. The eroded image is then segmented using watershed segmentation to delineate immature palm tree regions. Local minimum detection is then applied, based on the hypothesis that, in a grayscale image of an oil palm field, immature oil palm trees sit at local minima. The detection points generated from the local minima are displaced to the center of each immature palm region and thinned, leaving a single detection point per tree. The performance of the proposed method was evaluated on three subsets with slopes ranging from 0 to 20° and different planting designs, i.e., straight and terrace. The proposed method achieved more than 90% accuracy when compared with the ground truth, with an overall F-measure score of up to 0.91.
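The erosion and local-minimum steps of the pipeline can be sketched in a few lines of numpy. This is a simplified illustration on a synthetic patch (the watershed, displacement and thinning steps are omitted); the window sizes and pixel values are assumptions for the example, not the authors' parameters.

```python
import numpy as np

def grey_erosion(img, k=3):
    """Grey-scale erosion: each pixel becomes the minimum of its k x k window."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    h, w = img.shape
    out = np.empty_like(img)
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + k, j:j + k].min()
    return out

def local_minima(img):
    """Strict local minima over a 3 x 3 neighbourhood (candidate palm points)."""
    padded = np.pad(img, 1, mode="constant", constant_values=np.inf)
    h, w = img.shape
    points = []
    for i in range(h):
        for j in range(w):
            window = padded[i:i + 3, j:j + 3].copy()
            centre = window[1, 1]
            window[1, 1] = np.inf
            if centre < window.min():
                points.append((i, j))
    return points

# Toy 9 x 9 grayscale patch: two dark spots stand in for immature palm canopies.
img = np.full((9, 9), 200.0)
img[2, 2] = 50.0   # hypothetical immature palm 1
img[6, 6] = 60.0   # hypothetical immature palm 2

eroded = grey_erosion(img)   # dark canopy regions grow, bright noise is suppressed
points = local_minima(img)   # one candidate detection point per dark spot
print(points)                # → [(2, 2), (6, 6)]
```

In the real pipeline a 3 m window corresponds to some number of pixels at the satellite image's ground resolution, and the minima would be detected within the watershed regions rather than on the raw patch.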

Keywords: immature palm count, oil palm, precision agriculture, remote sensing

Procedia PDF Downloads 65
1293 Machine Learning Techniques for COVID-19 Detection: A Comparative Analysis

Authors: Abeer A. Aljohani

Abstract:

The spread of the COVID-19 virus has been one of the most extreme pandemics across the globe. Also referred to as coronavirus, it is a contagious disease that continuously mutates into numerous variants; at the time of writing, the B.1.1.529 variant, labeled omicron, had been detected in South Africa. The huge spread of COVID-19 has affected countless lives and placed exceptional pressure on healthcare systems worldwide, while everyday life and the global economy have been at stake. This research aims to predict COVID-19 disease at its initial stage in order to reduce the death count. Machine learning (ML) is nowadays used in almost every area, and the large number of COVID-19 cases has placed a huge burden on hospitals and health workers. To reduce this burden, this paper predicts COVID-19 disease based on the symptoms and medical history of the patient. The research presents a unique architecture for COVID-19 detection using ML techniques integrated with feature dimensionality reduction, evaluated on a standard UCI dataset comprising the symptoms of 5434 patients. Several supervised ML techniques are compared against the presented architecture, which uses a 10-fold cross-validation process for generalization and principal component analysis (PCA) for feature reduction. Standard parameters are used to evaluate the proposed architecture, including F1-score, precision, accuracy, recall, receiver operating characteristic (ROC) and area under the curve (AUC). The results show that decision trees, random forests and neural networks outperform all other state-of-the-art ML techniques, which can help effectively in identifying COVID-19 infection cases.
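The PCA feature-reduction step named in the abstract can be sketched with numpy alone. The dataset below is random binary "symptom" data shaped like the 5434-patient UCI set; the choice of 20 features and 10 retained components is an assumption for illustration, not the paper's configuration.

```python
import numpy as np

# Random binary stand-in for a patients x symptoms matrix.
rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(5434, 20)).astype(float)

# PCA by eigendecomposition of the covariance matrix: centre the data,
# take the top-k eigenvectors, and project onto them.
Xc = X - X.mean(axis=0)
cov = (Xc.T @ Xc) / (len(Xc) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]        # re-sort descending by variance
k = 10                                   # reduced dimensionality (assumed)
components = eigvecs[:, order[:k]]
X_reduced = Xc @ components              # each row is now a 10-dim feature vector
print(X_reduced.shape)                   # → (5434, 10)
```

The reduced matrix would then feed the supervised classifiers inside each fold of the 10-fold cross-validation (fitting PCA on the training fold only, to avoid leakage).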

Keywords: supervised machine learning, COVID-19 prediction, healthcare analytics, random forest, neural network

Procedia PDF Downloads 85
1292 A Sustainability Benchmarking Framework Based on the Life Cycle Sustainability Assessment: The Case of the Italian Ceramic District

Authors: A. M. Ferrari, L. Volpi, M. Pini, C. Siligardi, F. E. Garcia Muina, D. Settembre Blundo

Abstract:

A long tradition of ceramic manufacturing since the 18th century, primarily due to the availability of raw materials and an efficient transport system, led to the birth and development of the Italian ceramic tile district, which nowadays represents a reference point for the sector even at the global level. This economic growth has been coupled with attention to environmental sustainability issues through various initiatives undertaken over the years at the level of the production sector, such as certification activities and sustainability policies. Starting from an evaluation of sustainability in all its aspects, the present work aims to develop a benchmarking framework that helps both producers and consumers. Using the Life Cycle Sustainability Assessment (LCSA) framework, sustainability has been assessed in all its dimensions: environmental with Life Cycle Assessment (LCA), economic with Life Cycle Costing (LCC) and social with Social Life Cycle Assessment (S-LCA). The annual district production of stoneware tiles in the 2016 reference year was taken as the reference flow for all three assessments, and the system boundaries cover the entire life cycle of the tiles, except for the LCC, for which only the production costs have so far been considered. In addition, a preliminary method for the evaluation of local and indoor emissions has been introduced in order to assess the impact of atmospheric emissions on both workers and people living in the area surrounding the factories. The Life Cycle Assessment results, obtained with a modified IMPACT 2002+ assessment method, highlight that the manufacturing process is responsible for the main impact, especially because of atmospheric emissions at a local scale, followed by distribution to end users, installation and ordinary maintenance of the tiles.
With regard to the economic evaluation, both internal and external costs have been considered. For the LCC, primary data from the financial statements of Italian ceramic companies show that the largest cost items are expenses for goods and services and the cost of human resources. The analysis of externalities with the EPS 2015dx method attributes the main damages to the distribution and installation of the tiles. The social dimension has been investigated with a preliminary approach using the Social Hotspots Database; the results indicate that the most affected damage categories are health and safety, and labor rights and decent work. This study shows the potential of the LCSA framework applied to an industrial sector: it can be a useful tool for building a comprehensive sustainability benchmark for the ceramic industry, and it can help companies actively integrate sustainability principles into their business models.

Keywords: benchmarking, Italian ceramic industry, life cycle sustainability assessment, porcelain stoneware tiles

Procedia PDF Downloads 116
1291 Automatic Measurement of Garment Sizes Using Deep Learning

Authors: Maulik Parmar, Sumeet Sandhu

Abstract:

The online fashion industry experiences high product return rates. Many returns are due to size/fit mismatches: the size scale on labels can vary across brands, the size parameters may not capture all fit measurements, or the product may have manufacturing defects. Warehouse quality checks of garment sizes can be semi-automated to improve speed and accuracy. This paper presents an approach for automatically measuring garment sizes from a single image of the garment, using deep learning to locate garment keypoints. The paper focuses on the waist measurement of jeans but can easily be extended to other garment types and measurements. Experimental results show that this approach can greatly improve the speed and accuracy of today's manual measurement process.
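A minimal sketch of the final measurement step, assuming the network has already predicted the two waist keypoints: the waist width follows from the pixel distance between them and a known pixels-per-centimetre scale. The coordinates, scale and function name below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def waist_width_cm(left_kp, right_kp, pixels_per_cm):
    """Convert two predicted keypoints (pixel coords) into a physical width."""
    left = np.asarray(left_kp, dtype=float)
    right = np.asarray(right_kp, dtype=float)
    return float(np.linalg.norm(right - left) / pixels_per_cm)

# Hypothetical model output for a flat-laid pair of jeans.
left_waist = (120.0, 340.0)    # (x, y) pixel coordinates
right_waist = (520.0, 340.0)
print(waist_width_cm(left_waist, right_waist, pixels_per_cm=10.0))  # → 40.0
```

In practice the scale would come from a calibration object or a known camera setup, and image warping (as the keywords suggest) would first correct perspective distortion.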

Keywords: convolutional neural networks, deep learning, distortion, garment measurements, image warping, keypoints

Procedia PDF Downloads 289
1290 A Numerical Hybrid Finite Element Model for Lattice Structures Using 3D/Beam Elements

Authors: Ahmadali Tahmasebimoradi, Chetra Mang, Xavier Lorang

Abstract:

Thanks to the additive manufacturing process, lattice structures are replacing traditional structures in the aeronautical and automobile industries. To evaluate the mechanical response of lattice structures, one has to resort to numerical techniques. Ansys, a globally known and trusted commercial software package, allows us to model lattice structures and analyze their mechanical responses using either solid or beam elements, and a script may be used to systematically generate lattice structures of any size. On the one hand, solid elements correctly model the contact between the substrates (the supports of the lattice structure) and the lattice structure, the local plasticity, and the junctions of the microbeams; however, their computational cost increases rapidly with the size of the lattice structure. On the other hand, although beam elements reduce the computational cost drastically, they do not correctly model the contact between the lattice structure and the substrates, nor the junctions of the microbeams; the notion of local plasticity is no longer valid; and the deformed shape does not correspond to that obtained with 3D solid elements. In this work, motivated by the pros and cons of the 3D and beam models, a numerically hybrid model is presented for lattice structures, reducing the computational cost of the simulations while avoiding the aforementioned drawbacks of the beam elements. The approach consists of using solid elements for the junctions and beam elements for the microbeams connecting those junctions. When the global response of the structure is linear, the results from the hybrid models are in good agreement with those from the 3D models for body-centered cubic lattice structures with z-struts (BCCZ) and without z-struts (BCC).
However, the hybrid models have difficulty converging when the effects of large deformation and local plasticity are considerable in the BCCZ structures. Furthermore, the effect of the junction size on the hybrid model's results is investigated. For BCCZ lattice structures, the results are not affected by the junction size; this also holds for BCC lattice structures as long as the ratio of the junction size to the diameter of the microbeams is greater than 2. The hybrid model can also take geometric defects into account. As a demonstration, the point clouds of two lattice structures are parametrized in LATANA (LATtice ANAlysis), a platform developed by IRT-SystemX. In this process, an ellipse is fitted to each microbeam of the lattice structures to capture the effect of shape variation and roughness; each ellipse is represented by three parameters: semi-major axis, semi-minor axis, and angle of rotation. From the parameters of the ellipses, the lattice structures are constructed in SpaceClaim (Ansys) using the geometrical hybrid approach. The results show a negligible discrepancy between the hybrid and 3D models, while the computational cost of the hybrid model is lower than that of the 3D model.

Keywords: additive manufacturing, Ansys, geometric defects, hybrid finite element model, lattice structure

Procedia PDF Downloads 107
1289 Improvement of the Traditional Techniques of Artistic Casting through the Development of Open Source 3D Printing Technologies Based on Digital Ultraviolet Light Processing

Authors: Drago Diaz Aleman, Jose Luis Saorin Perez, Cecile Meier, Itahisa Perez Conesa, Jorge De La Torre Cantero

Abstract:

Traditional manufacturing techniques used in artistic contexts compete with highly productive and efficient industrial procedures. Craft techniques and their associated business models tend to disappear under the pressure of mass-produced products that compete in all niche markets, including those traditionally reserved for works of art. The surplus value derived from the prestige of the author, the exclusivity of the product or the mastery of the artist does not seem to be a sufficient reason to preserve this productive model. In recent years, the adoption of open-source digital manufacturing technologies in small art workshops has favored their permanence by offering great advantages such as easy accessibility, low cost, and free modification, adapting to the specific needs of each workshop. Pieces modeled by computer and made with FDM (Fused Deposition Modeling) 3D printers using PLA (polylactic acid) can be used in artistic casting procedures. However, PLA-printed models are limited to an approximate minimum size of 3 cm, with an optimal layer-height resolution of 0.1 mm, so FDM is not the most suitable technology for artistic casting of smaller pieces. One alternative that overcomes the size limitation is selective laser sintering (SLS) printers; another, called direct metal laser sintering (DMLS), uses a laser to harden metal powder layer by layer. Due to their high cost, however, these are technologies that are difficult to introduce in small artistic foundries. Low-cost DLP (Digital Light Processing) printers can offer high resolution for a reasonable cost (around 0.02 mm on the Z axis and 0.04 mm on the X and Y axes) and can print models with castable resins that allow subsequent direct artistic casting in precious metals or adaptation to processes such as electroforming.
In this work, the design of a DLP 3D printer using an LCD screen backlit with ultraviolet light is detailed. Its development is entirely open source, and it is proposed as a kit made up of Arduino-based electronic components and mechanical components that are easy to find on the market. The CAD files of its components can be manufactured on low-cost FDM 3D printers. The result costs less than 500 euros, offers high resolution, and has an open design with free access that allows not only its manufacture but also its improvement. In future work, we intend to carry out comparative analyses that will allow us to accurately estimate the print quality, as well as the real cost of the artistic works made with the printer.

Keywords: traditional artistic techniques, DLP 3D printer, artistic casting, electroforming

Procedia PDF Downloads 136
1288 Importance of Knowledge in the Interdisciplinary Production Processes of Innovative Medical Tools

Authors: Katarzyna Mleczko

Abstract:

The production processes of innovative medical tools are interdisciplinary in character: they consist of direct and indirect close cooperation among specialists from different scientific branches. The knowledge these specialists possess is important for the design, construction and manufacturing processes they undertake, so the exchange of knowledge between participants in these processes is crucial for the final result, namely innovative medical products. The paper draws attention to the necessity of feedback from the end user to the designer/manufacturer of medical tools, which allows for a more accurate understanding of user needs. The study describes the prerequisites of the production processes of innovative medical (surgical) tools, including the participants and the categories of knowledge resources occurring in these processes. These findings result from research in selected Polish organizations involved in the production of medical instruments and are the basis for further work on the development of a knowledge-sharing model for geographically dispersed interdisciplinary teams.

Keywords: interdisciplinary production processes, knowledge exchange, knowledge sharing, medical tools

Procedia PDF Downloads 435
1287 Food Insecurity Assessment, Consumption Pattern and Implications of Integrated Food Security Phase Classification: Evidence from Sudan

Authors: Ahmed A. A. Fadol, Guangji Tong, Wlaa Mohamed

Abstract:

This paper provides a comprehensive analysis of food insecurity in Sudan, focusing on consumption patterns and their implications, employing the Integrated Food Security Phase Classification (IPC) assessment framework. Years of conflict and economic instability have driven large segments of the Sudanese population into crisis levels of acute food insecurity according to the IPC. A substantial number of people are estimated to currently face emergency conditions, with an additional sizeable portion categorized under less severe but still extreme hunger levels. In this study, we explore the multifaceted nature of food insecurity in Sudan, considering its historical, political, economic, and social dimensions. An analysis of consumption patterns and trends was conducted, taking into account cultural influences, dietary shifts, and demographic changes. Furthermore, we employ logistic regression and random forest analysis to identify significant independent variables influencing food security status in Sudan; random forest clearly outperforms logistic regression in terms of area under the curve (AUC), accuracy, precision, and recall. Forward projections of the IPC for Sudan estimate that 15 million individuals will face Crisis-level (IPC Phase 3) or worse acute food insecurity conditions between October 2023 and February 2024. Of this, 60% are concentrated in Greater Darfur, Greater Kordofan, and Khartoum State, with Greater Darfur alone representing 29% of the total. These findings emphasize the urgent need for both short-term humanitarian aid and long-term strategies to address Sudan's deepening food insecurity crisis.
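As a small worked illustration of the AUC metric on which the model comparison rests, the sketch below computes ROC AUC with the Mann-Whitney rank formula (no tie handling). The labels and scores are invented for the example, not the study's data.

```python
import numpy as np

def roc_auc(y_true, scores):
    """ROC AUC via ranks: P(score of a random positive > score of a random negative)."""
    y_true = np.asarray(y_true)
    scores = np.asarray(scores, dtype=float)
    order = scores.argsort()
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)  # 1-based ranks (ties not handled)
    n_pos = y_true.sum()
    n_neg = len(y_true) - n_pos
    return (ranks[y_true == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

labels = [0, 0, 1, 1, 0, 1]                    # 1 = food-insecure household (made up)
scores = [0.1, 0.4, 0.35, 0.8, 0.2, 0.7]       # classifier probabilities (made up)
print(round(roc_auc(labels, scores), 3))       # → 0.889
```

Here 8 of the 9 positive-negative pairs are correctly ordered, giving AUC = 8/9; a model that ranks every insecure household above every secure one would score 1.0.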

Keywords: food insecurity, consumption patterns, logistic regression, random forest analysis

Procedia PDF Downloads 54
1286 Digital Twins in the Built Environment: A Systematic Literature Review

Authors: Bagireanu Astrid, Bros-Williamson Julio, Duncheva Mila, Currie John

Abstract:

Digital Twins (DT) are an innovative concept of cyber-physical integration of data between an asset and its virtual replica. They originated in established industries such as manufacturing and aviation and have garnered increasing attention as a potentially transformative technology within the built environment. With the potential to support decision-making, real-time simulation, forecasting, and operations management, DT do not fall under a singular scope, and the absence of a clear scope risks leaving their potential uses undefined and unexploited. Despite their recognised potential in established industries, literature on DT in the built environment remains limited. Inadequate attention has been given to the implementation of DT in construction projects, as opposed to applications in the operational stage. Additionally, the absence of a standardised definition has resulted in inconsistent interpretations of DT in both industry and academia. There is a need to consolidate research to foster a unified understanding of DT; such consolidation is indispensable to ensure that future research is undertaken on a solid foundation. This paper aims to present a comprehensive systematic literature review on the role of DT in the built environment. To accomplish this objective, a review and thematic analysis were conducted, encompassing relevant papers from the last five years. The identified papers are categorised based on their specific areas of focus, and their content is translated into a thorough classification of DT.
In characterising DT and the associated data processes, this systematic literature review identifies six DT opportunities specifically relevant to the built environment: facilitating collaborative procurement methods; supporting net-zero and decarbonisation goals; supporting Modern Methods of Construction (MMC) and off-site manufacturing (OSM); providing increased transparency and stakeholder collaboration; supporting complex decision-making (real-time simulations and forecasting); and seamless integration with the Internet of Things (IoT), data analytics and other DT. A discussion of each area of research is provided, and a table of definitions of DT across the reviewed literature is given, seeking to delineate the current state of DT implementation in the built environment context. Gaps in knowledge are identified, as well as research challenges and opportunities for further advancement in the implementation of DT within the built environment. This paper critically assesses the existing literature to identify the potential of DT applications, aiming to harness the transformative capabilities of data in the built environment. By fostering a unified comprehension of DT, it contributes to advancing the effective adoption and utilisation of this technology, accelerating progress towards the realisation of smart cities, decarbonisation, and other envisioned roles for DT in the construction domain.

Keywords: built environment, design, digital twins, literature review

Procedia PDF Downloads 67
1285 Emerging Therapeutic Approach with Dandelion Phytochemicals in Breast Cancer Treatment

Authors: Angel Champion, Sadia Kanwal, Rafat Siddiqui

Abstract:

Harnessing phytochemicals from plant sources presents a novel opportunity to prevent or treat malignant diseases, including breast cancer. Chemotherapy lacks precision in targeting cancerous cells while sparing normal cells, but a phytopharmaceutical approach may offer a solution. Dandelion, a common weed, is rich in phytochemicals and provides a safer, more cost-effective alternative with lower toxicity than traditional pharmaceuticals for conditions such as breast cancer. In this study, an in-vitro experiment will be conducted using ethanol extract of dandelion on triple-negative MDA-231 breast cancer cell lines. Polyphenolic analysis revealed that the dandelion root and leaf extracts (both cut and sifted) had strong antioxidant properties, with the powdered leaf extract exhibiting the most potent antioxidant activity. The extract shows promising effects in inhibiting cell proliferation and inducing apoptosis in breast cancer cells, highlighting its potential for targeted therapeutic interventions. Standardizing methods for dandelion use is crucial for future clinical applications in cancer treatment. Combining plant-derived compounds with cancer nanotechnology holds the potential for effective strategies against malignant diseases: using liposomes as carriers for phytoconstituent anti-cancer agents offers improved solubility and bioavailability, immunoregulatory effects that advance anticancer immune function, and reduced toxicity. This integrated approach of natural products and nanotechnology has significant potential to transform healthcare globally, especially in underserved communities where herbal medicine is prevalent.

Keywords: apoptosis, antioxidant activity, cancer nanotechnology, phytopharmaceutical

Procedia PDF Downloads 46
1284 The Impact of BIM Technology on the Whole-Process Cost Management of Civil Engineering Projects in Kenya

Authors: Nsimbe Allan

Abstract:

The study examines the impact of Building Information Modeling (BIM) on the cost management of engineering projects, focusing specifically on the Mombasa Port Area Development Project. The objective of this research is to determine the mechanisms through which BIM facilitates stakeholder collaboration, reduces construction-related expenses, and enhances the precision of cost estimation. Furthermore, the study investigates barriers to execution, assesses the impact on project transparency, and suggests approaches to maximize resource utilization. The study, selected for its practical significance and intricate nature, conducted a Systematic Literature Review (SLR) using credible databases, including ScienceDirect and IEEE Xplore. To form a diverse sample, 69 individuals, including project managers, cost estimators, and BIM administrators, were selected via stratified random sampling. The data were obtained using a mixed-methods approach that prioritized ethical considerations, and SPSS and Microsoft Excel were used for the analysis. The research emphasizes the crucial role that project managers, architects, and engineers play in the decision-making process (47% of respondents), and a significant improvement in cost estimation accuracy was reported by 70% of the participants. The implementation of BIM was found to enhance project visibility, which in turn optimized resource allocation and facilitated budgeting. In brief, the study highlights the positive impacts of BIM on collaborative decision-making and cost estimation, addresses challenges related to implementation, and provides solutions for the efficient assimilation and understanding of BIM principles.

Keywords: cost management, resource utilization, stakeholder collaboration, project transparency

Procedia PDF Downloads 53
1283 Quality versus Excellence: The Importance of Employees Knowing the Difference

Authors: Chris Nelson

Abstract:

Quality and excellence are qualitative topics that are usually addressed based on the knowledge and past experience of leadership and those in charge of the organization. The significance of this study is to highlight the differences and similarities between these two mindsets and how operational staff can most appropriately use them in the workplace. Quality and excellence are two words talked about a lot in the manufacturing world: buzzwords such as operational excellence, quality controls, and efficiencies are discussed in the boardroom as well as on the shop floor, and with good reason. Consider a person visiting their favorite local restaurant. They go because (1) they like the food and (2) the people are some of the greatest individuals to be around. They know the restaurant always puts out quality food; it is not that the quality of the food is far superior to that of other restaurants, but the quality of the ingredients always meets their expectations. Measured against the term excellence, however, the restaurant disappoints: the food never looks like the pictures on the menu. But when has the food at any restaurant ever looked the same as on the menu? When evaluating which buzzword to use as a guiding star, the choice is simple: excellence. A corporation accomplishes this by operating at a standard that far exceeds customers' wants and needs.

Keywords: industrial engineering, innovation, management and technology, logistics and scheduling, six sigma

Procedia PDF Downloads 194
1282 Assistive Kitchenware Design for Hemiparetics

Authors: Daniel F. Madrinan-Chiquito

Abstract:

Hemiparesis affects about eight out of ten stroke survivors, causing weakness or the inability to move one side of the body. One-sided weakness can affect arms, hands, legs, or facial muscles. People with one-sided weakness may have trouble performing everyday activities such as eating, cooking, dressing, and using the bathroom. Rehabilitation treatments, exercises at home, and assistive devices can help with mobility and recovery. Historically, such treatments and devices were developed within the fields of medicine and biomedical engineering. However, innovators outside of the traditional medical device community, such as industrial designers, have recently brought their knowledge and expertise to assistive technologies. Primary and secondary research was done in three parts. The primary research collected data through conversations with several occupational therapists currently attending to stroke patients and through surveys given to patients with hemiparesis and hemiplegia. The secondary research collected data through observation and testing of products currently marketed for single-handed people. Modern kitchenware available in the market for people with an acquired brain injury has deficiencies in both aesthetic and functional values. Object design for people with hemiparesis or hemiplegia has not been meaningfully explored. Most cookware is designed for use with two hands and leaves little room for adaptation to the needs of one-handed individuals. This project focuses on the design and development of two kitchenware devices. These devices assist hemiparetics with different cooking-related tasks such as holding, grasping, cutting, slicing, chopping, grating, and other essential activities. These intentionally designed objects will improve the quality of life of hemiparetics by enabling greater independence and providing an enhanced ability to perform precision tasks in a cooking environment.

Keywords: assistive technologies, hemiparetics, industrial design, kitchenware

Procedia PDF Downloads 98
1281 Processes for Valorization of Valuable Products from Kerf Slurry Waste

Authors: Nadjib Drouiche, Abdenour Lami, Salaheddine Aoudj, Tarik Ouslimane

Abstract:

Although solar cell manufacturing is a conservative industry, economic drivers continue to encourage innovation, feedstock savings, and cost reduction. Kerf slurry waste is a complex product containing both valuable substances and contaminants. The valuable substances are: i) high-purity silicon, ii) polyethylene glycol, and iii) silicon carbide. The contaminants mainly include metal fragments and organics. Recycling of kerf slurry waste is therefore an important subject, not only for waste treatment but also for the recovery of valuable products. The present paper describes processes for the recovery of the valuable products contained in kerf slurry waste, such products comprising nanoparticles, polyethylene glycol, high-purity silicon, and silicon carbide.

Keywords: photovoltaic cell, kerf slurry waste, recycling, silicon carbide

Procedia PDF Downloads 321
1280 The Impact of Cognitive Load on Deceit Detection and Memory Recall in Children’s Interviews: A Meta-Analysis

Authors: Sevilay Çankaya

Abstract:

The detection of deception in children's interviews is essential for assessing statement veracity. A widely used method for deception detection is inducing cognitive load, which is the logic of the cognitive interview (CI), and its effectiveness with adults is well established. This meta-analysis examines the effectiveness of inducing cognitive load as a means of enhancing veracity detection during interviews with children. As a second part of the analysis, the effect of cognitive load on the total number of events children recall is assessed. The meta-analysis includes ten effect sizes identified through database searches. Effect sizes were calculated as Hedges' g under a random-effects model using CMA version 2, and a heterogeneity analysis was conducted to detect potential moderators. The overall result indicated that cognitive load had no significant effect on veracity outcomes (g = 0.052, 95% CI [-0.006, 1.25]). However, a high level of heterogeneity was found (I² = 92%). Age, participant characteristics, interview setting, and interviewer characteristics were coded as possible moderators to explain this variance. Age was a significant moderator (β = .021, p = .03, R² = 75%), but the analysis did not reveal statistically significant effects for the other potential moderators: participant characteristics (Q = 0.106, df = 1, p = .744), interview setting (Q = 2.04, df = 1, p = .154), and interviewer characteristics (Q = 2.96, df = 1, p = .086). For the second outcome, the total number of events recalled, the overall effect was significant (g = 4.121, 95% CI [2.256, 5.985]): cognitive load was effective for total recalled events when interviewing children. All in all, while age plays a crucial role in determining the impact of cognitive load on veracity, the interview setting, interviewer attributes, and inherent participant traits may not significantly alter the relationship. These findings highlight the need for more focused, age-specific methods when using cognitive load measures, and further studies in this field may improve the precision and dependability of deceit detection in children's interviews.
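
The random-effects pooling used in a meta-analysis of this kind can be sketched in a few lines. This is a generic DerSimonian-Laird illustration of Hedges' g pooling and I² heterogeneity, not the CMA software used in the study, and all input numbers below are hypothetical.

```python
import math

def hedges_g(m1, m2, sd1, sd2, n1, n2):
    """Standardized mean difference with the small-sample (Hedges) correction."""
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    j = 1 - 3 / (4 * (n1 + n2) - 9)  # correction factor
    return j * d

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooled estimate and I^2 heterogeneity (%)."""
    w = [1 / v for v in variances]
    fixed = sum(wi * g for wi, g in zip(w, effects)) / sum(w)
    q = sum(wi * (g - fixed) ** 2 for wi, g in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)            # between-study variance
    w_re = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * g for wi, g in zip(w_re, effects)) / sum(w_re)
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, i2
```

A high I² value, as reported above, signals that moderator analysis is warranted.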

Keywords: deceit detection, cognitive load, memory recall, children interviews, meta-analysis

Procedia PDF Downloads 48
1279 An Automatic Feature Extraction Technique for 2D Punch Shapes

Authors: Awais Ahmad Khan, Emad Abouel Nasr, H. M. A. Hussein, Abdulrahman Al-Ahmari

Abstract:

Sheet-metal parts have been widely applied in the electronics, communication, and mechanical industries in recent decades, but advances in sheet-metal part design and manufacturing still lag behind the increasing importance of sheet-metal parts in modern industry. This paper presents a methodology for the automatic extraction of some common 2D internal sheet-metal features. The features used in this study are taken from the Unipunch™ catalogue. The extraction process starts with data extraction from a STEP file using an object-oriented approach; with the application of suitable algorithms and rules, all features contained in the catalogue are then automatically extracted. Since the extracted features include geometry and engineering information, they are useful for downstream applications such as feature rebuilding and process planning.
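
The first step, reading entities out of a STEP file's data section, can be illustrated minimally. STEP files (ISO 10303-21) store entities as numbered records like `#11=CIRCLE(...)`; the sample text and the regex-based reader below are a simplified sketch, not the paper's object-oriented extractor, which handles full entity graphs and feature rules.

```python
import re

# Hypothetical fragment of a STEP (ISO 10303-21) data section.
STEP_BODY = """\
#10=CARTESIAN_POINT('',(0.,0.,0.));
#11=CIRCLE('',#20,5.0);
#12=LINE('',#10,#21);
#13=CIRCLE('',#22,2.5);
"""

def extract_entities(step_text):
    """Map each STEP entity id to its type name from data-section records."""
    pattern = re.compile(r"#(\d+)\s*=\s*([A-Z_0-9]+)\s*\(")
    return {int(m.group(1)): m.group(2) for m in pattern.finditer(step_text)}
```

From such an id-to-type map, rule-based recognition (e.g., a CIRCLE bounding an internal contour indicating a round punch hole) can then be layered on top.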

Keywords: feature extraction, internal features, punch shapes, sheet metal

Procedia PDF Downloads 605
1278 Low-Cost Space-Based Geoengineering: An Assessment Based on Self-Replicating Manufacturing of in-Situ Resources on the Moon

Authors: Alex Ellery

Abstract:

Geoengineering approaches to climate change mitigation are unpopular and regarded with suspicion, and space-based approaches in particular are regarded as unworkable and enormously costly. Here, a space-based approach is presented that is modest in cost, fully controllable and reversible, and acts as a natural spur to the longer-term development of solar power satellites as a clean source of energy. The low-cost approach exploits self-replication technology, which, it is proposed, may be enabled by 3D printing. Self-replication of 3D printing platforms will enable the mass production of simple spacecraft units. Key elements being developed are 3D-printable electric motors and 3D-printable vacuum-tube-based electronics. The power of such technologies will open up enormous possibilities at low cost, including space-based geoengineering.

Keywords: 3D printing, in-situ resource utilization, self-replication technology, space-based geoengineering

Procedia PDF Downloads 413
1277 A Petri Net Model to Obtain the Throughput of Unreliable Production Lines in the Buffer Allocation Problem

Authors: Joselito Medina-Marin, Alexandr Karelin, Ana Tarasenko, Juan Carlos Seck-Tuoh-Mora, Norberto Hernandez-Romero, Eva Selene Hernandez-Gress

Abstract:

A production line designer faces several challenges in manufacturing system design. One of them is the assignment of buffer slots between the machines of a production line so as to maximize the throughput of the whole line, which is known as the Buffer Allocation Problem (BAP). The BAP is a combinatorial problem whose size depends on the number of machines and the total number of slots to be distributed along the production line. In this paper, we propose a Petri Net (PN) model to obtain the throughput of unreliable production lines, based on PN mathematical tools and the decomposition method. The results obtained with this methodology are similar to those presented in previous works, and the number of machines is not a hard restriction.
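
The combinatorial side of the BAP can be sketched with an exhaustive "stars and bars" enumeration of slot distributions. The `throughput` argument below is a stand-in for the Petri-net/decomposition evaluation described in the abstract; in practice that evaluation, not the enumeration, is the hard part, and for large lines metaheuristics replace exhaustive search.

```python
from itertools import combinations

def buffer_allocations(total_slots, num_buffers):
    """Enumerate every way to distribute total_slots among num_buffers buffers."""
    for dividers in combinations(range(total_slots + num_buffers - 1),
                                 num_buffers - 1):
        alloc, prev = [], -1
        for d in dividers:
            alloc.append(d - prev - 1)
            prev = d
        alloc.append(total_slots + num_buffers - 2 - prev)
        yield tuple(alloc)

def best_allocation(total_slots, num_buffers, throughput):
    """Exhaustive search for the allocation maximizing a throughput function."""
    return max(buffer_allocations(total_slots, num_buffers), key=throughput)
```

For example, with a (toy) throughput function that rewards balanced buffers, 4 slots over 2 buffers yields the allocation (2, 2).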

Keywords: buffer allocation problem, Petri Nets, throughput, production lines

Procedia PDF Downloads 296
1276 Generation of Quasi-Measurement Data for On-Line Process Data Analysis

Authors: Hyun-Woo Cho

Abstract:

To ensure the safety of a manufacturing process, one should quickly identify an assignable cause of a fault on an on-line basis. To this end, many statistical techniques, including linear and nonlinear methods, have been frequently utilized. However, such methods suffer from a major problem of small sample size, which is mostly attributable to the characteristics of the empirical models used as reference models. This work presents a new method to overcome the insufficiency of measurement data in monitoring and diagnosis tasks. Quasi-measurement data are generated from existing data based on two indices, similarity and importance. The performance of the method is demonstrated using a real data set. The results show that the presented method handles the insufficiency problem successfully. In addition, it is quite efficient in terms of computational speed and memory usage, so on-line implementation of the method for monitoring and diagnosis purposes is straightforward.
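
The abstract does not define its similarity and importance indices, so as a loose illustration only, the sketch below generates quasi-measurements as similarity-weighted blends of existing samples with a small perturbation. The kernel, weighting scheme, and noise level are all hypothetical choices, not the paper's method.

```python
import math
import random

def similarity(x, y):
    """Gaussian-kernel similarity between two measurement vectors."""
    d2 = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-d2)

def quasi_measurements(data, n_new, seed=0):
    """Generate quasi-measurement vectors as similarity-weighted blends of data."""
    rng = random.Random(seed)
    dim = len(data[0])
    out = []
    for _ in range(n_new):
        base = rng.choice(data)
        weights = [similarity(base, x) for x in data]
        total = sum(weights)
        blended = [sum(w * x[i] for w, x in zip(weights, data)) / total
                   for i in range(dim)]
        # small perturbation so quasi-samples are not exact averages
        out.append([b + rng.gauss(0, 0.01) for b in blended])
    return out
```

The point of the sketch is the shape of the idea: new reference samples stay close to the existing data cloud, enlarging the training set for an empirical monitoring model.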

Keywords: data analysis, diagnosis, monitoring, process data, quality control

Procedia PDF Downloads 471
1275 Probability Sampling in Matched Case-Control Study in Drug Abuse

Authors: Surya R. Niraula, Devendra B Chhetry, Girish K. Singh, S. Nagesh, Frederick A. Connell

Abstract:

Background: Although random sampling is generally considered the gold standard for population-based research, the majority of drug abuse research is based on non-random sampling, despite the well-known limitations of this kind of sampling. Method: We compared the statistical properties of two surveys of drug abuse in the same community: one using snowball sampling of drug users who then identified "friend controls," and the other using a random sample of non-drug users (controls) who then identified "friend cases." Models to predict drug abuse from risk factors were developed for each data set using conditional logistic regression. We compared the precision of each model using a bootstrapping method and the predictive properties of each model using receiver operating characteristic (ROC) curves. Results: Analysis of 100 random bootstrap samples drawn from the snowball-sample data set showed wide variation in the standard errors of the beta coefficients of the predictive model, none of which achieved statistical significance. On the other hand, bootstrap analysis of the random-sample data set showed less variation and did not change the significance of the predictors at the 5% level when compared to the non-bootstrap analysis. The area under the ROC curve for the model derived from the random-sample data set was similar when the model was fitted to either data set (0.93 for the random-sample data vs. 0.91 for the snowball-sample data, p = 0.35); however, when the model derived from the snowball-sample data set was fitted to each of the data sets, the areas under the curve were significantly different (0.98 vs. 0.83, p < .001). Conclusion: The proposed method of random sampling of controls appears to be statistically superior to snowball sampling and may represent a viable alternative to it.
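
The bootstrap precision check described above has a simple core: resample the data with replacement many times, refit, and look at the spread of the refitted estimates. The sketch below shows that core with a generic statistic rather than the study's conditional logistic regression coefficients, which would require a fitted model.

```python
import random
import statistics

def bootstrap_se(sample, statistic, n_boot=100, seed=42):
    """Bootstrap standard error: resample with replacement, recompute the statistic,
    and take the standard deviation of the replicates."""
    rng = random.Random(seed)
    replicates = []
    for _ in range(n_boot):
        resample = [rng.choice(sample) for _ in sample]
        replicates.append(statistic(resample))
    return statistics.stdev(replicates)
```

A wide spread of replicate estimates, as seen for the snowball-sample model, indicates unstable coefficients; a narrow spread supports the stability of the random-sample model.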

Keywords: drug abuse, matched case-control study, non-probability sampling, probability sampling

Procedia PDF Downloads 485
1274 The Application of Simulation Techniques to Enhance Nitroglycerin Production Efficiency: A Case Study of the Military Explosive Factory in Nakhon Sawan Province

Authors: Jeerasak Wisatphan, Nara Samattapapong

Abstract:

This study's goals were to enhance nitroglycerin manufacturing efficiency through simulation, recover nitroglycerin from the storage facility, and improve the nitroglycerin recovery and purge systems. The underlying problem was found to be nitroglycerin reflux, so the researchers created three alternatives to solve it. The nitroglycerin recovery and purge system was then simulated using the FlexSim program, and each alternative was tested. The results show that the alternative in which the recovery system and the purge system operate together is more efficient than the others, reduces production time, and improves the recovery of nitroglycerin. The model also serves as a guideline for developing the real-world system and for training staff without wasting raw chemical materials or fuel energy.

Keywords: efficiency increase, nitroglycerine recovery and purge system, production improvement, simulation

Procedia PDF Downloads 111
1273 Methodology to Assess the Circularity of Industrial Processes

Authors: Bruna F. Oliveira, Teresa I. Gonçalves, Marcelo M. Sousa, Sandra M. Pimenta, Octávio F. Ramalho, José B. Cruz, Flávia V. Barbosa

Abstract:

The EU Circular Economy Action Plan, launched in 2020, is one of the major initiatives to promote the transition to a more sustainable industry. The circular economy is a popular concept adopted by many companies nowadays. Some industries are better prepared for this transition than others, and the tannery industry is a sector that needs particular attention due to its strong environmental impact, caused by its dimension, its intensive resource consumption, the lack of recyclability and second use of its products, and the industrial effluents generated by its manufacturing processes. For these reasons, the zero-waste goal and the European objectives are far from being achieved. In this context, an effective methodology is needed for determining the level of circularity of tannery companies. Given the complexity of the circular economy concept, few factories have a sustainability specialist to assess the company's circularity or the ability to implement circular strategies that could benefit the manufacturing processes. Although there are several methodologies to assess circularity in specific industrial sectors, there is no easy, go-to methodology applicable in factories aiming for cleaner production. Therefore, a straightforward methodology to assess the level of circularity, in this case of a tannery industry, is presented and discussed in this work, allowing any company to measure the impact of its activities. The methodology consists of calculating an Overall Circular Index (OCI) by evaluating the circularity of four key areas (energy, material, economy, and social) in a specific factory. The index is a value between 0 and 1, where 0 means a linear economy and 1 a completely circular economy. Each key area has a sub-index, obtained through key performance indicators (KPIs) for that theme, and the OCI is the average of the four sub-indexes. Some fieldwork in the appointed company was required in order to obtain all the necessary data. By having separate sub-indexes, one can observe which areas are more linear than others; it is thus possible to work on the most critical areas by implementing strategies to increase the OCI. After these strategies are implemented, the OCI is recalculated to check the improvements made and any other changes in the remaining sub-indexes. As such, the methodology works through continuous improvement, constantly re-evaluating and improving the circularity of the factory. The methodology is also flexible enough to be implemented in any industrial sector by adapting the KPIs. It was implemented in a selected Portuguese small and medium-sized enterprise (SME) tannery and proved to be a relevant tool for measuring the circularity level of the factory. It makes it easier for non-specialists to evaluate circularity, to identify possible solutions to increase its value, and to learn how one action can impact their environment. In the end, energy and environmental inefficiencies were identified and corrected, increasing the sustainability and circularity of the company. Through this work, important contributions were provided, helping Portuguese SMEs to achieve the European and UN 2030 sustainability goals.
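
The aggregation structure described above (normalized KPIs averaged into four sub-indexes, whose mean is the OCI) can be sketched directly. The KPI definitions and their normalization are the paper's own; only the index arithmetic is shown here, with hypothetical scores.

```python
def sub_index(kpis):
    """Average a set of KPI scores, each already normalized to [0, 1]."""
    assert all(0.0 <= k <= 1.0 for k in kpis), "KPIs must be normalized to [0, 1]"
    return sum(kpis) / len(kpis)

def overall_circular_index(energy, material, economy, social):
    """OCI = mean of the four key-area sub-indexes; 0 = fully linear, 1 = fully circular."""
    return (sub_index(energy) + sub_index(material)
            + sub_index(economy) + sub_index(social)) / 4
```

Because each sub-index is reported separately, a factory can see, for instance, that its material sub-index drags the OCI down and target improvement strategies there before recalculating.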

Keywords: circular economy, circularity index, sustainability, tannery industry, zero-waste

Procedia PDF Downloads 61
1272 Hyper Parameter Optimization of Deep Convolutional Neural Networks for Pavement Distress Classification

Authors: Oumaima Khlifati, Khadija Baba

Abstract:

Pavement distress is the main factor responsible for the deterioration of road structure durability, vehicle damage, and reduced driver comfort. Transportation agencies spend a high proportion of their funds on pavement monitoring and maintenance. The auscultation of pavement distress has traditionally been based on manual surveys, which are extremely time-consuming, labor-intensive, and require domain expertise. Automatic distress detection is therefore needed to reduce the cost of manual inspection and to avoid more serious damage by implementing the appropriate remediation actions at the right time. Inspired by recent deep learning applications, this paper proposes an algorithm for automatic road distress detection and classification using a Deep Convolutional Neural Network (DCNN). In this study, the types of pavement distress are classified as transverse or longitudinal cracking, alligator cracking, pothole, and intact pavement. The dataset used in this work is composed of public asphalt pavement images. In order to learn the structure of the different types of distress, the DCNN models are trained and tested on a multi-label classification task. In addition, to obtain the highest accuracy for our model, we adjust structural hyperparameters such as the number of convolution and max-pooling layers, the number of filters, filter sizes, loss functions, activation functions, and the optimizer, as well as fine-tuning hyperparameters including batch size and learning rate. The optimization is executed by checking all feasible combinations and selecting the best-performing one. For the optimized model, performance metrics are calculated that describe the training and validation accuracy, precision, recall, and F1 score.
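
"Checking all feasible combinations and selecting the best-performing one" is an exhaustive grid search. The sketch below shows that loop; the search-space values are hypothetical placeholders mirroring the hyperparameters listed in the abstract, and `evaluate` stands in for training the DCNN and returning its validation accuracy.

```python
from itertools import product

# Hypothetical search space; names mirror the abstract's hyperparameters,
# but the actual values tried in the study are not reproduced here.
SEARCH_SPACE = {
    "conv_blocks": [2, 3, 4],
    "filters": [16, 32, 64],
    "kernel_size": [3, 5],
    "batch_size": [16, 32],
    "learning_rate": [1e-2, 1e-3],
}

def grid_search(space, evaluate):
    """Check all feasible combinations and keep the best-performing configuration."""
    keys = list(space)
    best_cfg, best_score = None, float("-inf")
    for values in product(*(space[k] for k in keys)):
        cfg = dict(zip(keys, values))
        score = evaluate(cfg)  # e.g., validation accuracy of the trained model
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score
```

Note that the grid grows multiplicatively (here 3 × 3 × 2 × 2 × 2 = 72 trainings), which is why exhaustive search is only feasible for modest search spaces.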

Keywords: pavement distress, hyperparameters, automatic classification, deep learning

Procedia PDF Downloads 77
1271 The Variable Sampling Interval Xbar Chart versus the Double Sampling Xbar Chart

Authors: Michael B. C. Khoo, J. L. Khoo, W. C. Yeong, W. L. Teoh

Abstract:

The Shewhart Xbar control chart is a useful process monitoring tool in manufacturing industries for detecting the presence of assignable causes. However, it is insensitive to small process shifts. To circumvent this problem, adaptive control charts have been suggested. An adaptive chart enables at least one of the chart's parameters to be adjusted to increase the chart's sensitivity. Two common adaptive charts in the literature are the double sampling (DS) Xbar and variable sampling interval (VSI) Xbar charts. This paper compares the performances of the DS and VSI Xbar charts based on the average time to signal (ATS) criterion. The ATS profiles of the DS Xbar and VSI Xbar charts are obtained using Mathematica and Statistical Analysis System (SAS) programs, respectively. The results show that the VSI Xbar chart is generally superior to the DS Xbar chart.
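
The VSI mechanism, sampling sooner when the last point falls in a warning region, can be illustrated with a small Monte Carlo estimate of the ATS. The paper computes ATS profiles analytically; the simulation below is only a sketch of the chart's logic, and the warning limit and interval lengths are illustrative values, not the paper's design parameters.

```python
import random
import statistics

def vsi_ats(shift, n=5, L=3.0, w=0.67, t_long=1.9, t_short=0.1,
            runs=300, seed=1):
    """Monte Carlo estimate of a VSI Xbar chart's average time to signal (ATS).
    Standardized units: in-control mean 0, sigma 1; `shift` is in sigma units."""
    rng = random.Random(seed)
    se = 1.0 / n ** 0.5                  # standard error of the sample mean
    times = []
    for _ in range(runs):
        t, interval = 0.0, t_long        # start with the long interval
        while True:
            t += interval
            xbar = statistics.fmean(rng.gauss(shift, 1.0) for _ in range(n))
            z = xbar / se
            if abs(z) > L:               # beyond the control limit: signal
                times.append(t)
                break
            # warning region -> sample again soon; central region -> wait longer
            interval = t_short if abs(z) > w else t_long
    return statistics.fmean(times)
```

Larger shifts push sample means into the warning and action regions sooner, so the estimated ATS drops sharply as the shift grows, which is exactly the sensitivity gain the VSI design targets.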

Keywords: adaptive charts, average time to signal, double sampling, variable sampling interval

Procedia PDF Downloads 274
1270 A Passive Reaction Force Compensation for a Linear Motor Motion Stage Using Pre-Compressed Springs

Authors: Kim Duc Hoang, Hyeong Joon Ahn

Abstract:

Residual vibration of the system base due to the high-acceleration motion of a stage may reduce the life and productivity of a manufacturing device. Although a passive reaction force compensation (RFC) mechanism can reduce vibration of the system base, the spring or dummy mass must be replaced to tune the performance of the RFC. In this paper, we develop a novel concept for a passive RFC mechanism for a linear motor motion stage using pre-compressed springs. The dynamic characteristic of the passive RFC can be adjusted through pre-compression of the spring, without exchanging the spring or dummy mass. First, we build a linear motor motion stage with pre-compressed springs. Then, the effect of the pre-compressed springs on the passive RFC is investigated by changing both the pre-compression and the stiffness of the springs. Finally, the effectiveness of the passive RFC using pre-compressed springs is verified with both simulations and experiments.

Keywords: linear motor motion stage, residual vibration, passive RFC, pre-compressed spring

Procedia PDF Downloads 343
1269 Knowledge Elicitation Approach for Formal Ontology Design: An Exploratory Study Applied in Industry for Knowledge Management

Authors: Ouassila Labbani-Narsis, Christophe Nicolle

Abstract:

Building formal ontologies remains a complex process for companies. In the literature, this process is based on the technical knowledge and expertise of domain experts, without further detail on the methodologies used. Problems such as disagreements between experts, the expression of tacit knowledge tied to high-level know-how that is rarely verbalized, the qualification of results through use cases, and simply securing the buy-in of the group of experts remain unsolved. This paper proposes a methodological approach based on knowledge elicitation for the conception of formal, consensual, and shared ontologies. The proposed approach is experimentally tested on industrial collaboration projects in the field of manufacturing (associating knowledge sources from multinational companies) and in the field of viticulture (associating explicit knowledge with implicit knowledge acquired through observation).

Keywords: collaborative ontology engineering, knowledge elicitation, knowledge engineering, knowledge management

Procedia PDF Downloads 69
1268 Replacement Time and Number of Preventive Maintenance Actions for Second-Hand Device

Authors: Wen Liang Chang

Abstract:

In this study, the optimal replacement time and number of preventive maintenance (PM) actions were investigated for a second-hand device. Suppose that a user intends to use a second-hand device for manufacturing products and later replaces the device with a new one. Any device failure is rectified through minimal repair, incurring a fixed repair cost to the user. If the new device fails within the free repair warranty (FRW) period, minimal repair is performed at no cost to the user; after the FRW expires, a failed device is repaired at the user's expense. In this study, two profit models were developed, and the optimal replacement time and number of PM actions were determined to maximize profit. Finally, the influence of the model parameters on the optimal replacement time and number of PM actions is elaborated on using numerical examples.
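
The structure of such a profit-maximization problem can be sketched numerically. The failure-intensity form, the effect attributed to each PM action, and all cost parameters below are hypothetical placeholders; the paper's two profit models are not reproduced here, only the idea of searching over the replacement time and PM count.

```python
def expected_failures(T, beta=2.0, eta=5.0):
    """Power-law cumulative failure intensity under minimal repair (Weibull-type)."""
    return (T / eta) ** beta

def profit_rate(T, m, revenue=100.0, c_repair=30.0, c_pm=10.0, c_device=200.0,
                pm_effect=0.1):
    """Simplified long-run profit per unit time for replacement age T with m PM
    actions, each assumed to cut the failure intensity by a factor (1 - pm_effect)."""
    failures = expected_failures(T) * (1.0 - pm_effect) ** m
    return (revenue * T - c_repair * failures - c_pm * m - c_device) / T

def best_policy(max_T=20, max_pm=10):
    """Grid search over integer replacement times and PM counts."""
    return max(((T, m) for T in range(1, max_T + 1) for m in range(max_pm + 1)),
               key=lambda tm: profit_rate(*tm))
```

In an analysis of this kind, the trade-off is between running the device longer (more revenue, but more minimal repairs as the intensity grows) and the cost of PM actions and replacement.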

Keywords: second-hand device, preventive maintenance, replacement time, device failure

Procedia PDF Downloads 461