Search results for: autonomous beach cleaning machine
743 Optimization of Alkali Silicate Glass Heat Treatment for the Improvement of Thermal Expansion and Flexural Strength
Authors: Stephanie Guerra-Arias, Stephani Nevarez, Calvin Stewart, Rachel Grodsky, Denis Eichorst
Abstract:
The objective of this study is to describe the framework for optimizing the heat treatment of alkali silicate glasses to enhance the performance of hermetic seals in extreme environments. When connectors are exposed to elevated temperatures, residual stresses develop due to the mismatch of thermal expansions between the glass, metal pin, and metal shell. Excessive thermal expansion mismatch compromises the reliability of hermetic seals. In this study, a series of heat treatment schedules will be performed on two commercial sealing glasses (one conventional sealing glass and one crystallizable sealing glass) using a design of experiments (DOE) approach. The coefficient of thermal expansion (CTE) will be measured pre- and post-heat treatment using thermomechanical analysis (TMA). Afterwards, the flexural strength of the specimens will be measured using a four-point bend fixture mounted in a static universal testing machine. The measured material properties will be statistically analyzed using Minitab software to determine which factors of the heat treatment process have a strong correlation to the coefficient of thermal expansion and/or flexural strength. Finally, a heat treatment will be designed and tested to ensure the optimal performance of the hermetic seals in connectors.
Keywords: glass-ceramics, design of experiment, hermetic connectors, material characterization
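The DOE approach above amounts to enumerating heat-treatment runs over factor levels. A minimal full-factorial sketch is shown below; the factor names and levels are illustrative assumptions, not the study's actual schedule:

```python
from itertools import product

def full_factorial(factors):
    """Enumerate every combination of factor levels (a full-factorial DOE).
    `factors` maps factor name -> list of levels."""
    names = list(factors)
    return [dict(zip(names, combo)) for combo in product(*factors.values())]

# Hypothetical heat-treatment factors (not the study's actual levels):
runs = full_factorial({
    "peak_temp_C": [450, 500, 550],
    "hold_time_min": [30, 60],
    "cooling_rate_C_per_min": [2, 5],
})
print(len(runs))  # 3 * 2 * 2 = 12 runs
```

In practice a fractional or Taguchi design would trim this run list; the full factorial is simply the easiest to sketch.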
Procedia PDF Downloads 150
742 Diversity Indices as a Tool for Evaluating Quality of Water Ways
Authors: Khadra Ahmed, Khaled Kheireldin
Abstract:
In this paper, we present a pedestrian detection descriptor called Fused Structure and Texture (FST) features, based on the combination of the local phase information with the texture features. Since the phase of the signal conveys more structural information than the magnitude, the phase congruency concept is used to capture the structural features. On the other hand, the Center-Symmetric Local Binary Pattern (CSLBP) approach is used to capture the texture information of the image. The dimensionless nature of the phase congruency and the robustness of the CSLBP operator on flat images, as well as under blur and illumination changes, make the proposed descriptor more robust and less sensitive to light variations. The proposed descriptor is formed by extracting the phase congruency and the CSLBP values of each pixel of the image with respect to its neighborhood. The histogram of the oriented phase and the histogram of the CSLBP values for the local regions in the image are computed and concatenated to construct the FST descriptor. Several experiments were conducted on the INRIA and the low-resolution DaimlerChrysler datasets to evaluate the detection performance of the pedestrian detection system based on the FST descriptor. A linear Support Vector Machine (SVM) is used to train the pedestrian classifier. These experiments showed that the proposed FST descriptor has better detection performance than a set of state-of-the-art feature extraction methodologies.
Keywords: planktons, diversity indices, water quality index, water ways
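The CSLBP part of the descriptor can be sketched for a single 3x3 neighbourhood; the threshold and patch values below are illustrative, not taken from the paper:

```python
def cslbp_code(patch, T=0.01):
    """Center-Symmetric LBP code for one 3x3 patch (a sketch of the idea).
    Compares the four center-symmetric neighbour pairs; threshold T tames
    flat regions. Returns a 4-bit code in 0..15."""
    # neighbours clockwise from top-left; the centre pixel is unused by CSLBP
    n = [patch[0][0], patch[0][1], patch[0][2], patch[1][2],
         patch[2][2], patch[2][1], patch[2][0], patch[1][0]]
    code = 0
    for i in range(4):
        if n[i] - n[i + 4] > T:
            code |= 1 << i
    return code

patch = [[0.9, 0.9, 0.9],
         [0.1, 0.5, 0.1],
         [0.1, 0.1, 0.1]]
print(cslbp_code(patch))  # → 7 (top row brighter than bottom row)
```

The full descriptor would histogram these codes over local regions and concatenate them with the phase-congruency histograms.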
Procedia PDF Downloads 518
741 Selective Laser Melting (SLM) Process and Its Influence on the Machinability of TA6V Alloy
Authors: Rafał Kamiński, Joel Rech, Philippe Bertrand, Christophe Desrayaud
Abstract:
Titanium alloys are among the most important materials in the aircraft industry due to their low density, high strength, and corrosion resistance. However, these alloys are considered difficult to machine because they have poor thermal properties and high reactivity with cutting tools. The Selective Laser Melting (SLM) process has become increasingly popular throughout industry since it enables the design of new complex components that cannot be manufactured by standard processes. However, the high temperature reached during the melting phase, as well as the several rapid heating and cooling phases due to the movement of the laser, induce complex microstructures. These microstructures differ from the conventional equiaxed ones obtained by casting and forging. Parts obtained by SLM have to be machined in order to calibrate the dimensions and the surface roughness of functional surfaces. The ball milling technique is widely applied to finish complex shapes. However, the machinability of titanium is strongly influenced by the microstructure. So the objective of this work is to investigate the influence of the SLM process, i.e. the microstructure, on the machinability of titanium compared to conventional forming processes. The machinability is analyzed by measuring surface roughness, cutting forces, and cutting tool wear for a range of cutting conditions (depth of cut ap, feed per tooth fz, spindle speed N) in accordance with industrial practices.
Keywords: ball milling, microstructure, surface roughness, titanium
Procedia PDF Downloads 297
740 A Fuzzy-Rough Feature Selection Based on Binary Shuffled Frog Leaping Algorithm
Authors: Javad Rahimipour Anaraki, Saeed Samet, Mahdi Eftekhari, Chang Wook Ahn
Abstract:
Feature selection and attribute reduction are crucial problems and widely used techniques in the fields of machine learning, data mining, and pattern recognition to overcome the well-known phenomenon of the Curse of Dimensionality. This paper presents a feature selection method that efficiently carries out attribute reduction, thereby selecting the most informative features of a dataset. It consists of two components: 1) a measure for feature subset evaluation, and 2) a search strategy. For the evaluation measure, we have employed the fuzzy-rough dependency degree (FRDD) of the lower approximation-based fuzzy-rough feature selection (L-FRFS) due to its effectiveness in feature selection. As for the search strategy, a modified version of a binary shuffled frog leaping algorithm (B-SFLA) is proposed. The proposed feature selection method is obtained by hybridizing the B-SFLA with the FRDD. Nine classifiers have been employed to compare the proposed approach with several existing methods over twenty-two datasets, including nine high-dimensional and large ones, from the UCI repository. The experimental results demonstrate that the B-SFLA approach significantly outperforms other metaheuristic methods in terms of the number of selected features and the classification accuracy.
Keywords: binary shuffled frog leaping algorithm, feature selection, fuzzy-rough set, minimal reduct
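In a binary SFLA, each frog is a bit-vector marking selected features, and the worst frog in a memeplex leaps toward the best one. One common way to binarize the continuous leap is a sigmoid transfer function, sketched below; the paper's actual operator may differ, and the positions used are toy values:

```python
import math
import random

def binary_frog_step(worst, best, rng):
    """One B-SFLA-style move: the worst frog leaps toward the best frog,
    and the continuous step is squashed through a sigmoid to decide each
    bit of the new binary position (a sketch, not the paper's exact rule)."""
    new = []
    for wb, bb in zip(worst, best):
        step = rng.random() * (bb - wb)              # leap component per bit
        prob = 1.0 / (1.0 + math.exp(-(wb + step)))  # sigmoid transfer
        new.append(1 if rng.random() < prob else 0)
    return new

rng = random.Random(42)
print(binary_frog_step([0, 1, 0, 1], [1, 1, 0, 0], rng))
```

Each resulting bit-vector would then be scored by the fuzzy-rough dependency degree, keeping the move only if it improves the frog.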
Procedia PDF Downloads 225
739 Taguchi-Based Six Sigma Approach to Optimize Surface Roughness for Milling Processes
Authors: Sky Chou, Joseph C. Chen
Abstract:
This paper focuses on using Six Sigma methodologies to improve the surface roughness of a manufactured part produced by a CNC milling machine. It presents a case study where the surface roughness of milled aluminum is required to reduce or eliminate defects and to improve the process capability indices Cp and Cpk for a CNC milling process. The Six Sigma methodology, the DMAIC (define, measure, analyze, improve, and control) approach, was applied in this study to improve the process, reduce defects, and ultimately reduce costs. The Taguchi-based Six Sigma approach was applied to identify the optimized processing parameters that led to the targeted surface roughness specified by our customer. An L9 orthogonal array was applied in the Taguchi experimental design, with four controllable factors and one non-controllable/noise factor. The four controllable factors identified consist of feed rate, depth of cut, spindle speed, and surface roughness. The noise factor is the difference between the old cutting tool and the new cutting tool. The confirmation run with the optimal parameters confirmed that the new parameter settings are correct. The new settings also improved the process capability index. This study shows that the Taguchi-based Six Sigma approach can be used efficiently to phase out defects and improve the process capability index of the CNC milling process.
Keywords: CNC machining, six sigma, surface roughness, Taguchi methodology
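Taguchi analysis ranks parameter settings by a signal-to-noise ratio; for a smaller-the-better response such as surface roughness, SN = -10·log10(mean(y²)). A minimal computation, with illustrative replicate values rather than the study's data:

```python
import math

def sn_smaller_the_better(values):
    """Taguchi signal-to-noise ratio for a 'smaller the better' response
    such as surface roughness: SN = -10 * log10(mean(y^2))."""
    return -10.0 * math.log10(sum(v * v for v in values) / len(values))

# Illustrative roughness replicates (um) for two parameter settings:
print(round(sn_smaller_the_better([0.8, 0.9, 0.85]), 2))
print(round(sn_smaller_the_better([0.4, 0.5, 0.45]), 2))  # smoother -> higher SN
```

The setting with the higher SN across the L9 runs is the one the confirmation run would verify.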
Procedia PDF Downloads 242
738 Supported Gold Nanocatalysts for CO Oxidation in Mainstream Cigarette Smoke
Authors: Krasimir Ivanov, Dimitar Dimitrov, Tatyana Tabakova, Stefka Kirkova, Anna Stoilova, Violina Angelova
Abstract:
It has been suggested that nicotine, CO, and tar in mainstream smoke are the most important substances and have been judged as the most harmful compounds responsible for the health hazards of smoking. As nicotine is extremely important for the smoking qualities of cigarettes, and the tar yield in the tobacco smoke is significantly reduced due to the use of filters with various contents and designs, the main efforts of cigarette researchers and manufacturers are related to the search for opportunities for CO content reduction. A highly active ceria-supported gold catalyst was prepared by the deposition-precipitation method, and the possibilities for CO oxidation in a synthetic gaseous mixture were evaluated using continuous-flow equipment with a fixed-bed glass reactor at atmospheric pressure. The efficiency of the catalyst in CO oxidation in real cigarette smoke was examined by a single-port, puff-by-puff smoking machine. A quality assessment of smoking using a cigarette holder containing the catalyst was carried out. It was established that the catalytic activity toward CO oxidation in cigarette smoke rapidly decreases from 70% for the first cigarette to nearly zero for the twentieth cigarette. The present study shows that there are two critical factors which do not permit the successful use of catalysts to reduce the CO content in mainstream cigarette smoke: (i) the significant influence of the processes of adsorption and oxidation on the main characteristics of tobacco products, and (ii) the rapid deactivation of the catalyst due to the covering of the catalyst’s grains with condensate.
Keywords: cigarette smoke, CO oxidation, gold catalyst, mainstream
Procedia PDF Downloads 219
737 A Comparison between Shear Bond Strength of VMK Master Porcelain with Three Base-Metal Alloys (Ni-Cr-T3, Verabond, Super Cast) and One Noble Alloy (X-33) in Metal-Ceramic Restorations
Authors: Ammar Neshati, Elham Hamidi Shishavan
Abstract:
Statement of Problem: The increase in the use of metal-ceramic restorations and a high prevalence of porcelain chipping entail introducing an alloy which is more compatible with porcelain and which causes a stronger bond between the two. This study compares the shear bond strength of three base-metal alloys and one noble alloy with the common VMK Master Porcelain. Materials and Method: Three different groups of base-metal alloys (Ni-Cr-T3, Super Cast, Verabond) and one group of a noble alloy (X-33) were selected. The number of alloys in each group was 15. All the groups went through the casting process and changed from wax patterns into metal disks. Then, VMK Master Porcelain was fired on each group. All the specimens were put in the UTM, and a shear force was loaded until a fracture occurred. The fracture force was then recorded by the machine. The data were analyzed with SPSS Version 16, and one-way ANOVA was run to compare shear strength between the groups. Furthermore, the groups were compared two by two by running the Tukey test. Results: The findings of this study revealed that the shear bond strength of the Ni-Cr-T3 alloy was higher than that of the three other alloys (94 MPa or 330 N). The Super Cast alloy had the second greatest shear bond strength (80.87 MPa or 283.87 N). Both Verabond (69.66 MPa or 245 N) and X-33 alloys (66.53 MPa or 234 N) took the third place. Conclusion: Ni-Cr-T3 with VMK Master Porcelain has the greatest shear bond strength. Therefore, the use of this low-cost alloy is recommended in metal-ceramic restorations.
Keywords: shear bond, base-metal alloy, noble alloy, porcelain
Procedia PDF Downloads 488
736 Human-factor and Ergonomics in Bottling Lines
Authors: Parameshwaran Nair
Abstract:
Filling and packaging lines for the bottling of beverages into glass, PET or aluminum containers require specialized expertise and a different configuration of equipment, such as a filler, warmer, labeller, crater/recrater, shrink packer, carton erector, carton sealer, date coder, palletizer, etc. Over time, the packaging industry has evolved from manually operated single-station machines to highly automated high-speed lines. Human factors and ergonomics have gained significant consideration in this course of transformation. A prerequisite for such bottling lines, irrespective of the container type and size, is to be suitable for multi-format applications. They should also be able to handle format changeovers with minimal adjustment. They should have variable capacities and speeds, providing great flexibility of use in managing accumulation times as a function of production characteristics. In terms of layout as well, they should demonstrate flexibility for operator movement and access to machine areas for maintenance. Packaging technology during the past few decades has risen to these challenges through a series of major breakthroughs interspersed with periods of refinement and improvement. The milestones are many and varied and are described briefly in this paper. In order to provide a brief understanding of human factors and ergonomics in modern packaging lines, this paper highlights the various technologies, design considerations and statutory requirements in packaging equipment for the different types of containers used in India.
Keywords: human-factor, ergonomics, bottling lines, automated high-speed lines
Procedia PDF Downloads 437
735 Major Depressive Disorder: Diagnosis Based on Electroencephalogram Analysis
Authors: Wajid Mumtaz, Aamir Saeed Malik, Syed Saad Azhar Ali, Mohd Azhar Mohd Yasin
Abstract:
In this paper, a technique based on electroencephalogram (EEG) analysis is presented, aimed at diagnosing major depressive disorder (MDD) among a potential population of MDD patients and healthy controls. EEG is recognized as a clinical modality in applications such as seizure diagnosis, indexing of anesthesia, and detection of brain death or stroke. However, its usability for psychiatric illnesses such as MDD is less studied. Therefore, in this study, for the sake of diagnosis, two groups of study participants were recruited: 1) MDD patients, and 2) healthy people as controls. EEG data acquired from both groups were analyzed involving inter-hemispheric asymmetry and the composite permutation entropy index (CPEI). To automate the process, derived quantities from the EEG were utilized as inputs to classifiers such as logistic regression (LR) and the support vector machine (SVM). The learning of these classification models was tested with a test dataset. Their learning efficiency is provided as the accuracy of classifying MDD patients from controls, and their sensitivities and specificities were reported accordingly (LR = 81.7% and SVM = 81.5%). Based on the results, it is concluded that the derived measures are indicators for diagnosing MDD from a potential population of normal controls. In addition, the results motivate further exploring other measures for the same purpose.
Keywords: major depressive disorder, diagnosis based on EEG, EEG derived features, CPEI, inter-hemispheric asymmetry
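The inter-hemispheric asymmetry feature is commonly computed as a log-power difference between homologous electrodes; the formula and the electrode pair below are standard-practice assumptions, not necessarily the paper's exact definition:

```python
import math

def asymmetry_index(power_left, power_right):
    """Inter-hemispheric asymmetry for one electrode pair, using the common
    log-power difference ln(right) - ln(left). Positive values mean more
    right-hemisphere power in the band of interest."""
    return math.log(power_right) - math.log(power_left)

# Hypothetical alpha-band powers (uV^2) at a frontal pair (e.g. F3/F4):
print(round(asymmetry_index(12.0, 18.0), 3))
```

One such value per electrode pair and band, together with the CPEI, would form the feature vector fed to the LR and SVM classifiers.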
Procedia PDF Downloads 546
734 A Psychophysiological Evaluation of an Effective Recognition Technique Using Interactive Dynamic Virtual Environments
Authors: Mohammadhossein Moghimi, Robert Stone, Pia Rotshtein
Abstract:
Recording psychological and physiological correlates of human performance within virtual environments and interpreting their impacts on human engagement, ‘immersion’ and related emotional or ‘affective’ states is both academically and technologically challenging. By exposing participants to an affective, real-time (game-like) virtual environment, designed and evaluated in an earlier study, a psychophysiological database containing the EEG, GSR and heart rate of 30 male and female gamers, exposed to 10 games, was constructed. Some 174 features were subsequently identified and extracted from a number of windows, with 28 different timing lengths (e.g. 2, 3, 5, etc. seconds). After reducing the number of features to 30 using a feature selection technique, K-Nearest Neighbour (KNN) and Support Vector Machine (SVM) methods were subsequently employed for the classification process. The classifiers categorised the psychophysiological database into four affective clusters (defined based on a 3-dimensional space of valence, arousal and dominance) and eight emotion labels (relaxed, content, happy, excited, angry, afraid, sad, and bored). The KNN and SVM classifiers achieved average cross-validation accuracies of 97.01% (±1.3%) and 92.84% (±3.67%), respectively. However, no significant differences were found in the classification process based on affective clusters or emotion labels.
Keywords: virtual reality, affective computing, affective VR, emotion-based affective physiological database
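The KNN classification stage can be sketched as a simple distance-weighted vote; the emotion labels follow the paper, while the toy feature vectors (standing in for the 30 selected psychophysiological features) are illustrative:

```python
from collections import Counter

def knn_predict(train, labels, query, k=3):
    """Minimal K-Nearest-Neighbour vote over feature vectors (a sketch of
    the classification stage; squared Euclidean distance, majority vote)."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    ranked = sorted(range(len(train)), key=lambda i: dist(train[i], query))
    votes = Counter(labels[i] for i in ranked[:k])
    return votes.most_common(1)[0][0]

train = [[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]]
labels = ["relaxed", "relaxed", "excited", "excited"]
print(knn_predict(train, labels, [0.85, 0.85]))  # → excited
```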
Procedia PDF Downloads 233
733 On Exploring Search Heuristics for Improving the Efficiency in Web Information Extraction
Authors: Patricia Jiménez, Rafael Corchuelo
Abstract:
Nowadays the World Wide Web is the most popular source of information, relying on billions of on-line documents. Web mining is used to crawl through these documents, collect the information of interest and process it by applying data mining tools, in order to use the gathered information in the best interest of a business, which enables companies to promote theirs. Unfortunately, it is not easy to extract the information a web site provides automatically when it lacks an API that transforms the user-friendly data provided in web documents into a structured format that is machine-readable. Rule-based information extractors are tools intended to extract the information of interest automatically and offer it in a structured format that allows mining tools to process it. However, the performance of an information extractor strongly depends on the search heuristic employed, since bad choices regarding how to learn a rule may easily result in loss of effectiveness and/or efficiency. Improving search heuristics regarding efficiency is of utmost importance in the field of Web Information Extraction, since typical datasets are very large. In this paper, we employ an information extractor based on a classical top-down algorithm that uses the so-called Information Gain heuristic introduced by Quinlan and Cameron-Jones. Unfortunately, the Information Gain suffers from some well-known problems, so we analyse an intuitive alternative, Termini, that is clearly more efficient; we also analyse other proposals in the literature and conclude that none of them outperforms the previous alternative.
Keywords: information extraction, search heuristics, semi-structured documents, web mining
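The Information Gain heuristic of Quinlan and Cameron-Jones scores a rule refinement by the positives it retains and the purity it gains. A minimal sketch follows; the coverage counts are illustrative, and taking t as the positives covered after refinement is a simplifying assumption:

```python
import math

def foil_gain(p0, n0, p1, n1):
    """Quinlan & Cameron-Jones-style information gain for rule refinement:
    p/n are positive/negative examples covered before (0) and after (1)
    the refinement; t is the positives still covered afterwards."""
    t = p1
    I0 = -math.log2(p0 / (p0 + n0))  # information needed before refinement
    I1 = -math.log2(p1 / (p1 + n1))  # information needed after refinement
    return t * (I0 - I1)

# A refinement that keeps 40 of 50 positives but drops negatives 50 -> 5:
print(round(foil_gain(50, 50, 40, 5), 2))
```

A top-down learner would evaluate this gain for every candidate refinement and greedily pick the maximum, which is exactly where the search-heuristic cost the paper targets comes from.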
Procedia PDF Downloads 335
732 Nuancing the Indentured Migration in Amitav Ghosh's Sea of Poppies
Authors: Murari Prasad
Abstract:
This paper is motivated by the implications of indentured migration depicted in Amitav Ghosh’s critically acclaimed novel, Sea of Poppies (2008). Ghosh’s perspective on the experiences of North Indian indentured labourers moving from their homeland to a distant and unknown location across the seas suggests a radical attitudinal change among the migrants on board the Ibis, a schooner chartered to carry the recruits from Calcutta to Mauritius in the late 1830s. The novel unfolds the life-altering trauma of the bonded servants, including their efforts to maintain a sense of self while negotiating significant social and cultural transformations during the voyage which leads to the breakdown of familiar life-worlds. Equally, the migrants are introduced to an alternative network of relationships to ensure their survival away from land. They relinquish their entrenched beliefs and prejudices and commit themselves to a new brotherhood formed by ‘ship siblings.’ With the official abolition of direct slavery in 1833, the supply of cheap labour to the sugar plantation in British colonies as far-flung as Mauritius and Fiji to East Africa and the Caribbean sharply declined. Around the same time, China’s attempt to prohibit the illegal importation of opium from British India into China threatened the lucrative opium trade. To run the ever-profitable plantation colonies with cheap labour, Indian peasants, wrenched from their village economies, were indentured to plantations as girmitiyas (vernacularized from ‘agreement’) by the colonial government using the ploy of an optional form of recruitment. After the British conquest of the Isle of France in 1810, Mauritius became Britain’s premier sugar colony bringing waves of Indian immigrants to the island. 
In the articulations of their subjectivities one notices how the recruits cope with the alienating drudgery of indenture, mitigate the hardships of the voyage and forge new ties with pragmatic acts of cultural syncretism in a forward-looking autonomous community of ‘ship-siblings’ following the fracture of traditional identities. This paper tests the hypothesis that Ghosh envisions a kind of futuristic/utopian political collectivity in a hierarchically rigid, racially segregated and identity-obsessed world. In order to ground the claim and frame the complex representations of alliance and love across the boundaries of caste, religion, gender and nation, the essential methodology here is a close textual analysis of the novel. This methodology will be geared to explicate the utopian futurity that the novel gestures towards by underlining new regulations of life during voyage and dissolution of multiple differences among the indentured migrants on board the Ibis.
Keywords: indenture, colonial, opium, sugar plantation
Procedia PDF Downloads 398
731 Automatic Early Breast Cancer Segmentation Enhancement by Image Analysis and Hough Transform
Authors: David Jurado, Carlos Ávila
Abstract:
Detection of early signs of breast cancer development is crucial to quickly diagnose the disease and to define adequate treatment to increase the survival probability of the patient. Computer Aided Detection systems (CADs), along with modern data techniques such as Machine Learning (ML) and Neural Networks (NN), have shown an overall improvement in digital mammography cancer diagnosis, reducing the false positive and false negative rates and becoming important tools for the diagnostic evaluations performed by specialized radiologists. However, ML and NN-based algorithms rely on datasets that might bring issues to the segmentation tasks. In the present work, an automatic segmentation and detection algorithm is described. This algorithm uses image processing techniques along with the Hough transform to automatically identify microcalcifications that are highly correlated with breast cancer development in the early stages. Along with image processing, automatic segmentation of high-contrast objects is done using edge extraction and the circle Hough transform. This provides the geometrical features needed for an automatic mask design which extracts statistical features of the regions of interest. The results shown in this study prove the potential of this tool for further diagnostics and classification of mammographic images due to its low sensitivity to noisy images and low-contrast mammograms.
Keywords: breast cancer, segmentation, X-ray imaging, Hough transform, image analysis
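The circle Hough transform at the core of the segmentation step can be sketched for a single known radius; the grid size, angular step, and synthetic edge points below are illustrative, not the paper's parameters:

```python
import math

def circle_hough_centers(edge_points, radius, grid=32):
    """Minimal circle Hough transform for one known radius (a sketch of the
    detection stage): each edge point votes for every centre that could put
    it on a circle of that radius; the accumulator peak is the best centre."""
    acc = [[0] * grid for _ in range(grid)]
    for (x, y) in edge_points:
        for theta in range(0, 360, 5):
            a = round(x - radius * math.cos(math.radians(theta)))
            b = round(y - radius * math.sin(math.radians(theta)))
            if 0 <= a < grid and 0 <= b < grid:
                acc[b][a] += 1
    peak = max((acc[b][a], a, b) for b in range(grid) for a in range(grid))
    return peak[1], peak[2]  # (a, b) of the strongest centre

# Synthetic edge points on a circle centred at (16, 16), radius 6:
pts = [(16 + round(6 * math.cos(math.radians(t))),
        16 + round(6 * math.sin(math.radians(t)))) for t in range(0, 360, 15)]
print(circle_hough_centers(pts, 6))  # peak lands at/near the true centre (16, 16)
```

A real pipeline would sweep a small range of radii (microcalcifications are only a few pixels across) and threshold the accumulator rather than take a single peak.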
Procedia PDF Downloads 83
730 Enhancing Information Technologies with AI: Unlocking Efficiency, Scalability, and Innovation
Authors: Abdal-Hafeez Alhussein
Abstract:
Artificial Intelligence (AI) has become a transformative force in the field of information technologies, reshaping how data is processed, analyzed, and utilized across various domains. This paper explores the multifaceted applications of AI within information technology, focusing on three key areas: automation, scalability, and data-driven decision-making. We delve into how AI-powered automation is optimizing operational efficiency in IT infrastructures, from automated network management to self-healing systems that reduce downtime and enhance performance. Scalability, another critical aspect, is addressed through AI’s role in cloud computing and distributed systems, enabling the seamless handling of increasing data loads and user demands. Additionally, the paper highlights the use of AI in cybersecurity, where real-time threat detection and adaptive response mechanisms significantly improve resilience against sophisticated cyberattacks. In the realm of data analytics, AI models, especially machine learning and natural language processing, are driving innovation by enabling more precise predictions, automated insights extraction, and enhanced user experiences. The paper concludes with a discussion on the ethical implications of AI in information technologies, underscoring the importance of transparency, fairness, and responsible AI use. It also offers insights into future trends, emphasizing the potential of AI to further revolutionize the IT landscape by integrating with emerging technologies like quantum computing and IoT.
Keywords: artificial intelligence, information technology, automation, scalability
Procedia PDF Downloads 17
729 Crop Classification Using Unmanned Aerial Vehicle Images
Authors: Iqra Yaseen
Abstract:
Image processing, one of the well-known areas of computer science and engineering, has been essential to automation in the context of computer vision. In remote sensing, medical science, and many other fields, it has made it easier to uncover previously undiscovered facts. The grading of diverse items is now possible because of neural network algorithms, categorization, and digital image processing. Its use in the classification of agricultural products, particularly in the grading of seeds or grains and their cultivars, is widely recognized. A grading and sorting system enables the preservation of time, consistency, and uniformity. Global population growth has led to an increase in demand for food staples, biofuel, and other agricultural products. To meet this demand, available resources must be used and managed more effectively. Image processing is rapidly growing in the field of agriculture. Many applications have been developed using this approach for crop identification and classification, land and disease detection, and for measuring other crop parameters. Vegetation localization is the basis of performing these tasks, as vegetation helps to identify the area where the crop is present. The productivity of the agriculture industry can be increased via image processing that is based upon Unmanned Aerial Vehicle and satellite photography. In this paper we apply machine learning techniques such as Convolutional Neural Networks, deep learning, image processing, classification, and You Only Look Once (YOLO) to a UAV imaging dataset to divide the crop into distinct groups and choose the best way to use it.
Keywords: image processing, UAV, YOLO, CNN, deep learning, classification
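Vegetation localization in RGB UAV frames is often bootstrapped with a greenness index; the Excess-Green index below is a common choice (the paper does not name its exact method), and the sample pixel values are illustrative:

```python
def excess_green(r, g, b):
    """Excess-Green index ExG = 2g - r - b on chromatic coordinates, a
    common first step for vegetation localization: high values flag
    vegetation pixels, which can then be masked for crop classification."""
    s = r + g + b
    if s == 0:
        return 0.0
    rn, gn, bn = r / s, g / s, b / s
    return 2 * gn - rn - bn

# A green crop pixel vs a bare-soil pixel (illustrative 8-bit RGB values):
print(round(excess_green(60, 160, 50), 3))   # clearly positive -> vegetation
print(round(excess_green(120, 100, 80), 3))  # near zero -> background
```

Thresholding this index gives a vegetation mask whose regions feed the CNN/YOLO classification stage.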
Procedia PDF Downloads 107
728 Material Handling Equipment Selection Using Fuzzy AHP Approach
Authors: Priyanka Verma, Vijaya Dixit, Rishabh Bajpai
Abstract:
This research paper is aimed at selecting the appropriate material handling equipment among the given choices so that the automation level in material handling can be enhanced. This work is a practical case scenario of material handling systems in a consumer electronic appliances manufacturing organization. The choices of material handling equipment among which the decision has to be made are Automated Guided Vehicles (AGVs), Autonomous Mobile Robots (AMRs), Overhead Conveyors (OCs) and Battery Operated Trucks/Vehicles (BOTs). There is a need to attain a certain level of automation in order to reduce human interventions in the organization. This requirement of achieving a certain degree of automation can be attained by the material handling equipment mentioned above. The main motive for selecting the above equipment for study was solely based on the corporate financial strategy of investment and the return obtained through that investment made in a stipulated time framework. Since low-cost automation with respect to material handling devices has to be achieved, these equipment types were selected. The investment to be done on each unit of this equipment is less than 20 lakh rupees (INR), and the recovery period is less than five years. The fuzzy analytic hierarchy process (FAHP) is applied here for selecting equipment, where the four choices are evaluated on the basis of four major criteria and 13 sub-criteria, and are prioritized on the basis of the weights obtained. The FAHP used here makes use of triangular fuzzy numbers (TFNs). The inability of the traditional AHP to deal with the subjectiveness and impreciseness in the pair-wise comparison process has been improved in the FAHP. The range of values for general rating purposes for all decision making parameters is kept between 0 and 1 on the basis of expert opinions captured on the shop floor. These experts were familiar with the operating environment and shop floor activity control.
Instead of generating exact values, the FAHP generates ranges of values to accommodate the uncertainty in the decision-making process. The four major criteria selected for the evaluation of the choices of material handling equipment available are materials, technical capabilities, cost and other features. The thirteen sub-criteria listed under these four major criteria are weighing capacity, load per hour, material compatibility, capital cost, operating cost and maintenance cost, speed, distance moved, space required, frequency of trips, control required, safety and reliability issues. The key finding shows that among the four major criteria selected, cost emerged as the most important criterion and is one of the key decision making aspects on the basis of which material handling equipment selection is based. While further evaluating the choices of equipment available for each sub-criterion, it is found that the AGV scores the highest weight in most of the sub-criteria. On carrying out the complete analysis, the research shows that the AGV is the best material handling equipment suiting all the decision criteria selected in the FAHP, and therefore it is beneficial for the organization to carry out automated material handling in the facility using AGVs.
Keywords: fuzzy analytic hierarchy process (FAHP), material handling equipment, subjectiveness, triangular fuzzy number (TFN)
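The final FAHP step, turning triangular fuzzy numbers into crisp criterion weights, can be sketched with centroid defuzzification. The TFN ratings below are hypothetical, and the extent analysis, pair-wise comparison matrices, and consistency checks of the full method are omitted:

```python
def defuzzify(tfn):
    """Centroid defuzzification of a triangular fuzzy number (l, m, u)."""
    l, m, u = tfn
    return (l + m + u) / 3.0

def crisp_weights(tfn_scores):
    """Turn TFN ratings of criteria into normalized crisp weights -- a
    minimal sketch of the last FAHP step only."""
    crisp = [defuzzify(t) for t in tfn_scores]
    total = sum(crisp)
    return [c / total for c in crisp]

# Hypothetical TFN ratings (0..1 scale, as in the paper) for the four
# major criteria: materials, technical capabilities, cost, other features
ratings = [(0.4, 0.5, 0.6), (0.5, 0.6, 0.7), (0.7, 0.8, 0.9), (0.2, 0.3, 0.4)]
w = crisp_weights(ratings)
print([round(x, 3) for x in w])  # cost gets the largest weight
```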
Procedia PDF Downloads 434
727 Continuous Improvement as an Organizational Capability in the Industry 4.0 Era
Authors: Lodgaard Eirin, Myklebust Odd, Eleftheriadis Ragnhild
Abstract:
Continuous improvement is increasingly becoming a prerequisite for manufacturing companies to remain competitive in a global market. In addition, future survival and success will depend on the ability to manage the forthcoming digitalization transformation in the industry 4.0 era. Industry 4.0 promises substantially increased operational effectiveness, where all equipment is equipped with integrated processing and communication capabilities. Subsequently, the interplay of human and technology will evolve and influence the range of worker tasks and demands. Taking into account these changes, the concept of continuous improvement must evolve accordingly. Based on a case study from the manufacturing industry, the purpose of this paper is to point out what the concept of continuous improvement will meet and has to take into consideration when entering the 4th industrial revolution. In the past, continuous improvement has focused on a culture of sustained improvement targeting the elimination of waste in all systems and processes of an organization by involving everyone. Today, it has to evolve with the forthcoming digital transformation and the increased interplay of human and digital communication systems to reach its full potential. One main finding of this study is how digital communication systems will act as an enabler to strengthen the continuous improvement process, by moving from collaboration within individual teams to the interconnection of teams along the product value chain. For academics and practitioners, it will help them to identify and prioritize their steps towards an industry 4.0 implementation integrated with a focus on continuous improvement.
Keywords: continuous improvement, digital communication system, human-machine interaction, industry 4.0, team performance
Procedia PDF Downloads 204
726 Influence of Exfoliated Graphene Nanoplatelets on Thermal Stability of Polypropylene Reinforced Hybrid Graphene-Rice Husk Nanocomposites
Authors: Obinna Emmanuel Ezenkwa, Sani Amril Samsudin, Azman Hassan, Ede Anthony
Abstract:
A major challenge for polypropylene (PP) in high-heat application areas is its poor thermal stability: under high temperature, PP burns readily, degrades, and can self-ignite. In this study, PP is reinforced with a hybrid filler of graphene nanoplatelets (xGNP) and rice husk (RH), with RH at 15 wt% and xGNP varied at 0.5, 1.0, 1.5, 2.0, 2.5, and 3.0 parts per hundred (phr) of the composite. The compatibilizer MAPP was also added to each sample at 4 phr of the composite. Sample formulations were melt-blended using a twin-screw extruder and an injection moulding machine. At the optimum xGNP content of 1.5 phr, the hybrid PP/RH/G1.5/MAPP nanocomposite increased in thermal stability by 24 °C and 30 °C compared to pure PP and the unhybridized PP/RH composite, respectively; char residue increased by 513% compared to pure PP, and the degree of crystallization (Xc) increased from 35.4% to 36.4%. The observed enhancement of thermal properties in the hybrid nanocomposites can be related to the high surface area, gap-filling effect, and exfoliation characteristics of the graphene nanofiller, which worked in synergy with the rice husk filler in reinforcing PP. This study therefore shows that the inclusion of graphene nanofiller in polymer composite fabrication can enhance the thermal stability of polyolefins for high-heat applications.
Keywords: polymer nanocomposites, thermal stability, exfoliation, hybrid fillers, polymer reinforcement
Procedia PDF Downloads 39
725 TRAC: A Software Based New Track Circuit for Traffic Regulation
Authors: Jérôme de Reffye, Marc Antoni
Abstract:
Following the development of the ERTMS system, we think it is interesting to develop another software-based track circuit system that would fit secondary railway lines, with an easy-to-implement design and a low sensitivity to rail-wheel impedance variations. We called this track circuit 'Track Railway by Automatic Circuits.' To be internationally implemented, this system must not have any mechanical component and must be compatible with existing track circuit systems. For example, the system is independent of the French 'Joints Isolants Collés' that isolate track sections from one another, and it is equally independent of the component used in Germany called 'Counting Axles' (in French, 'compteur d'essieux'). This track circuit is fully interoperable. Such universality is obtained by replacing the mechanical train detection system with a space-time filtering of the train position. The various track sections are defined by the frequency of a continuous signal. The set of frequencies related to the track sections is a set of orthogonal functions in a Hilbert space. Thus the failure probability of track section separation is precisely calculated on the basis of the signal-to-noise ratio. The SNR is a function of the level of traction current conducted by the rails. This is the reason why we developed a very powerful algorithm to reject noise and jamming, in order to obtain an SNR compatible with the precision required for the track circuit and the SIL 4 level. The SIL 4 level is thus reachable by an adjustment of the set of orthogonal functions. Our major contributions to railway signalling engineering are: i) Train space localization precisely defined by a calibration system. The operation bypasses the GSM-R radio system of the ERTMS system. Moreover, the track circuit is naturally protected against radio-type jammers. After the calibration operation, the track circuit is autonomous.
ii) A mathematical topology adapted to train space localization, following the train through a linear time filtering of the received signal. Track sections are numerically defined and can be modified with a software update. The system was numerically simulated, and the results were beyond our expectations: we achieved a precision of one metre. Rail-ground and rail-wheel impedance sensitivity analyses gave excellent results. The results are now complete and ready to be published. This work was initiated as a research project of the French Railways, developed by the Pi-Ramses Company under SNCF contract, and required five years to obtain the results. This track circuit is already at Level 3 of the ERTMS system, and it will be much cheaper to implement and to operate. The traffic regulation is based on variable-length track sections: as the traffic grows, the maximum speed is reduced and the track section lengths decrease. This is possible if the elementary track section is correctly defined for the minimum speed and if every track section is able to emit with variable frequencies.
Keywords: track section, track circuits, space-time crossing, adaptive track section, automatic railway signalling
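The orthogonal-function idea behind the track sections can be illustrated in miniature: if each section emits a sinusoid whose frequency completes an integer number of cycles over the observation window, the emissions are mutually orthogonal, and correlating the received signal against each basis function identifies the occupied section even under additive noise. The frequencies, sample rate, and noise level below are hypothetical stand-ins, not the system's actual parameters.

```python
# Minimal sketch: identify a track section by correlating the received signal
# against a set of orthogonal sinusoids. All parameters are hypothetical.
import math
import random

FS = 1000                  # samples per second
T = 1.0                    # window (s); integer-cycle frequencies are orthogonal
FREQS = [50, 60, 70, 80]   # one frequency per track section (illustrative)

def received_signal(active, noise_sigma=0.5, seed=1):
    """Sinusoid of the active section plus additive white Gaussian noise."""
    rng = random.Random(seed)
    n = int(FS * T)
    return [math.sin(2 * math.pi * FREQS[active] * t / FS)
            + rng.gauss(0, noise_sigma) for t in range(n)]

def detect_section(signal):
    """Correlate against each section's basis function; pick the maximum."""
    n = len(signal)
    scores = [abs(sum(signal[t] * math.sin(2 * math.pi * f * t / FS)
                      for t in range(n))) for f in FREQS]
    return scores.index(max(scores))

sig = received_signal(active=2)
print(detect_section(sig))  # -> 2
```

Because the matched correlation grows with the window length while the cross-terms vanish by orthogonality, lengthening the window directly buys SNR, which is the lever the abstract describes for reaching the SIL 4 failure probability.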
Procedia PDF Downloads 331
724 Mechanical Characterization of Porcine Skin with the Finite Element Method Based Inverse Optimization Approach
Authors: Djamel Remache, Serge Dos Santos, Michael Cliez, Michel Gratton, Patrick Chabrand, Jean-Marie Rossi, Jean-Louis Milan
Abstract:
Skin tissue is an inhomogeneous and anisotropic material. Uniaxial tensile testing is one of the primary techniques for the mechanical characterization of skin at large scales. In order to predict the mechanical behavior of materials, direct or inverse analytical approaches are often used. However, for an inhomogeneous and anisotropic material such as skin tissue, analytical approaches are not able to provide solutions, and numerical simulation is thus necessary. In this work, the uniaxial tensile test and a finite element method (FEM) based inverse method were used to identify the anisotropic mechanical properties of porcine skin tissue. The uniaxial tensile experiments were performed using an Instron 8800 tensile machine. The uniaxial tensile test was simulated with FEM, and the inverse optimization approach (or inverse calibration) was then used for the identification of the mechanical properties of the samples. Experimental results were compared to the finite element solutions. The results showed that the finite element model predictions of the mechanical behavior of the tested skin samples were well correlated with the experimental results.
Keywords: mechanical skin tissue behavior, uniaxial tensile test, finite element analysis, inverse optimization approach
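The inverse calibration loop can be sketched in miniature: a forward model stands in for the FEM simulation, and a parameter search minimizes the gap between simulated and "measured" curves. The exponential (Fung-type) stress-strain form, the grid search, and all parameter values below are illustrative assumptions, not the paper's actual model or optimizer.

```python
# Inverse optimization sketch: fit model parameters so the forward solve
# reproduces measured data. Model form and values are hypothetical.
import math

def forward_model(a, b, strains):
    """Stand-in for the FEM forward solve: Fung-type exponential response."""
    return [a * (math.exp(b * e) - 1.0) for e in strains]

def sse(a, b, strains, measured):
    """Sum of squared errors between simulated and measured stresses."""
    sim = forward_model(a, b, strains)
    return sum((s - m) ** 2 for s, m in zip(sim, measured))

def calibrate(strains, measured, a_grid, b_grid):
    """Inverse calibration by exhaustive search over a parameter grid."""
    return min(((a, b) for a in a_grid for b in b_grid),
               key=lambda p: sse(p[0], p[1], strains, measured))

# Synthetic "experiment" with known true parameters a=0.2, b=8 (hypothetical)
strains = [i / 100 for i in range(0, 31)]
measured = forward_model(0.2, 8.0, strains)

a_grid = [0.1 + 0.05 * i for i in range(9)]   # 0.10 .. 0.50
b_grid = [6.0 + 0.5 * i for i in range(9)]    # 6.0 .. 10.0
a_hat, b_hat = calibrate(strains, measured, a_grid, b_grid)
print(a_hat, b_hat)  # recovers the true parameters on noise-free data
```

In practice, each objective evaluation is a full FEM solve, so gradient-based or surrogate-assisted optimizers replace the brute-force grid; the structure of the loop is the same.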
Procedia PDF Downloads 408
723 Theoretical Performance of a Sustainable Clean Energy On-Site Generation Device to Convert Consumers into Producers and Its Possible Impact on Electrical National Grids
Authors: Eudes Vera
Abstract:
In this paper, a theoretical evaluation is carried out of the performance of a forthcoming fuel-less clean energy generation device, the Air Motor. The underlying physical principles that support this technology are succinctly described. Examples of the machine and theoretical values of input and output powers are also given. In addition, its main features, such as portability, on-site energy generation and delivery, miniaturization of generation plants, efficiency, and scaling down of the whole electric infrastructure, are discussed. The main component of the Air Motor, the Thermal Air Turbine, generates useful power by converting into mechanical energy part of the thermal energy contained in a fan-produced airflow, while leaving its kinetic energy intact. Due to this fact, an Air Motor can contain a long succession of identical air turbines, and the total power generated out of a single airflow can be very large, as can its mechanical efficiency. Using the corresponding formulae, it is found that the mechanical efficiency of this device can be much greater than 100%, while its thermal efficiency is always less than 100%. On account of its multiple advantages, the Air Motor seems to be the perfect device to convert energy consumers into energy producers worldwide. If so, it would appear that current national electrical grids would no longer be necessary, because it does not seem practical or economical to bring energy from far away while it can be generated and consumed locally at the consumer's premises, using just the thermal energy contained in the ambient air.
Keywords: electrical grid, clean energy, renewable energy, in situ generation and delivery, generation efficiency
Procedia PDF Downloads 175
722 Diasporic Literature
Authors: Shamsher Singh
Abstract:
Diaspora literature involves a concept of the native land, from where the displacement occurs, and a record of harsh journeys undertaken on account of economic compulsions. Basically, a diaspora is a splintered community living in exile. The scattering initially signifies the location of a fluid, autonomous human space involving a complex set of negotiations and exchanges: between nostalgia and desire for the native land and the making of a new home; adapting to the relationships between the minority and the majority; acting as spokespersons for minority rights and for their people back in the native place; and, significantly, transacting the Contact Zone, a space charged with the possibility of multiple challenges. These writers write against the background of the sublime qualities of their homeland and, at the same time, try to fit themselves into the traditions and cultural values of other, unfamiliar communities or lands. Diasporic literature also serves as an interconnection of the various cultures involved; it is used to understand the customs of different cultures and countries, and it is a source of inspiration globally. Although diasporic literature originated in the 20th century, it spread to countries such as Britain, Canada, America, Denmark, the Netherlands, Australia, Kenya, Sweden, Kuwait, and different parts of Europe. 'Diaspora' is a combination of two words and means the movement of people away from their own country or motherland. From a historical point of view, the term is often associated with the Jewish exile. At present, 'diaspora' is used for the dispersal of social or cultural groups. Such a group lives in two different streams of culture at the same time: the culture it left behind, and the new cultural situation to which it has to adapt. The diasporic mind hangs between the birth land and the place of work at the same time, and this mental state of dual existence gives birth to a sensation of dysphoria.
Writers have rendered different experiences of this sensation, e.g., social, universal, political and economic experiences, and experiences of the strange land. The struggle of these experiences is seen in diasporic literature. When a person moves to a different land or country to fulfil his dreams, the discrimination of language and work and other difficulties among strangers make his relationship with his past more emotional and deeper. These past memories and relations create more difficulties in settling in a foreign land. He lives there physically, but his mental state is constantly in his past, and he ends up living his life in those background memories. A person living in diaspora is actually a man of dual vision. Although this double vision expands his global consciousness and gives him the judgemental qualities to understand others, at the same time he weighs his attachment to his native land against the situations he experiences in the foreign land, and he finds it difficult to survive in those conditions. It can be said that diaspora literature points to a person or social group living a dual life, and this in-between structure becomes the source of diasporic literature.
Keywords: homeland sickness, language problem, quest for identity, materialistic desire
Procedia PDF Downloads 67
721 Organic Substance Removal from Pla-Som Family Industrial Wastewater through APCW System
Authors: W. Wararam, K. Angchanpen, T. Pattamapitoon, K. Chunkao, O. Phewnil, M. Srichomphu, T. Jinjaruk
Abstract:
The research focused on the efficiency of treating high-organic wastewater from the pla-som production process by anaerobic tanks, oxidation ponds, and constructed wetland treatment systems (APCW). The combined system consisted of a 50-mm plastic screen; five 5.8 m3 oil-grease trap tanks (2-day hydraulic retention time; HRT); four 4.3 m3 anaerobic tanks (1-day HRT); a 16.7 m3 oxidation pond no. 1 (7-day HRT); a 12.0 m3 oxidation pond no. 2 (3-day HRT); and an 8.2 m3 constructed wetland plot (1-day HRT). After washing, the fresh raw fish were sliced into small pieces and converted into ground fish meat by a blender. The fish meat was rinsed for 8 rounds: rounds 1, 2, 3, 5, 6 and 7 with tap water and rounds 4 and 8 with rice-wash water, before mixing with salt, garlic, steamed rice and monosodium glutamate, followed by plastic wrapping for 72 hours until edible. During pla-som production, the rinse wastewater, about 5 m3/day, was fed to the treatment system, with full stagnating storage in its components. The results showed that: 1) the treatment efficiencies for BOD, COD, TDS and SS were 93%, 95%, 32% and 98%, respectively; 2) the treatment was conducted with 500 kg of raw fish and the full set of high-organic wastewater treatment systems; 3) the trends of treatment efficiency and quantity were similar for all indicators; and 4) the small pieces of fish meat and the fish blood required more than a 3-day HRT in the anaerobic digestion process.
Keywords: organic substance, pla-som family industry, wastewater, APCW system
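The reported percentages are removal efficiencies, i.e., the relative drop between influent and effluent concentrations across the treatment train. The influent and effluent values below are hypothetical, chosen only to illustrate the arithmetic behind a figure like the 93% BOD removal; they are not measurements from the study.

```python
# Removal efficiency of a treatment system: (in - out) / in * 100.
# The BOD concentrations below are hypothetical illustrations.

def removal_efficiency(influent, effluent):
    """Percentage of a pollutant removed between inlet and outlet."""
    return (influent - effluent) / influent * 100.0

# Hypothetical BOD concentrations in mg/L
bod_in, bod_out = 2000.0, 140.0
print(round(removal_efficiency(bod_in, bod_out), 1))  # -> 93.0
```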
Procedia PDF Downloads 358
720 Improvement of Microstructure, Wear and Mechanical Properties of Modified G38NiCrMo8-4-4 Steel Used in Mining Industry
Authors: Mustafa Col, Funda Gul Koc, Merve Yangaz, Eylem Subasi, Can Akbasoglu
Abstract:
G38NiCrMo8-4-4 steel is widely used in the mining industry, machine parts, and gears due to its high strength and toughness. In this study, the microstructure, wear, and mechanical properties of boron-modified G38NiCrMo8-4-4 steel used in the mining industry were investigated. For this purpose, the cast materials were alloyed by melting in an induction furnace to include boron at rates of 0 ppm, 15 ppm, and 50 ppm (wt.) and were formed to dimensions of 150x200x150 mm by casting into a sand mould. A homogenization heat treatment was applied to the specimens at 1150˚C for 7 hours. All specimens were then austenitized at 930˚C for 1 hour, quenched in a polymer solution, and tempered at 650˚C for 1 hour. The microstructures of the specimens were investigated using light microscopy and SEM to determine the effects of boron and of the heat treatment conditions. The microstructure investigations and hardness tests showed changes in microstructural properties and material hardness with increasing boron content and with the heat treatment conditions. Wear tests were carried out using a pin-on-disc tribometer under dry sliding conditions. The Charpy V-notch impact test was performed to determine the toughness of the specimens. Fracture and worn surfaces were investigated with a scanning electron microscope (SEM). The results show that boron has a positive effect on the hardness and wear properties of G38NiCrMo8-4-4 steel.
Keywords: G38NiCrMo8-4-4 steel, boron, heat treatment, microstructure, wear, mechanical properties
Procedia PDF Downloads 195
719 The Impact of Artificial Intelligence in the Development of Textile and Fashion Industry
Authors: Basem Kamal Abasakhiroun Farag
Abstract:
Fashion, like many other areas of design, has undergone numerous developments over the centuries. The aim of this article is to recognize and evaluate the importance of advanced technologies in fashion design and to examine how they are transforming the role of contemporary fashion designers by transforming the creative process. It also discusses how contemporary culture is involved in such developments and how it influences fashion design in terms of conceptualization and production. The methodology used is based on examining various examples of the use of technology in fashion design and drawing parallels between what was feasible then and what is feasible today. By comparing case studies, examples of existing fashion designs, and experiences with craft methods, we observe patterns that help us predict the direction of future developments in this area. Discussing the technological elements in fashion design helps us understand the driving force behind the trend. The research presented in the article shows a trend towards significantly increasing interest and progress in the field of fashion technology, leading to the emergence of hybrid artisanal methods. In summary, as fashion technologies advance, their role in clothing production is becoming increasingly important, extending far beyond the humble sewing machine.
Keywords: fashion, identity, textiles, ambient intelligence, proximity sensors, shape memory materials, sound-sensing garments, wearable technology, bio textiles, fashion trends, nano textiles, new materials, smart textiles, techno textiles, fashion design, functional aesthetics, 3D printing
Procedia PDF Downloads 67
718 Peruvian Diagnostic Reference Levels for Patients Undergoing Different X-Rays Procedures
Authors: Andres Portocarrero Bonifaz, Caterina Sandra Camarena Rodriguez, Ricardo Palma Esparza, Nicolas Antonio Romero Carlos
Abstract:
Reference levels for common X-ray procedures have been set in many protocols. In Peru, during quality control tests, the dose tolerance is set by these international recommendations. Nevertheless, further studies can be made to assess the national reality and relate dose levels to different parameters such as kV, mA/mAs, exposure time, type of processing (digital, digitized or conventional), etc. In this paper, three radiological procedures were taken into account: general X-rays (fixed and mobile), intraoral X-rays (fixed, mobile and portable), and mammography. For this purpose, an Unfors Xi detector was used; the dose was measured at a focus-detector distance which varied depending on the procedure and was corrected afterwards to find the entrance surface dose. The data used in this paper were gathered over a period of more than 3 years (2015-2018), and each X-ray machine was taken into consideration only once. The results aim to establish a new standard that reflects local practice and to address the issues of the 'Bonn Call for Action' in Peru. For this purpose, the 75th percentile of the dose for each radiological procedure was calculated. In future quality control services, the operators of machines with dose values higher than the selected threshold should be informed that they surpass the reference dose levels established in comparison with other radiological centers in the country.
Keywords: general X-rays, intraoral X-rays, mammography, reference dose levels
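The two computations described above, correcting a measured dose back to the entrance surface with the inverse-square law and taking the 75th percentile across machines, can be sketched as follows. All dose values and distances are hypothetical illustrations, not data from the survey.

```python
# Sketch of setting a diagnostic reference level (DRL): inverse-square
# distance correction, then the 75th percentile across machines.
# All numbers below are hypothetical.

def entrance_surface_dose(measured_dose, measure_dist_cm, surface_dist_cm):
    """Inverse-square correction from the detector position to the skin."""
    return measured_dose * (measure_dist_cm / surface_dist_cm) ** 2

def percentile_75(values):
    """75th percentile with linear interpolation between closest ranks."""
    v = sorted(values)
    k = 0.75 * (len(v) - 1)
    f = int(k)
    return v[f] + (k - f) * (v[f + 1] - v[f]) if f + 1 < len(v) else v[f]

# Hypothetical entrance surface doses (mGy) from a set of X-ray machines
doses = [1.2, 0.8, 2.5, 1.9, 1.1, 3.0, 1.5, 2.2]
drl = percentile_75(doses)
print(round(drl, 3))  # machines above this value exceed the proposed DRL
```

The inverse-square step is why the varying focus-detector distance does not bias the survey: each reading is normalized to the same reference point before the percentile is taken.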
Procedia PDF Downloads 155
717 Developing a Comprehensive Green Building Rating System Tailored for Nigeria: Analyzing International Sustainable Rating Systems to Create Environmentally Responsible Standards for the Nigerian Construction Industry and Built Environment
Authors: Azeez Balogun
Abstract:
Green building rating practices are continually evolving and vary across regions. Yet a few core concepts remain constant, such as site selection, design, energy efficiency, water and material conservation, indoor environmental quality, operational optimization, and waste reduction. The essence of green building lies in the optimization of one or more of these concepts. This paper conducts a comparative analysis of 7 widely recognized sustainable rating systems (BREEAM, CASBEE, Green Globes, Green Star, HK-BEAM, IGBC Green Homes, and LEED) based on the perceptions and opinions of stakeholders in Nigeria certified in green building rating systems. The purpose is to identify and adopt an appropriate green building rating system for Nigeria. Numerous aspects of these systems were examined to determine the best fit for the Nigerian built environment. The findings indicate that LEED, the principal system in the USA and Canada, is the most suitable for Nigeria due to its strong foundation, extensive investment, and proven benefits. LEED obtained the highest rating of eighty out of one hundred points in this assessment.
Keywords: architecture, built environment, green building rating system, Nigeria Green Building Council, sustainability
Procedia PDF Downloads 27
716 Six Sigma-Based Optimization of Shrinkage Accuracy in Injection Molding Processes
Authors: Sky Chou, Joseph C. Chen
Abstract:
This paper focuses on using Six Sigma methodologies to reach the desired shrinkage of a manufactured high-density polyethylene (HDPE) part produced by an injection molding machine. It presents a case study where the correct shrinkage is required to reduce or eliminate defects and to improve the process capability indices Cp and Cpk for an injection molding process. To improve this process and keep the product within specifications, the Six Sigma methodology, i.e., the define, measure, analyze, improve, and control (DMAIC) approach, was implemented in this study. The Six Sigma approach was paired with the Taguchi methodology to identify the optimized processing parameters that keep the shrinkage rate within the specifications set by the customer. An L9 orthogonal array was applied in the Taguchi experimental design, with four controllable factors and one non-controllable/noise factor. The four controllable factors identified consist of the cooling time, melt temperature, holding time, and metering stroke. The noise factor is the difference between material brand 1 and material brand 2. After the confirmation run was completed, measurements verified that the new parameter settings are optimal. With the new settings, the process capability index improved dramatically. The purpose of this study is to show that the Six Sigma and Taguchi methodologies can be efficiently used to determine the important factors that improve the process capability index of the injection molding process.
Keywords: injection molding, shrinkage, six sigma, Taguchi parameter design
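The capability indices behind the study's improvement claim are standard: Cp compares the specification width to the process spread, and Cpk additionally penalizes a process whose mean is off-center. The specification limits and the shrinkage sample below are hypothetical, chosen only to show the calculation.

```python
# Process capability indices Cp and Cpk from a measurement sample.
# Spec limits and shrinkage data are hypothetical illustrations.
import statistics

def cp_cpk(data, lsl, usl):
    """Cp = spec width / 6 sigma; Cpk = distance to nearest limit / 3 sigma."""
    mu = statistics.mean(data)
    sigma = statistics.stdev(data)  # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical shrinkage measurements (%) against spec limits 1.8-2.2 %
shrinkage = [2.01, 1.98, 2.03, 1.99, 2.02, 2.00, 1.97, 2.04]
cp, cpk = cp_cpk(shrinkage, lsl=1.8, usl=2.2)
print(round(cp, 2), round(cpk, 2))  # Cp >= 1.33 is a common "capable" bar
```

Cpk can never exceed Cp; the gap between them measures how far the process mean sits from the center of the specification, which is exactly what the Taguchi parameter tuning in the study tries to close.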
Procedia PDF Downloads 178
715 The Use of Layered Neural Networks for Classifying Hierarchical Scientific Fields of Study
Authors: Colin Smith, Linsey S Passarella
Abstract:
Due to the proliferation and decentralized nature of academic publication, no widely accepted scheme exists, to the best of the authors' knowledge, for organizing papers by their scientific field of study (FoS). While many academic journals require author-provided keywords for papers, these keywords range widely in scope and are not consistent across papers, journals, or field domains, necessitating alternative approaches to paper classification. Past attempts to perform field-of-study (FoS) classification on scientific texts have largely used non-hierarchical FoS schemas or ignored the schema's inherently hierarchical structure, e.g., by compressing the structure into a single layer for multi-label classification. In this paper, we introduce an application of a Layered Neural Network (LNN) to the problem of performing supervised hierarchical classification of scientific fields of study on research papers. In this approach, paper embeddings from a pretrained language model are fed into a top-down LNN. Beginning with a single neural network (NN) for the highest layer of the class hierarchy, each node uses a separate local NN to classify the subsequent subfield child node(s) for an input embedding of concatenated paper titles and abstracts. We compare our LNN-FOS method to other recent machine learning methods using the Microsoft Academic Graph (MAG) FoS hierarchy and find that LNN-FOS offers increased classification accuracy at each FoS hierarchical level.
Keywords: hierarchical classification, layered neural network, scientific field of study, scientific taxonomy
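The top-down routing described above can be sketched in miniature: a root classifier picks the top-level field, and a separate local classifier at that node refines it into a subfield. Nearest-centroid classifiers stand in for the paper's per-node neural networks, and the two-level hierarchy, field names, and 2-D "embeddings" are toy assumptions, not data from MAG.

```python
# Top-down hierarchical classification sketch: one classifier per node,
# each routing the input one level deeper. All data below is toy data.

HIERARCHY = {
    "Science": ["Physics", "Biology"],
    "Humanities": ["History", "Literature"],
}

# Toy training "embeddings" per leaf field
TRAIN = {
    "Physics":    [(0.9, 0.1), (0.8, 0.2)],
    "Biology":    [(0.7, 0.4), (0.6, 0.5)],
    "History":    [(0.1, 0.9), (0.2, 0.8)],
    "Literature": [(0.3, 0.7), (0.4, 0.6)],
}

def centroid(points):
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(2))

def nearest(x, centroids):
    """Label whose centroid is closest to x (squared Euclidean distance)."""
    return min(centroids,
               key=lambda c: sum((x[i] - centroids[c][i]) ** 2 for i in range(2)))

def classify(x):
    # Level 1: a single classifier over the top-level fields
    top_centroids = {top: centroid([p for leaf in leaves for p in TRAIN[leaf]])
                     for top, leaves in HIERARCHY.items()}
    top = nearest(x, top_centroids)
    # Level 2: a separate local classifier for this node's children only
    leaf_centroids = {leaf: centroid(TRAIN[leaf]) for leaf in HIERARCHY[top]}
    return top, nearest(x, leaf_centroids)

print(classify((0.85, 0.15)))  # -> ('Science', 'Physics')
```

The key property this illustrates is that each local classifier only ever discriminates among its own node's children, so the label space stays small at every level, which is what distinguishes the approach from flattening the hierarchy into one multi-label layer.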
Procedia PDF Downloads 133
714 Investigations at the Settlement of Oglankala
Authors: Ayten Tahirli
Abstract:
Settlements and grave monuments discovered by archaeological excavations conducted in the Nakhchivan Autonomous Republic have a special place in the study of the ancient history of Azerbaijan between the 4th century B.C. and the 3rd century A.D. From this point of view, the archaeological excavations and investigations conducted at Oglankala, Goshatapa, Babatapa, Pusyan, Agvantapa, Meydantapa and other monuments in Nakhchivan have a specific place, and the conclusions of the archaeological research conducted at the Oglankala settlement enable a broad study of the history, economic life and trade relationships of Nakhchivan. Oglankala, located on Garatapa Mountain over an area of 50 ha, was the largest fortress in Nakhchivan and one of the largest fortresses in the South Caucasus during the Middle Iron Age. The territory where the monument is located is very important for controlling the Sharur Lowland, the most productive territory in Nakhchivan and of great importance for agriculture, through which the Arpachay passes down from the Lesser Caucasus. During the excavations of 1988 and 1989 at Oglankala, covering the fortress's history in the Early and Middle Iron Ages, indisputable proof was discovered that the territory was an important political center. Oglankala was the capital city of an independent state during the Middle Iron Age. It maintained economic and cultural relationships with the neighboring Urartian state, and it was the capital of a city-state protected by a strong defensive system in the centuries after the collapse of the Achaemenid Empire. It should be noted that broader archaeological excavations at Oglankala were first started by Vali Bakhshaliyev, Department Head at the Institute of History, Ethnography and Archeology of the ANAS Nakhchivan Branch. Between 1988 and 1989, V. B. Bakhshaliyev conducted an excavation over an area of 320 square meters at Oglankala.
Since 2006, Oglankala has been a research object for the international Azerbaijan-USA archaeological expedition. In 2006, Lauren Ristvet from Pennsylvania State University, Veli Bakhshaliyev from the Nakhchivan Branch of the Azerbaijan National Academy of Sciences and Safar Ashurov from the Baku Office of the Azerbaijan National Academy of Sciences, together with other colleagues and students, started to study the ancient history of this remarkable area. During the archaeological research conducted by the international expedition between 2008 and 2011 under the supervision of Vali Bakhshaliyev, the remnants of a palace and the protective walls of a citadel constructed between the late 9th and early 8th centuries B.C. were discovered in this residential area. It was found that Oglankala was the capital city of a small state established in the Sharur Lowland during the Middle Iron Age, which struggled against Urartu by forming a union with the local tribes. That state had its own cuneiform script. Between the 4th and 2nd centuries B.C., Oglankala and the territory it covered was one of the major political centers of the Atropatene state.
Keywords: Nakhchivan, Oglankala, settlement, ceramic, archaeological excavation
Procedia PDF Downloads 78