Search results for: Business Performance
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15295

8425 New Approach for Constructing a Secure Biometric Database

Authors: A. Kebbeb, M. Mostefai, F. Benmerzoug, Y. Chahir

Abstract:

Multimodal biometric identification combines several biometric systems. The challenge of this combination is to reduce the limitations of systems based on a single modality while significantly improving performance. In this paper, we propose a new approach to the construction and protection of a multimodal biometric database dedicated to an identification system. We use topological watermarking to hide the relation between the face image and the registered descriptors extracted from the other modalities of the same person, for more secure user identification.

Keywords: biometric databases, multimodal biometrics, security authentication, digital watermarking

Procedia PDF Downloads 390
8424 Environmental Corporate Social Responsibility in Industrial Cities: A Collaborative Governance Approach

Authors: Muhlisin, Moh. Sofyan Budiarto

Abstract:

Corporate social responsibility (CSR) initiatives based on charity and philanthropy have not alleviated many sustainable environmental issues, particularly in industrial towns. A collaborative governance strategy is seen as an option for resolving difficulties of coordination and communication between businesses, the government, and the community, so that the goals of urban environmental management can be met through collaborative effort. The purpose of this research is to identify the different forms of environmental CSR implementation by corporate entities and to create a CSR collaborative governance model for environmental management. This qualitative investigation was carried out in 2020 in Cilegon City, one of Indonesia’s industrial cities. A total of 20 informants from three stakeholder groups, namely the government, corporate entities, and the community, were interviewed to investigate their support. According to the study’s findings, cleaner production, eco-office practices, energy and natural resource conservation, waste management, renewable energy, climate change adaptation, and environmental education are all examples of CSR application in the environmental sector. The potential of environmental CSR implementation lies in creating collaborative governance: business entities are critical in providing the starting conditions, the government offers facilitative leadership, and the CSR forum launches the institutional design. These three factors are crucial to the effectiveness of collaborative governance in the environmental management of industrial cities.

Keywords: collaborative governance, CSR forum, environmental CSR, industrial city

Procedia PDF Downloads 87
8423 Finite Element Analysis of Shape Memory Alloy Stents in Coronary Arteries

Authors: Amatulraheem Al-Abassi, K. Khanafer, Ibrahim Deiab

Abstract:

The coronary artery stent is a promising technology that can treat various coronary diseases. Materials used for manufacturing medical stents should be highly biocompatible. Stent alloys, in particular, show remarkably promising clinical outcomes; however, there is a threat of restenosis (recurrence of artery narrowing due to fatty plaque), stent recoiling, or, in the long term, stent fracture. Stents made of nickel-titanium (Nitinol) can bear extensive plastic deformation and resist restenosis. Nitinol is a unique shape memory alloy with outstanding mechanical properties such as biocompatibility, super-elasticity, and recovery to its original shape under certain loads. Stent failure may cause complications in vascular diseases and possibly blockage of blood flow. Thus, studying the behavior of the stent under different medical conditions will help doctors and cardiologists predict when it is necessary to replace the stent in order to prevent severe morbidity outcomes. To the best of our knowledge, there are few published papers that analyze stent behavior with regard to the contact surfaces of the plaque layer and the blood vessel. Stent material properties are therefore discussed in this investigation to highlight the mechanical and clinical differences between various stents. This research analyzes the performance of a Nitinol stent in a well-known stent design to determine its stress-bearing behavior and its displacement in blood vessels, in comparison to stents made of other biocompatible materials. Finite element analysis is the core of this study: a physical representative model is used to show the distribution of stress and strain along the interaction surface between the stent and the artery. The reaction of vascular tissue to the stent is evaluated to predict the possibility of restenosis within the treated area.

Keywords: shape memory alloy, stent, coronary artery, finite element analysis

Procedia PDF Downloads 203
8422 Aerodynamic Optimization of Oblique Biplane by Using Supercritical Airfoil

Authors: Asma Abdullah, Awais Khan, Reem Al-Ghumlasi, Pritam Kumari, Yasir Nawaz

Abstract:

Introduction: This study examines the potential applications of two oblique wing configurations initiated by German aerodynamicists during WWII. Because of the end of the war, that project was never completed, and this research targets the revival of the German oblique biplane configuration. The research draws upon the use of two oblique wings mounted on the top and bottom of the fuselage through a single pivot. The wings are capable of sweeping at different angles, ranging from 0° at takeoff to 60° at cruising altitude. The right half of the top wing behaves like a forward-swept wing and the left half like a backward-swept wing; the reverse applies to the lower wing. This opposite deflection of the top and lower wings cancels out the rotary moment created by each wing, and the aircraft remains stable. Problem to better understand or solve: The purpose of this research is to investigate the potential for improved aerodynamic performance and flight efficiency over a wide range of sweep angles. This helps determine the sweep angle at which the aircraft possesses both stability and the best aerodynamics. Explaining the methods used: The aircraft configuration is designed in SolidWorks, after which a series of aerodynamic predictions are conducted in both the subsonic and the supersonic flow regimes. Computations are carried out in ANSYS Fluent. The results are then compared to theoretical and flight data of supersonic fighter aircraft of the same category (AD-1) and with a wind tunnel test model at subsonic speed. Results: At zero sweep angle, the aircraft has an excellent lift coefficient, almost double that of comparable fighter jets. To reach supersonic speed, the sweep angle is increased up to a maximum of 60°, depending on the mission profile. General findings: The oblique biplane is a candidate for future fighter aircraft because of its strong performance in terms of aerodynamics, cost, structural design, and weight.

Keywords: biplane, oblique wing, sweep angle, supercritical airfoil

Procedia PDF Downloads 278
8421 European Drug Serialization: Securing the Pharmaceutical Drug Supply Chain from Counterfeiters

Authors: Vikram Chowdhary, Marek Vins

Abstract:

The profitability of the pharmaceutical drug business has attracted considerable interest, but it also faces significant challenges. Counterfeiters take advantage of the industry's vulnerabilities, which are further exacerbated by the globalization of the market, online trading, and complex supply chains. Governments and organizations worldwide are dedicated to creating a secure environment that ensures a consistent and genuine supply of pharmaceutical products. In 2019, the European authorities implemented regulation EU 2016/161 to strengthen traceability and transparency throughout the entire drug supply chain. This regulation requires the addition of enhanced security features, such as serializing items to the saleable unit level or individual packs. Despite these efforts, the incidents of pharmaceutical counterfeiting continue to rise globally, with regulated territories being particularly affected. This paper examines the effectiveness of the drug serialization system implemented by European authorities. By conducting a systematic literature review, we assess the implementation of drug serialization and explore the potential benefits of integrating emerging digital technologies, such as RFID and Blockchain, to improve traceability and management. The objective is to fortify pharmaceutical supply chains against counterfeiters and manipulators and ensure their security.

Keywords: blockchain, counterfeit drugs, EU drug serialization, pharmaceutical industry, RFID

Procedia PDF Downloads 110
8420 The Strategy of Urban Traditional Consumer Areas Adapting to Digital Logistics: A Case Study of Fengying Xili in Changsha

Authors: Mengjie Zhou

Abstract:

Under the rapid advance of digital logistics, old consumption spaces in cities are undergoing profound transformation and reconstruction. This article systematically analyzes the impact of digital logistics on existing urban consumer spaces and how these spaces can adapt to and lead this change through distinct ‘spatial production’ models. The digital transformation of the logistics industry has significantly improved logistics efficiency and service quality while also imposing new requirements on the form and function of consumer space. In this process, old urban consumption spaces not only face the trend of material consumption shifting toward spiritual consumption but also profound changes in consumer behavior patterns. Taking Fengying Xili in Changsha as an empirical case, this article explores in detail how it successfully transformed from a traditional consumption space into a modern cultural consumption space by introducing new business formats, optimizing the spatial layout, and improving service quality while preserving its historical heritage. This case not only provides valuable practical experience for the transformation of old urban consumption spaces but also demonstrates the feasibility and potential of the new model of ‘spatial production’.

Keywords: digital logistics, urban consumption space, space production, urban renewal

Procedia PDF Downloads 41
8419 Online Language Learning and Teaching Pedagogy: Constructivism and Beyond

Authors: Zeineb Deymi-Gheriani

Abstract:

In the last two decades, one can clearly observe a boom of interest in e-learning and web-supported programs. However, one can also notice that many of these programs focus on the accumulation and delivery of content, generally as a business industry, with little concern for theoretical underpinnings. The existing research, at least in online English language teaching (ELT), has demonstrated a lack of an effective online teaching pedagogy anchored in a well-defined theoretical framework. Hence, this paper presents constructivism as one of the theoretical bases for the design of an effective online language teaching pedagogy that is at the same time technologically intelligent and theoretically informed, to help envision how education can best take advantage of information and communication technology (ICT) tools. The paper discusses the key principles underlying constructivism, its implications for online language teaching design, as well as its limitations that should be avoided in e-learning instructional design. Although the paper is theoretical in nature, essentially based on an extensive literature survey on constructivism, it does include practical illustrations from action research conducted by the author, both as an e-tutor of English using the Moodle online educational platform at the Virtual University of Tunis (VUT) from 2007 to 2010 and as a face-to-face (F2F) English teaching practitioner in the Professional Certificate of English Language Teaching Training (PCELT) at AMIDEAST, Tunisia (April-May 2013).

Keywords: active learning, constructivism, experiential learning, Piaget, Vygotsky

Procedia PDF Downloads 351
8418 Improving Fluid Catalytic Cracking Unit Performance through Low Cost Debottlenecking

Authors: Saidulu Gadari, Manoj Kumar Yadav, V. K. Satheesh, Debasis Bhattacharyya, S. S. V. Ramakumar, Subhajit Sarkar

Abstract:

Most fluid catalytic cracking units (FCCUs) are big profit makers and are hence always operated against several constraints. The FCCU is the primary source in a refinery for the production of gasoline, light olefins as petrochemical feedstocks, feedstock for alkylate and oxygenates, LPG, etc. Increasing unit capacity and improving product yields and qualities, such as gasoline RON, have a dramatic impact on refinery economics. FCCUs are often debottlenecked significantly beyond their original design capacities. Depending upon the unit configuration, operating conditions, and feedstock quality, an FCC unit can have a variety of bottlenecks. While some debottlenecking measures aim to increase the feed rate or improve the conversion, others aim to improve the reliability of the equipment or the overall unit. Apart from investment cost, the other factors generally considered while evaluating debottlenecking options are shutdown days, payback period, risk on investment, etc. Low-cost solutions such as replacement of feed injectors, the air distributor, steam distributors, the spent catalyst distributor, or an efficient cyclone system are the preferred way of upgrading an FCCU, with a shorter lead time from idea inception to implementation. This paper discusses various bottlenecks generally encountered in FCCUs and presents a case study on the performance improvement of one of the FCCUs in IndianOil through a cost-effective technical solution, including the use of improved internals in the reactor-regenerator (R-R) section. After implementation, the regenerator air rate, the gas superficial velocity in the regenerator, and the cyclone velocities were reduced by about 10%, and the CLO yield improved from 10 to 6 wt%. By ensuring a proper pressure balance and optimum immersion of the cyclone dipleg in the standpipe, the frequent formation of perforations in the regenerator cyclones could be addressed, which in turn improved the unit on-stream factor.

Keywords: FCC, low-cost, revamp, debottleneck, internals, distributors, cyclone, dipleg

Procedia PDF Downloads 215
8417 Unmet English Needs of the Non-Engineering Staff: The Case of Algerian Hydrocarbon Industry

Authors: N. Khiati

Abstract:

The present paper reports findings that emerged from larger-scale doctoral research into the English language needs of a renowned Algerian hydrocarbon company. From a multifaceted English for specific purposes (ESP) research perspective, the paper considers the English needs of the finance/legal department staff amid conflicting needs perspectives involving both objective needs indicators (i.e., the pressure of globalised business) and the generally negative attitudes towards English among the administrative staff, mainly jurists (who favour a non-adaptation strategy). The researcher’s unearthing of the latter’s needs is an endeavour to concretise the concepts of unmet, or unconscious, needs, among others. These initially hidden needs are therefore detailed with respect to educational background, namely the previous language of instruction; training experiences and expectations; and the actual communicative practices derived from retrospective interviews and preliminary quantitative questionnaire data. Based on these clues suggesting real needs, the researcher tentatively proposes implications for both pre-service and in-service training organisers, as well as for educational policy makers, in favour of a legal English course for the jurists, from pre-graduate phases through to in-service training.

Keywords: English for specific purposes (ESP), legal and finance staff, needs analysis, unmet/unconscious needs, training implications

Procedia PDF Downloads 147
8416 Review of the Legislative and Policy Issues in Promoting Infrastructure Development to Promote Automation in Telecom Industry

Authors: Marvin Ricardo Awarab

Abstract:

There has never been a greater need for telecom services. The Internet of Things (IoT), 5G networking, and edge computing are the driving forces behind this increased demand. The fierce demand offers communications service providers significant income opportunities. The telecom sector is centered on automation, and realizing a digital operation that functions as a real-time business will be crucial for the industry as a whole. Automation in telecom refers to the application of technology to create a more effective, quick, and scalable alternative to the conventional way of operating the telecom industry. With the promotion of 5G and the Internet of Things (IoT), telecom companies will continue to invest extensively in telecom automation technology. Automation offers clear benefits in the telecom industry, but developing countries such as Namibia may not fully tap into those benefits because of a lack of funds and infrastructural resources to invest in automation. This paper investigates the benefits of automation in the telecom industry. Furthermore, the paper identifies obstacles that developing countries such as Namibia face in their quest to fully introduce automation in the telecom industry. Additionally, the paper proposes possible avenues by which Namibia, as a developing country, can invest in automation infrastructure with the aim of reaping the full benefits of automation in the telecom industry.

Keywords: automation, development, internet, internet of things, network, telecom, telecommunications policy, 5G

Procedia PDF Downloads 63
8415 Wireless FPGA-Based Motion Controller Design by Implementing 3-Axis Linear Trajectory

Authors: Kiana Zeighami, Morteza Ozlati Moghadam

Abstract:

Designing a high accuracy and high precision motion controller is one of the important issues in today’s industry. There are effective solutions available in the industry but the real-time performance, smoothness and accuracy of the movement can be further improved. This paper discusses a complete solution to carry out the movement of three stepper motors in three dimensions. The objective is to provide a method to design a fully integrated System-on-Chip (SOC)-based motion controller to reduce the cost and complexity of production by incorporating Field Programmable Gate Array (FPGA) into the design. In the proposed method the FPGA receives its commands from a host computer via wireless internet communication and calculates the motion trajectory for three axes. A profile generator module is designed to realize the interpolation algorithm by translating the position data to the real-time pulses. This paper discusses an approach to implement the linear interpolation algorithm, since it is one of the fundamentals of robots’ movements and it is highly applicable in motion control industries. Along with full profile trajectory, the triangular drive is implemented to eliminate the existence of error at small distances. To integrate the parallelism and real-time performance of FPGA with the power of Central Processing Unit (CPU) in executing complex and sequential algorithms, the NIOS II soft-core processor was added into the design. This paper presents different operating modes such as absolute, relative positioning, reset and velocity modes to fulfill the user requirements. The proposed approach was evaluated by designing a custom-made FPGA board along with a mechanical structure. As a result, a precise and smooth movement of stepper motors was observed which proved the effectiveness of this approach.
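
The abstract names linear interpolation as the core of the profile generator but does not give the scheme. As a hedged illustration only, a DDA-style software model of 3-axis linear interpolation that turns a target displacement into per-axis step pulses could look like the sketch below (Python rather than HDL/NIOS II code, with the pulse model and function names assumed, not taken from the paper).

```python
# Illustrative sketch only: a software model of 3-axis linear interpolation
# (DDA style), not the authors' FPGA/NIOS II implementation.

def linear_interpolation_3d(dx, dy, dz):
    """Yield (step_x, step_y, step_z) pulse triples that move the tool head
    from (0, 0, 0) to (dx, dy, dz) along an approximately straight line."""
    steps = max(abs(dx), abs(dy), abs(dz))  # dominant axis sets the pulse count
    if steps == 0:
        return
    acc = [0.0, 0.0, 0.0]                   # per-axis ideal (fractional) position
    inc = [dx / steps, dy / steps, dz / steps]
    pos = [0, 0, 0]                         # per-axis integer micro-step position
    for _ in range(steps):
        pulses = []
        for axis in range(3):
            acc[axis] += inc[axis]
            # Emit a step pulse when the accumulated ideal position passes
            # the next integer micro-step for this axis.
            if abs(acc[axis] - pos[axis]) >= 1.0:
                step = 1 if acc[axis] > pos[axis] else -1
                pos[axis] += step
                pulses.append(step)
            else:
                pulses.append(0)
        yield tuple(pulses)

# Example: move 10 steps in X, 4 in Y, -2 in Z
trajectory = list(linear_interpolation_3d(10, 4, -2))
assert [sum(p[i] for p in trajectory) for i in range(3)] == [10, 4, -2]
```

In the FPGA the same logic would run as parallel per-axis accumulators clocked at the pulse rate, which is what makes the profile generator real-time.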

Keywords: 3-axis linear interpolation, FPGA, motion controller, micro-stepping

Procedia PDF Downloads 208
8414 PLO-AIM: Potential-Based Lane Organization in Autonomous Intersection Management

Authors: Berk Ecer, Ebru Akcapinar Sezer

Abstract:

Traditional intersection management models, such as unsignalized or signalized intersections, are not the most effective way of passing vehicles through an intersection if the vehicles are intelligent. To this end, Dresner and Stone proposed a new intersection control model called Autonomous Intersection Management (AIM). In the AIM simulation, they examined the problem from a multi-agent perspective, demonstrating that intelligent intersection control can be made more efficient than existing control mechanisms. In this study, autonomous intersection management has been investigated. We extended their work and added a potential-based lane organization layer. In order to distribute vehicles evenly across lanes, this layer triggers vehicles to analyze nearby lanes and change lanes if another lane offers an advantage. We can observe this behavior in real life: drivers change lanes based on intuition, and the basic intuition for selecting the correct lane is to pick a less crowded lane in order to reduce delay. We model this behavior without any change to the AIM workflow. Experimental results show that intersection performance is directly connected to the distribution of vehicles across the lanes of the roads entering the intersection. We see the advantage of handling lane management with a potential-based approach in performance metrics such as average intersection delay and average travel time. Therefore, lane management and intersection management are problems that need to be handled together. This study shows that the lane through which vehicles enter the intersection is an effective parameter for intersection management; our study draws attention to this parameter and suggests a solution for it. We observed that regulating the AIM inputs, i.e., the vehicles in the lanes, contributes effectively to intersection management. The PLO-AIM model outperforms AIM in evaluation metrics such as average intersection delay and average travel time for reasonable traffic rates, between 600 and 1300 vehicles/hour per lane. The proposed model reduced the average travel time by 0.2% - 17.3% and the average intersection delay by 1.6% - 17.1% for 4-lane and 6-lane scenarios.
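
The abstract describes the lane-organization layer only qualitatively. As an assumption-laden sketch of one plausible potential function (not the authors' published PLO-AIM formulation), a vehicle could switch to an adjacent lane whose occupancy-based potential is lower by some margin:

```python
# Illustrative sketch of potential-based lane selection before an intersection.
# The potential function and switching threshold are assumptions; the paper's
# exact PLO-AIM formulation may differ.

def lane_potential(queue_length, mean_speed, w_queue=1.0, w_speed=0.5):
    """Higher potential = less attractive lane (longer queue, slower traffic)."""
    return w_queue * queue_length - w_speed * mean_speed

def choose_lane(current, lanes, switch_gain=1.0):
    """Switch to an adjacent lane only if its potential is lower by a margin,
    mimicking a driver's intuition to pick the less crowded lane."""
    best = current
    for cand in (current - 1, current + 1):          # only adjacent lanes
        if 0 <= cand < len(lanes):
            if lane_potential(**lanes[cand]) + switch_gain < lane_potential(**lanes[best]):
                best = cand
    return best

lanes = [
    {"queue_length": 8, "mean_speed": 6.0},   # lane 0: crowded
    {"queue_length": 3, "mean_speed": 10.0},  # lane 1: light traffic
    {"queue_length": 5, "mean_speed": 9.0},   # lane 2
]
print(choose_lane(current=0, lanes=lanes))    # -> 1 (moves to the emptier lane)
```

The switching margin prevents vehicles from oscillating between lanes of nearly equal potential, which is the kind of regulation of AIM inputs the abstract refers to.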

Keywords: AIM project, autonomous intersection management, lane organization, potential-based approach

Procedia PDF Downloads 139
8413 Plant Layout Analysis by Computer Simulation for Electronic Manufacturing Service Plant

Authors: D. Visuwan, B. Phruksaphanrat

Abstract:

In this research, computer simulation is used for Electronic Manufacturing Service (EMS) plant layout analysis. The current layout of this manufacturing plant is a process layout, which is not suitable due to the nature of an EMS that has high-volume and high-variety environment. Moreover, quick response and high flexibility are also needed. Then, cellular manufacturing layout design was determined for the selected group of products. Systematic layout planning (SLP) was used to analyse and design the possible cellular layouts for the factory. The cellular layout was selected based on the main criteria of the plant. Computer simulation was used to analyse and compare the performance of the proposed cellular layout and the current layout. It is found that the proposed cellular layout can generate better performances than the current layout.

Keywords: layout, electronic manufacturing service plant, computer simulation, cellular manufacturing system

Procedia PDF Downloads 305
8412 Linkages between Postponement Strategies and Flexibility in Organizations

Authors: Polycarpe Feussi

Abstract:

Globalization, technological change, and increasing customer demands, amongst other drivers, result in higher levels of uncertainty and unpredictability for organizations. In order to cope with this uncertain and fast-changing economic and business environment, organizations need to innovate in order to achieve flexibility. In simple terms, they must develop strategies that provide horizontal information connections across the supply chain, so as to create and deliver products that meet customer needs by synchronizing customer demand with product creation. The information generated creates efficiency and effectiveness throughout the whole supply chain regarding production, storage, and distribution, as well as eliminating redundant activities and reducing response time. In an integrated supply chain, spanning activities include coordination with distributors and suppliers. This paper explains how flexibility can be achieved in an organization through postponement strategies. To achieve the above, a thorough literature review was conducted by searching online sources that contain material from scientific journal databases, articles, and textbooks on the subject of postponement and flexibility. The findings of the research are presented in the last part of the paper; the first part introduces the concept of postponement and its importance in supply chain management, and the second part describes the methodology used in writing the paper.

Keywords: postponement strategies, supply chain management, flexibility, logistics

Procedia PDF Downloads 193
8411 Measuring the Biomechanical Effects of Worker Skill Level and Joystick Crane Speed on Forestry Harvesting Performance Using a Simulator

Authors: Victoria L. Chester, Usha Kuruganti

Abstract:

The forest industry is a major economic sector of Canada and also one of the most dangerous industries for workers. The use of mechanized mobile forestry harvesting machines has successfully reduced the incidence of injuries related to manual labor in forest workers. However, these machines have also created additional concerns, including a steep machine-operation learning curve, an increased length of the workday, repetitive strain injury, cognitive load, physical and mental fatigue, and increased postural loads due to sitting in a confined space. It is critical to obtain objective performance data for employers to develop appropriate work practices for this industry; however, ergonomic field studies of this industry are lacking, mainly because of the difficulty of obtaining comprehensive data while operators are cutting trees in the woods. The purpose of this study was to establish a measurement and experimental protocol to examine the effects of worker skill level and movement training speed (joystick crane speed) on harvesting performance using a forestry simulator. A custom wrist angle measurement device was developed as part of the study to monitor Euler angles during operation of the simulator. The device consisted of two accelerometers, a Bluetooth module, three 3 V coin cells, a microcontroller, a voltage regulator, and application software. Harvesting performance and crane data were provided by the simulator software and included tree-to-frame collisions, crane-to-tree collisions, boom tip distance, number of trees cut, etc. A pilot study with 3 operators of various skill levels was conducted to identify factors that distinguish highly skilled operators from novice or intermediate operators. Dependent variables such as reaction time, math skill, past work experience, training movement speed (e.g., joystick control speeds), harvesting experience level, muscle activity, and wrist biomechanics were measured and analyzed. A 10-channel wireless surface EMG system was used to monitor the amplitude and mean frequency of 10 upper extremity muscles before and after performance on the forestry harvest simulator. The results of the pilot study showed inconsistent changes in median frequency pre- and post-operation, but there was an increase in the activity of the flexor carpi radialis, anterior deltoid, and upper trapezius of both arms. The wrist sensor results indicated that wrist supination and pronation occurred more than flexion and extension, with radial-ulnar rotation demonstrating the least movement. Overall, wrist angular motion increased as the crane speed increased from slow to fast. Further data collection is needed and will help industry partners determine the factors that separate operator skill levels, identify optimal training speeds, and determine the length of training required to bring new operators to an efficient skill level. In addition to effective employee training programs, the results of this work will be used for selective employee recruitment strategies to improve employee retention after training. Further, improved training procedures and knowledge of the physical and mental demands on workers will lead to highly trained and efficient personnel, a reduced risk of injury, and optimal work protocols.
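
The custom wrist sensor is described only at the component level. Purely as an illustrative sketch (assumed axis conventions and static postures only, not the device's actual firmware), wrist angles could be approximated from the two accelerometer readings as the difference between the tilt of the hand segment and the tilt of the forearm segment:

```python
# Purely illustrative: static-tilt estimate of wrist angles from two
# accelerometers (one on the forearm, one on the hand). Axis conventions,
# signs, and the dynamic case are simplifications of the custom device.
import math

def tilt_angles(ax, ay, az):
    """Return (pitch, roll) in degrees from a static accelerometer reading (in g)."""
    pitch = math.degrees(math.atan2(ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

def wrist_angles(forearm_accel, hand_accel):
    """Relative segment orientation ~ flexion/extension and radial-ulnar deviation."""
    fp, fr = tilt_angles(*forearm_accel)
    hp, hr = tilt_angles(*hand_accel)
    return hp - fp, hr - fr

# Example: forearm level, hand flexed slightly downward
print(wrist_angles((0.00, 0.0, 1.0), (0.26, 0.0, 0.97)))  # ~ (15 deg, 0 deg)
```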

Keywords: EMG, forestry, human factors, wrist biomechanics

Procedia PDF Downloads 145
8410 Competition between Regression Technique and Statistical Learning Models for Predicting Credit Risk Management

Authors: Chokri Slim

Abstract:

The objective of this research is to answer the following question: is there a significant difference between a regression model and statistical learning models in predicting credit risk? A multiple linear regression (MLR) model was compared with neural networks, including the multi-layer perceptron (MLP), and with support vector regression (SVR). The population of this study includes 50 banks listed on the Tunis Stock Exchange (TSE) from 2000 to 2016. Firstly, we show the factors that have a significant effect on the quality of the loan portfolios of banks in Tunisia. Secondly, we attempt to establish that the systematic use of objective techniques and methods designed to apprehend and assess risk when considering applications for granting credit has a positive effect on the quality of the loan portfolios of banks and their future collectability. Finally, we try to show that bank governance has an impact on the choice of methods and techniques for analyzing and measuring the risks inherent in the banking business, including the risk of non-repayment. The results of empirical tests confirm our claims.
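
The abstract names the three model families but not the implementation. A minimal, hedged sketch of such a comparison (scikit-learn on synthetic data, since the TSE bank panel and the study's actual features and tuning are not available here) could look like this:

```python
# Hedged illustration of the MLR vs. MLP vs. SVR comparison on synthetic data;
# the study's actual features, targets, and hyperparameter tuning are not reproduced.
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

X, y = make_regression(n_samples=850, n_features=8, noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "MLR": LinearRegression(),
    "MLP": make_pipeline(StandardScaler(),
                         MLPRegressor(hidden_layer_sizes=(32, 16),
                                      max_iter=2000, random_state=0)),
    "SVR": make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0)),
}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
    print(f"{name}: test RMSE = {rmse:.2f}")
```

Held-out error on the same split is what allows a like-for-like statement about whether the statistical learning models actually outperform the regression baseline.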

Keywords: credit risk management, multiple linear regression, principal components analysis, artificial neural networks, support vector machines

Procedia PDF Downloads 150
8409 Understanding the Productivity Effect on Industrial Management: The Portuguese Wood Furniture Industry Case Study

Authors: Jonas A. R. H. Lima, Maria Antonia Carravilla

Abstract:

As productivity concepts are widely related to industrial savings, it is becoming particularly important, in an ever more competitive world, to understand how productivity can be used well in industrial management. Nowadays, consumers are no longer willing to pay for mistakes and inefficiencies. Therefore, one way for companies to stay competitive is to control and increase their productivity. This study aims to define the productivity concept clearly, understand how a company can affect productivity, and, if possible, identify the relation between the identified productivity factors. This will help managers by clarifying the main issues behind productivity concepts and by proposing a methodology to measure, control, and increase productivity. The main questions to be answered are: what is the importance of productivity for the Portuguese wood furniture industry? Is it possible to control productivity internally, or is it a phenomenon external to companies, hard or even impossible to control? How can productivity performance be understood, controlled, and adjusted? How can productivity become a main asset for maximizing the use of the available resources? This essay follows a constructive approach based mostly on the research hypotheses mentioned above. For that, a literature review is being conducted to find the main existing conceptual frameworks and empirical studies and, by doing so, highlight possible knowledge gaps or conflicting research to be addressed in this work. We expect to build theoretical explanations and test theoretical predictions based on participants' understandings and their own experiences, by conducting field surveys and interviews, selecting adjusted productivity indicators, and analyzing the evolution of productivity according to adjustments in other variables. The intention is to conduct exploratory work that can simultaneously clarify productivity concepts and objectives and define frameworks. This investigation intends to move from merely academic concepts to the day-to-day operational reality of companies in the Portuguese wood furniture industry, highlighting the increased importance of productivity within modern engineering and industrial management. The ambition is to clarify, systematise, and develop a management tool that may not only control but also positively influence the way resources are used.

Keywords: industrial management, motivation, productivity, performance indicators, reward management, wood furniture industry

Procedia PDF Downloads 229
8408 Sparse Representation Based Spatiotemporal Fusion Employing Additional Image Pairs to Improve Dictionary Training

Authors: Dacheng Li, Bo Huang, Qinjin Han, Ming Li

Abstract:

Remotely sensed imagery with both high spatial and high temporal resolution, which is hard to acquire with current land observation satellites, has been considered a key factor for monitoring environmental changes at both global and local scales. On the basis of the limited high spatial-resolution observations, studies on spatiotemporal fusion have been developed for generating images with high spatiotemporal resolution by employing auxiliary low spatial-resolution data acquired with high observation frequency. However, a majority of spatiotemporal fusion approaches rely on restrictive assumptions or empirical but unstable parameters, and suffer from low accuracy or inefficient performance. Although spatiotemporal fusion based on sparse representation theory has advantages in capturing reflectance changes, stability, and execution efficiency (even more so when overcomplete dictionaries have been pre-trained), the retrieval of a high-accuracy dictionary and its effect on fusion results are still pending issues. In this paper, we employ additional image pairs (here, each image pair includes a Landsat Operational Land Imager acquisition and a Moderate Resolution Imaging Spectroradiometer acquisition covering part of Baotou, China) only in the coupled dictionary training process based on the K-SVD (K-means Singular Value Decomposition) algorithm, and attempt to improve the fusion results of two existing sparse representation based fusion models (utilizing one and two available image pairs, respectively). The results show that more eligible image pairs are probably related to a more accurate overcomplete dictionary, which generally indicates a better image representation and thus contributes to effective fusion performance, provided that the added image pair has seasonal aspects and spatial structure features similar to the original image pair. It is, therefore, reasonable to construct a multi-dictionary training pattern for generating a series of high spatial-resolution images based on limited acquisitions.
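
The paper's coupled K-SVD training cannot be reproduced from the abstract alone. As a rough illustration of the underlying idea only (learning an overcomplete dictionary from image patches and reconstructing them with sparse codes), a scikit-learn sketch might look like the following; the patch size, number of atoms, and sparsity level are assumptions, and sklearn's DictionaryLearning stands in for K-SVD.

```python
# Rough illustration of sparse dictionary learning on image patches.
# scikit-learn's DictionaryLearning stands in for the K-SVD algorithm used in
# the paper; patch size, atom count, and sparsity level are assumptions.
import numpy as np
from sklearn.decomposition import DictionaryLearning
from sklearn.feature_extraction.image import extract_patches_2d

rng = np.random.default_rng(0)
image = rng.random((64, 64))                       # stand-in for a Landsat/MODIS band

patches = extract_patches_2d(image, (8, 8), max_patches=500, random_state=0)
X = patches.reshape(len(patches), -1)
mean = X.mean(axis=1, keepdims=True)
X_centered = X - mean                              # remove patch means before coding

dico = DictionaryLearning(n_components=128,        # overcomplete: 128 atoms for 64-dim patches
                          transform_algorithm="omp",
                          transform_n_nonzero_coefs=5,
                          random_state=0)
codes = dico.fit_transform(X_centered)             # sparse codes, one row per patch
atoms = dico.components_                           # learned overcomplete dictionary

recon = (codes @ atoms + mean).reshape(patches.shape)
# With coupled coarse/fine dictionaries, the same sparse codes computed on the
# low-resolution patches would be applied to the high-resolution dictionary to
# predict the missing fine-resolution image.
```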

Keywords: spatiotemporal fusion, sparse representation, K-SVD algorithm, dictionary learning

Procedia PDF Downloads 260
8407 Effect of Different Factors on Temperature Profile and Performance of an Air Bubbling Fluidized Bed Gasifier for Rice Husk Gasification

Authors: Dharminder Singh, Sanjeev Yadav, Pravakar Mohanty

Abstract:

In this work, the temperature profile of a pilot-scale air bubbling fluidized bed (ABFB) gasifier for rice husk gasification was studied. The effects of different factors, such as multiple cyclones, the gas cooling system, the vent gas pipe length, and the catalyst, on the temperature profile were examined. The ABFB gasifier used in this study had two sections: a bed section and a freeboard section. River sand was used as bed material, air as the gasification agent, and conventional charcoal as the start-up heating medium. The temperature at different points in both sections of the ABFB gasifier was recorded at different ER values; the ER value was changed by changing the feed rate of biomass (rice husk) while keeping the air flow rate constant, for long-duration gasifier operation. The ABFB configuration with a double cyclone, a gas cooling system, and a short vent gas pipe was found to be the optimal gasifier design to give the temperature profile required for high gasification performance in long-duration operation. This optimal design was tested at different ER values, and an ER of 0.33 was found to be the most favourable for long-duration operation (8 h continuous operation), giving the highest carbon conversion efficiency. At the optimal ER of 0.33, the bed temperature was stable at 700 °C, the above-bed temperature was 628.63 °C, the bottom of the freeboard was at 600 °C, the top of the freeboard was at 517.5 °C, the gas temperature was 195 °C, and the flame temperature was 676 °C. Temperatures at all points showed fluctuations of 10 - 20 °C. The effect of a catalyst, i.e., dolomite (20% with the sand bed), on the temperature profile was also examined: at the optimal ER of 0.33, the bed temperature increased to 795 °C, the above-bed temperature decreased to 523 °C, the bottom of the freeboard decreased to 548 °C, the top of the freeboard decreased to 475 °C, the gas temperature decreased to 220 °C, and the flame temperature increased to 703 °C. The increase in bed temperature leads to a higher flame temperature due to the presence of more hydrocarbons generated from greater tar cracking at the higher temperature. It was also found that the use of dolomite with the sand bed eliminated agglomeration in the reactor at such a high bed temperature (795 °C).

Keywords: air bubbling fluidized bed gasifier, bed temperature, charcoal heating, dolomite, flame temperature, rice husk

Procedia PDF Downloads 278
8406 A Parallel Implementation of k-Means in MATLAB

Authors: Dimitris Varsamis, Christos Talagkozis, Alkiviadis Tsimpiris, Paris Mastorocostas

Abstract:

The aim of this work is the parallel implementation of k-means in MATLAB in order to reduce the execution time. Specifically, a new MATLAB function for the serial k-means algorithm is developed which meets all the requirements for conversion to a MATLAB function with parallel computations. Additionally, two different variants for the definition of the initial values are presented. In the sequel, the parallel approach is presented. Finally, performance tests of the computation times with respect to the number of features and classes are illustrated.
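
As a language-agnostic illustration of the idea behind the paper (the assignment step of k-means is the embarrassingly parallel part and can be split across workers), a Python sketch using joblib is given below. This is not the authors' MATLAB implementation; the chunking scheme and worker count are assumptions.

```python
# Illustrative sketch of the parallelisable part of k-means: the assignment
# step is split across workers, the update step stays serial.
import numpy as np
from joblib import Parallel, delayed

def assign_chunk(chunk, centroids):
    # squared distance of every point in the chunk to every centroid, pick nearest
    d = ((chunk[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

def parallel_kmeans(X, k, n_iter=20, n_jobs=4, seed=0):
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    chunks = np.array_split(X, n_jobs)
    for _ in range(n_iter):
        labels = np.concatenate(
            Parallel(n_jobs=n_jobs)(
                delayed(assign_chunk)(c, centroids) for c in chunks))
        for j in range(k):                      # cheap update step, done serially
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

X = np.vstack([np.random.default_rng(1).normal(m, 0.5, size=(200, 3)) for m in (0, 5, 10)])
centroids, labels = parallel_kmeans(X, k=3)
print(np.round(centroids, 1))
```

The speed-up grows with the number of samples and features, which is consistent with the paper's focus on computation times versus the numbers of features and classes.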

Keywords: K-means algorithm, clustering, parallel computations, Matlab

Procedia PDF Downloads 385
8405 Decision Making Communication in the Process of Technologies Commercialization: Archival Analysis of the Process Content

Authors: Vaida Zemlickiene

Abstract:

Scientists and practitioners around the world are working to identify the factors that influence the results of technology commercialization and to propose an ideal model for the technology commercialization process. In other words, all stakeholders in technology commercialization seek a formula or set of rules for commercializing technologies successfully, in order to avoid unproductive investments. In this article, the process of technology commercialization is understood as the process of transforming inventions into marketable products, services, and processes, or the path from the idea of using an invention to a product, covering technology readiness levels (TRL) 1 to 9. There are many publications in the management literature aimed at managing the commercialization process. However, there is an apparent lack of research on communication in decision-making in the process of technology commercialization. Analysis of work done in the past and of global research over the last decade leads to the unambiguous conclusion that the methodological framework is not mature enough to be of practical use in business. The process of technology commercialization and the decisions made in the process should therefore be explored in depth. An archival analysis is performed to gain insights into decision-making communication in the process of technology commercialization, to establish the content of the technology commercialization process (decision-making stages and participants), to analyze the internal factors of technology commercialization, to perform their critical analysis, and to analyze the concept of successful/unsuccessful technology commercialization.

Keywords: the process of technology commercialization, communication in decision-making process, the content of technology commercialization process, successful/unsuccessful technology commercialization

Procedia PDF Downloads 153
8404 Trip Reduction in Turbo Machinery

Authors: Pranay Mathur, Carlo Michelassi, Simi Karatha, Gilda Pedoto

Abstract:

Industrial plant uptime is of utmost importance for reliable, profitable, and sustainable operation. Trips and failed starts have a major impact on plant reliability, and all plant operators focus their efforts on minimising trips and failed starts. The performance of these CTQs is measured with two metrics: MTBT (mean time between trips) and SR (starting reliability). These metrics help identify the top failure modes and the units that need more effort to improve plant reliability. The Baker Hughes trip reduction program is structured to reduce these unwanted trips through: (1) real-time machine operational parameters available remotely, capturing the signature of malfunctions including the related boundary conditions; (2) a real-time, analytics-based alerting system available remotely; (3) remote access to trip logs and alarms from the control system to identify the cause of events; (4) continuous support to field engineers by remotely connecting them with subject matter experts; (5) live tracking of key CTQs; (6) benchmarking against the fleet; (7) breaking failures down to the causal component level; (8) investigating top contributors and identifying design and operational root causes; (9) implementing corrective and preventive actions; (10) assessing the effectiveness of implemented solutions using reliability growth models; and (11) developing analytics for predictive maintenance. With this approach, the Baker Hughes team is able to support customers in achieving their reliability key performance indicators for monitored units, with large cost savings for plant operators. This presentation explains the approach and provides successful case studies, in particular where 12 LNG and pipeline operators with about 140 gas compression line-ups have adopted these techniques, significantly reducing the number of trips and improving MTBT.
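
The abstract defines the two CTQ metrics only by name. A minimal sketch of how they are typically computed from operating records is shown below; the field names and figures are illustrative assumptions, not Baker Hughes' actual data model.

```python
# Illustrative computation of the two CTQ metrics named in the abstract.
# Definitions follow common reliability practice; inputs are assumed examples.

def mean_time_between_trips(operating_hours, n_trips):
    """MTBT = cumulative operating hours / number of trips in that period."""
    return operating_hours / n_trips if n_trips else float("inf")

def starting_reliability(successful_starts, attempted_starts):
    """SR = successful starts / attempted starts, expressed as a percentage."""
    return 100.0 * successful_starts / attempted_starts

print(mean_time_between_trips(operating_hours=7200, n_trips=3))        # 2400.0 h
print(starting_reliability(successful_starts=46, attempted_starts=48)) # ~95.8 %
```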

Keywords: reliability, availability, sustainability, digital infrastructure, weibull, effectiveness, automation, trips, fail start

Procedia PDF Downloads 76
8403 Discipline-Specific Culture: A Purpose-Based Investigation

Authors: Sihem Benaouda

Abstract:

English is gaining an international identity as it affects every academic and professional field in the world. Without increasing their cultural understanding, it would obviously be difficult to fully educate learners for communication in a globalised environment. The concept of culture is intricate and needs to be elucidated, especially in an English language teaching (ELT) context. The study focuses on the cultural studies integrated into different types of English for specific purposes (ESP) materials, as opposed to English for general purposes (EGP) textbooks. A qualitative methodology based on a triangulation of techniques was adopted, combining materials analysis of five textbooks in both advanced EGP and three types of ESP with a semi-structured interview conducted with Algerian ESP practitioners. Data analysis revealed that culture in ESP textbooks is not overtly isolated into chapters and that cultural studies are predominantly present in business and economics materials, namely English for hotel and catering staff, tourism, and flight attendants. However, implicit cultural instruction is only signalled in the social sciences and is negligible in science and technology sources. In terms of content, cultural studies in EGP are more related to generic topics, whereas in some ESP materials the topics are oriented to the specific field they belong to. Furthermore, the respondents’ answers showed an unawareness of the importance of culture in ESP teaching, besides some disregard for culture teaching per se in ESP contexts.

Keywords: ESP, EGP, cultural studies, textbooks, teaching, materials

Procedia PDF Downloads 108
8402 Investigation of Contact Pressure Distribution at Expanded Polystyrene Geofoam Interfaces Using Tactile Sensors

Authors: Chen Liu, Dawit Negussey

Abstract:

EPS (expanded polystyrene) geofoam, a lightweight material used in geotechnical applications, is made of pre-expanded resin beads that form fused cellular micro-structures. The strength and deformation properties of geofoam blocks are determined by unconfined compression of small test samples between rigid loading plates. Applied loads are presumed to be supported uniformly over the entire mating end areas. Predictions of field performance on the basis of such laboratory tests widely overestimate actual post-construction settlements and exaggerate predictions of long-term creep deformations. This investigation examined the development of contact pressures at a large number of discrete points, at low and large strain levels, for different densities of geofoam. The development of pressure patterns for fine and coarse interface material textures, as well as for molding-skin and hot-wire-cut geofoam surfaces, was examined. The lab testing showed that I-Scan tactile sensors are useful for the detailed observation of contact pressures at a large number of discrete points simultaneously. At a low strain level (1%), the lower density EPS block presents low variation in localized stress distribution compared to higher density EPS. At a high strain level (10%), the dense geofoam reached the sensor cut-off limit. The imprint and pressure patterns for different interface textures can be distinguished with tactile sensing. The pressure sensing system can be used in many fields with real-time pressure detection. The research findings provide a better understanding of EPS geofoam behavior for the improvement of design methods and performance prediction of critical infrastructure, and are anticipated to guide future improvements in the design and rapid construction of critical transportation infrastructure with geofoam in geotechnical applications.

Keywords: geofoam, pressure distribution, tactile pressure sensors, interface

Procedia PDF Downloads 173
8401 Robustness of the Deep Chroma Extractor and Locally-Normalized Quarter Tone Filters in Automatic Chord Estimation under Reverberant Conditions

Authors: Luis Alvarado, Victor Poblete, Isaac Gonzalez, Yetzabeth Gonzalez

Abstract:

In MIREX 2016 (http://www.music-ir.org/mirex), the deep neural network (DNN)-based Deep Chroma Extractor, proposed by Korzeniowski and Widmer, reached the highest score in an audio chord recognition task. In the present paper, this tool is assessed under reverberant acoustic environments and distinct source-microphone distances. The evaluation dataset comprises The Beatles and Queen datasets. These datasets are sequentially re-recorded with a single microphone in a real reverberant chamber at four reverberation times (approximately 0 s -anechoic-, 1, 2, and 3 s), as well as four source-microphone distances (32, 64, 128, and 256 cm). It is expected that the performance of the trained DNN will decrease dramatically under these acoustic conditions, with signals degraded by room reverberation and distance to the source. Recently, the effect of the bio-inspired Locally-Normalized Cepstral Coefficients (LNCC) has been assessed in a text-independent speaker verification task using speech signals degraded by additive noise at different signal-to-noise ratios with variations of recording distance, and it has also been assessed under reverberant conditions with variations of recording distance. LNCC showed performance as high as the state-of-the-art Mel Frequency Cepstral Coefficient filters. Based on these results, this paper proposes a variation of locally-normalized triangular filters called Locally-Normalized Quarter Tone (LNQT) filters. By using the LNQT spectrogram, robustness improvements of the trained Deep Chroma Extractor are expected, compared with classical triangular filters, thus compensating for the music signal degradation and improving the accuracy of the chord recognition system.
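
The LNQT filterbank itself is not specified in the abstract. As an assumption-laden sketch only, triangular filters centred on a quarter-tone frequency grid (24 divisions per octave) could be constructed as follows; the paper's exact LNQT definition, including the local normalisation step, is only indicated by a comment.

```python
# Illustrative construction of triangular filters on a quarter-tone grid
# (24 divisions per octave). Starting frequency, filter count, and FFT size
# are assumptions; the paper's local normalisation is not reproduced.
import numpy as np

def quarter_tone_centers(f_min=55.0, n_filters=96):
    """Centre frequencies spaced a quarter tone apart: f_k = f_min * 2**(k/24)."""
    return f_min * 2.0 ** (np.arange(n_filters + 2) / 24.0)

def triangular_filterbank(n_fft=4096, sr=22050, **kwargs):
    centers = quarter_tone_centers(**kwargs)
    freqs = np.linspace(0.0, sr / 2.0, n_fft // 2 + 1)
    bank = np.zeros((len(centers) - 2, len(freqs)))
    for i in range(1, len(centers) - 1):
        left, mid, right = centers[i - 1], centers[i], centers[i + 1]
        rising = (freqs - left) / (mid - left)
        falling = (right - freqs) / (right - mid)
        bank[i - 1] = np.clip(np.minimum(rising, falling), 0.0, None)
    # The "locally-normalized" step would rescale each filter output by the
    # energy in its spectral neighbourhood; omitted in this sketch.
    return bank

fb = triangular_filterbank()
print(fb.shape)   # (96, 2049) -> one row per quarter-tone filter
```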

Keywords: chord recognition, deep neural networks, feature extraction, music information retrieval

Procedia PDF Downloads 232
8400 Methods for Enhancing Ensemble Learning or Improving Classifiers of This Technique in the Analysis and Classification of Brain Signals

Authors: Seyed Mehdi Ghezi, Hesam Hasanpoor

Abstract:

This scientific article explores enhancement methods for ensemble learning with the aim of improving the performance of classifiers in the analysis and classification of brain signals. The research approach in this field consists of two main parts, each with its own strengths and weaknesses. The choice of approach depends on the specific research question and available resources. By combining these approaches and leveraging their respective strengths, researchers can enhance the accuracy and reliability of classification results, consequently advancing our understanding of the brain and its functions. The first approach focuses on utilizing machine learning methods to identify the best features among the vast array of features present in brain signals. The selection of features varies depending on the research objective, and different techniques have been employed for this purpose. For instance, the genetic algorithm has been used in some studies to identify the best features, while optimization methods have been utilized in others to identify the most influential features. Additionally, machine learning techniques have been applied to determine the influential electrodes in classification. Ensemble learning plays a crucial role in identifying the best features that contribute to learning, thereby improving the overall results. The second approach concentrates on designing and implementing methods for selecting the best classifier or utilizing meta-classifiers to enhance the final results in ensemble learning. In a different section of the research, a single classifier is used instead of multiple classifiers, employing different sets of features to improve the results. The article provides an in-depth examination of each technique, highlighting their advantages and limitations. By integrating these techniques, researchers can enhance the performance of classifiers in the analysis and classification of brain signals. This advancement in ensemble learning methodologies contributes to a better understanding of the brain and its functions, ultimately leading to improved accuracy and reliability in brain signal analysis and classification.
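
To make the two approaches concrete, a small hedged sketch combining feature selection with a stacking meta-classifier is given below (scikit-learn, with synthetic data standing in for brain-signal features); the studies surveyed use domain-specific features and classifiers, so this is an illustration of the pattern rather than any particular method from the article.

```python
# Hedged sketch of the two ideas in the abstract: (1) selecting informative
# features and (2) combining base classifiers under a meta-classifier.
# Synthetic data stands in for EEG/brain-signal features.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=400, n_features=60, n_informative=10,
                           random_state=0)

ensemble = make_pipeline(
    SelectKBest(f_classif, k=15),                        # keep the most informative features
    StackingClassifier(
        estimators=[("svm", SVC(probability=True, random_state=0)),
                    ("rf", RandomForestClassifier(random_state=0))],
        final_estimator=LogisticRegression(max_iter=1000)))  # meta-classifier

print(cross_val_score(ensemble, X, y, cv=5).mean())
```

A genetic algorithm or other optimizer could replace the univariate selector in the first stage; the stacking stage stays unchanged, which is what makes the two research directions complementary.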

Keywords: ensemble learning, brain signals, classification, feature selection, machine learning, genetic algorithm, optimization methods, influential features, influential electrodes, meta-classifiers

Procedia PDF Downloads 75
8399 Sustainable Design through up-Cycling Crafts in the Mainstream Fashion Industry of India

Authors: Avani Chhajlani

Abstract:

Fashion is considered one of the most destructive industries, second only to oil, in its impact on the environment. While fashion today banks upon fast fashion to generate a higher turnover of designs and patterns in apparel and related accessories, crafts push us towards a slower and more thoughtful approach, with culturally identifiable, unique work and slow, community-centered production. Despite this strong link between indigenous crafts and sustainability, it has not been extensively researched and explored. In the forthcoming years, the fashion industry will have to reinvent itself and move towards a more holistic and sustainable circular model to balance the harm already caused, and closed loops of the circular economy will help the integration of indigenous craft knowledge, which is regenerative. Though sustainability and the crafts of a region go hand in hand, craft still has to find its standing in the mainstream fashion world; craft practices have a strong local congruence and knowledge that has been passed down from generation to generation orally or through written materials. This paper aims to explore the ways a circular economy can be created by amalgamating fashion and craft while creating a sustainable business model, and how this is slowly being achieved today through brands such as RaasLeela, Pero, and KaSha, to name a few.

Keywords: circular economy, fashion, India, indigenous crafts, slow fashion, sustainability, up-cycling

Procedia PDF Downloads 187
8398 Enhancing the Flotation of Fine and Ultrafine Pyrite Particles Using Electrolytically Generated Bubbles

Authors: Bogale Tadesse, Krutik Parikh, Ndagha Mkandawire, Boris Albijanic, Nimal Subasinghe

Abstract:

It is well established that the floatability and selectivity of mineral particles are highly dependent on particle size. Generally, a particle size of 10 microns is considered the critical size below which both flotation selectivity and recovery decline sharply. It is widely accepted that the majority of ultrafine particles, including highly liberated valuable minerals, will be lost to tailings during a conventional flotation process. This is highly undesirable, particularly in the processing of finely disseminated complex and refractory ores, where fine grinding is required to liberate the valuable minerals. In addition, the continuing decline in ore grade worldwide necessitates intensive processing of low-grade mineral deposits. Recent advances in comminution allow the economic grinding of particles down to 10-micron sizes to enhance the probability of liberating locked minerals from low-grade ores. Thus, it is timely to improve the flotation of fine and ultrafine particles in order to reduce the amount of valuable minerals lost as slimes. It is believed that the use of fine bubbles in flotation increases the bubble-particle collision efficiency and hence the flotation performance. Electroflotation, where bubbles are generated by the electrolytic breakdown of water to produce oxygen and hydrogen gases, leads to the formation of extremely finely dispersed gas bubbles with dimensions varying from 5 to 95 microns. The sizes of bubbles generated by this method are significantly smaller than those found in conventional flotation (> 600 microns). In this study, microbubbles generated by the electrolysis of water were injected into a bench-top flotation cell to assess the performance of electroflotation in enhancing the flotation of fine and ultrafine pyrite particles of sizes ranging from 5 to 53 microns. The design of the cell and the results from the optimization of process variables such as current density, pH, percent solids, and particle size will be presented at this conference.

Keywords: electroflotation, fine bubbles, pyrite, ultrafine particles

Procedia PDF Downloads 335
8397 Comparison of Shell-Facemask Responses in American Football Helmets during NOCSAE Drop Tests

Authors: G. Alston Rush, Gus A. Rush III, M. F. Horstemeyer

Abstract:

This study compares the shell-facemask responses of four commonly used American football helmets under the National Operating Committee on Standards for Athletic Equipment (NOCSAE) drop impact test method, to show that the test standard would more accurately simulate in-use conditions if modified to include the facemask. In our study, the need for a more rigorous, systematic approach to football helmet testing procedures is emphasized by comparing the Head Injury Criterion (HIC), the Gadd Severity Index (SI), and peak acceleration values for different helmets at different locations on the helmet under modified NOCSAE standard drop tower tests. Drop tests were performed on the Rawlings Quantum Plus, Riddell 360, Schutt Ion 4D, and Xenith X2 helmets at eight impact locations, impact velocities of 5.46 and 4.88 meters per second, and helmet configurations with and without facemasks. Analysis of the NOCSAE drop test results reveals significant differences (p < 0.05) when the facemasks were attached to helmets, as compared to the NOCSAE standard configuration without the facemask. The boundary conditions of the facemask attachment can cause up to a 50% decrease (p < 0.001) in helmet performance with respect to peak acceleration. While all helmets with facemasks generally gave greater HIC, SI, and acceleration values than helmets without facemasks, significant helmet-dependent variations were observed across impact locations and impact velocities. The variations between helmet responses could be attributed to the unique design features of each helmet tested, which include different liners, chin strap attachments, and faceguard attachment systems. In summary, these comparative drop test results reveal that the current NOCSAE standard test methods need improvement by attaching the facemasks to helmets during testing. The modified NOCSAE football helmet standard test gives a more accurate representation of a helmet’s performance and its ability to mitigate on-field impacts.
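
For readers unfamiliar with the two injury metrics, a short sketch of how the Gadd Severity Index and HIC are computed from a resultant head acceleration trace (in g, sampled uniformly) is given below; the half-sine test pulse and the 15 ms HIC window are illustrative assumptions, not data from this study.

```python
# Illustrative computation of the Gadd Severity Index (SI) and the Head Injury
# Criterion (HIC) from a resultant acceleration trace a(t) in g.
import numpy as np

def gadd_si(a, dt):
    """SI = integral of a(t)**2.5 dt, with a in g and t in seconds."""
    return np.sum(a ** 2.5) * dt

def hic(a, dt, max_window=0.015):
    """HIC = max over windows of (t2 - t1) * (mean acceleration over the window)**2.5."""
    cum = np.concatenate(([0.0], np.cumsum(a) * dt))   # running integral of a(t)
    n = len(a)
    max_span = int(max_window / dt)
    best = 0.0
    for i in range(n):
        for j in range(i + 1, min(i + max_span, n) + 1):
            t = (j - i) * dt
            avg = (cum[j] - cum[i]) / t
            best = max(best, t * avg ** 2.5)
    return best

dt = 1e-4                                   # 10 kHz sampling
t = np.arange(0, 0.012, dt)
a = 150.0 * np.sin(np.pi * t / 0.012)       # 12 ms half-sine pulse peaking at 150 g
print(f"SI  = {gadd_si(a, dt):.0f}")
print(f"HIC = {hic(a, dt):.0f}")
```

Because both metrics raise acceleration to the 2.5 power, the roughly 50% change in peak acceleration reported with the facemask attached translates into much larger swings in HIC and SI.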

Keywords: football helmet testing, gadd severity index, head injury criterion, mild traumatic brain injury

Procedia PDF Downloads 447
8396 Deconstructing Abraham Maslow’s Hierarchy of Needs: A Comparison of Organizational Behaviour and Branding Perspectives

Authors: Satya Girish Goparaju

Abstract:

It is said that the pyramid of needs is not an invention of Maslow's but only a graphical representation of his theory. It is also interesting to note how business management schools have adopted this interpreted theory in organizational behavior and marketing subjects. Against this background, this article raises the point that the hierarchy of needs proposed by Abraham Maslow need not necessarily be represented as a pyramid; a linear model would be more suitable in the present times. To make this point, the article presents a comparative study of ‘self-actualization’ (the apex of the pyramid) in organizational behavior and branding contexts, respectively. The article tries to shed light on the original theory proposed by Maslow, which stated that self-actualization is attained through living one’s life completely and not by satisfying individual needs. Therefore, from an organizational behavior perspective, it can be argued that self-actualization is irrelevant, as an employee’s life is not the work, and satisfied needs in a workplace will only make the employee perform better. In the same way, a brand does not sell products to satisfy all the needs of a consumer and does not play a direct role in attaining self-actualization. For the purpose of this study, select employees of a branding agency will respond to a questionnaire, answering both as employees of an organization and as consumers of a global smartphone brand. This study aims to deconstruct the interpretations that have been widely accepted by both organizational behavior and branding professionals.

Keywords: branding, marketing, needs, organizational behavior, psychology

Procedia PDF Downloads 230