Search results for: technological process
15856 Intelligent Production Machine
Authors: A. Şahinoğlu, R. Gürbüz, A. Güllü, M. Karhan
Abstract:
This study aims to develop a production machine that automatically perceives cutting data and adjusts the cutting parameters accordingly. The two most important parameters the machine control unit has to check are the feed rate and the spindle speed, and the aim is to control them through the sound of the machine. The features of the optimum sound are first introduced to the computer. During machining, real-time sound data are acquired and converted into numerical values by Matlab software. According to these values, the feed rate and speed are decreased or increased at a certain rate until the optimum sound is obtained, so that cutting proceeds at the optimum cutting parameters. During chip removal, the features of the cutting tool, the kind of material being cut, the cutting parameters, and the machine used all affect various process parameters. Instead of measuring quantities such as temperature, vibration, and tool wear that emerge during cutting, a detailed analysis of the sound emitted during the process provides a much easier and more economical way of detecting such information. The relation between the cutting parameters and the sound is thereby identified.
Keywords: cutting process, sound processing, intelligent lathe, sound analysis
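The abstract does not give the control law itself; a minimal sketch of such a sound-driven feedback rule, with a single scalar sound feature and a hypothetical step size and tolerance (both assumptions for illustration, not the authors' values), might look like this:

```python
def adjust_parameters(sound_level, optimum, feed, speed,
                      step=0.02, tol=0.05):
    """Illustrative feedback rule: compare the measured sound feature
    with the optimum value and nudge the feed rate and spindle speed
    by a fixed fraction. The scalar 'sound_level' feature, the step
    size and the tolerance are assumptions for illustration only."""
    error = (sound_level - optimum) / optimum
    if abs(error) <= tol:
        return feed, speed                  # sound already near optimum
    # louder than optimum -> slow down; quieter -> speed up
    factor = 1 - step if error > 0 else 1 + step
    return feed * factor, speed * factor
```

Called once per acquired sound sample, such a rule walks the feed rate and speed toward values whose sound matches the stored optimum.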
Procedia PDF Downloads 334
15855 Wait-Optimized Scheduler Algorithm for Efficient Process Scheduling in Computer Systems
Authors: Md Habibur Rahman, Jaeho Kim
Abstract:
Efficient process scheduling is a crucial factor in ensuring optimal system performance and resource utilization in computer systems. While various algorithms have been proposed over the years, there are still limitations to their effectiveness. This paper introduces a new Wait-Optimized Scheduler (WOS) algorithm that aims to minimize process waiting time by dividing processes into two layers and considering both process time and waiting time. The WOS algorithm is non-preemptive and prioritizes processes with the shortest WOS value. In the first layer, each process runs for a predetermined duration, and any unfinished process is subsequently moved to the second layer, resulting in a decrease in response time. Whenever the first layer is free or the number of processes in the second layer is twice that of the first layer, the algorithm sorts all the processes in the second layer based on their remaining time minus waiting time and sends one process to the first layer to run. This ensures that all processes eventually run, optimizing waiting time. To evaluate the performance of the WOS algorithm, we conducted experiments comparing it with traditional scheduling algorithms such as First-Come-First-Serve (FCFS) and Shortest-Job-First (SJF). The results showed that the WOS algorithm outperformed the traditional algorithms in reducing the waiting time of processes, particularly in scenarios with a large number of short tasks with long wait times. Our study highlights the effectiveness of the WOS algorithm in improving process scheduling efficiency in computer systems. By reducing process waiting time, the WOS algorithm can improve system performance and resource utilization.
The findings of this study provide valuable insights for researchers and practitioners in developing and implementing efficient process scheduling algorithms.
Keywords: process scheduling, wait-optimized scheduler, response time, non-preemptive, waiting time, traditional scheduling algorithms, first-come-first-serve, shortest-job-first, system performance, resource utilization
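The abstract describes the two-layer mechanics but not the code; one possible reading of the rules (a fixed first-layer run duration, promotion from layer 2 sorted by remaining time minus waiting time) can be sketched as follows. The quantum length and the tie-breaking are assumptions, not the authors' parameters:

```python
from dataclasses import dataclass

@dataclass
class Proc:
    name: str
    burst: int          # total CPU time required
    remaining: int = 0  # time still needed
    waiting: int = 0    # accumulated waiting time

def wos_schedule(procs, quantum=2):
    """Two-layer WOS-style scheduler sketch (non-preemptive slices).

    Layer 1: each process runs for at most `quantum` time units;
    unfinished processes drop to layer 2. Whenever layer 1 is empty,
    or layer 2 holds at least twice as many processes as layer 1,
    layer 2 is sorted by (remaining - waiting) and its head is
    promoted back to layer 1. Returns {name: finish_time}.
    """
    for p in procs:
        p.remaining, p.waiting = p.burst, 0
    layer1, layer2 = list(procs), []
    clock, finish = 0, {}
    while layer1 or layer2:
        if (not layer1 or len(layer2) >= 2 * len(layer1)) and layer2:
            layer2.sort(key=lambda p: p.remaining - p.waiting)
            layer1.append(layer2.pop(0))
        p = layer1.pop(0)
        run = min(quantum, p.remaining)
        for q in layer1 + layer2:   # everyone else accumulates wait
            q.waiting += run
        clock += run
        p.remaining -= run
        if p.remaining == 0:
            finish[p.name] = clock
        else:
            layer2.append(p)
    return finish
```

For example, with processes A (burst 4) and B (burst 2) and a quantum of 2, A is demoted once, B finishes at time 4, and A finishes at time 6.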
Procedia PDF Downloads 91
15854 The Impact of the Business Process Reengineering on the Practices of the Human Resources Management in the Franco Tunisian Company-Network
Authors: Nesrine Bougarech, Habib Affes
Abstract:
This research lays the emphasis on business process reengineering (BPR), which consists in radically altering organizational processes through the optimal use of information technology (IT) to attain major enhancements in terms of quality, performance and productivity. A survey of business process reengineering (BPR) was carried out in three French groups and their subsidiaries in Tunisia. The data collected were qualitatively analyzed in an attempt to test the main indicators of the success of a business process reengineering (BPR) project and to compare the importance of these indicators in the context of France versus Tunisia. The study corroborates that respect of the principles inherent in business process reengineering (BPR) and the diversity of the human resources involved in the project can lead to better productivity, higher quality of the goods or services and lower cost. Additionally, our results mirror the extent to which the respect of the principles and the diversity of resources are more important in the French companies than in their Tunisian subsidiaries.
Keywords: business process reengineering (BPR), human resources management (HRM), information technology (IT), management
Procedia PDF Downloads 408
15853 A Systematic Review of Process Research in Software Engineering
Authors: Tulasi Rayasa, Phani Kumar Pullela
Abstract:
A systematic review is a research method that involves collecting and evaluating information on a specific topic in order to provide a comprehensive and unbiased review. This type of review aims to improve the software development process by ensuring that the research is thorough and accurate. To ensure objectivity, it is important to follow systematic guidelines and consider multiple sources, such as literature reviews, interviews, and surveys. The evaluation process should also be streamlined by incorporating research from journals and other sources, such as grey literature. The main goal of this systematic review is to identify the consistency of current process models in the fields of computer applications and software engineering.
Keywords: computer application, software engineering, process research, data science
Procedia PDF Downloads 99
15852 Window Analysis and Malmquist Index for Assessing Efficiency and Productivity Growth in a Pharmaceutical Industry
Authors: Abbas Al-Refaie, Ruba Najdawi, Nour Bata, Mohammad D. AL-Tahat
Abstract:
The pharmaceutical industry is an important component of health care systems throughout the world. Measurement of a production unit's performance is crucial in determining whether it has achieved its objectives or not. This paper applies data envelopment analysis (DEA) window analysis to assess the efficiencies of two packaging lines, Allfill (new) and DP6, in the Penicillin plant of a Jordanian medical company in 2010. The CCR and BCC models are used to estimate the technical efficiency, pure technical efficiency, and scale efficiency. Further, the Malmquist productivity index is computed and employed to assess productivity growth relative to a reference technology. Two primary issues are addressed in the computation of Malmquist indices of productivity growth. The first is the measurement of productivity change over the period, while the second is the decomposition of changes in productivity into what are generally referred to as a 'catching-up' effect (efficiency change) and a 'frontier shift' effect (technological change). Results showed that the DP6 line outperforms Allfill in technical and pure technical efficiency, whereas the Allfill line outperforms DP6 in scale efficiency. The obtained efficiency values can guide production managers in taking effective decisions related to operation, management, and plant size. Moreover, both machines exhibit clear fluctuations in technological change, which is the main reason for the positive total factor productivity change; that is, installing a new Allfill production line can be of great benefit to increasing productivity. In conclusion, DEA window analysis combined with the Malmquist index is a supportive measure for assessing efficiency and productivity in the pharmaceutical industry.
Keywords: window analysis, Malmquist index, efficiency, productivity
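The catching-up/frontier-shift decomposition mentioned above is not written out in the abstract; in the standard output-oriented form used in the DEA literature, with $D^{t}$ the distance function evaluated against the period-$t$ frontier, it reads:

```latex
M\left(x^{t+1},y^{t+1},x^{t},y^{t}\right)
= \left[
    \frac{D^{t}\!\left(x^{t+1},y^{t+1}\right)}{D^{t}\!\left(x^{t},y^{t}\right)}
    \cdot
    \frac{D^{t+1}\!\left(x^{t+1},y^{t+1}\right)}{D^{t+1}\!\left(x^{t},y^{t}\right)}
  \right]^{1/2}
= \underbrace{\frac{D^{t+1}\!\left(x^{t+1},y^{t+1}\right)}{D^{t}\!\left(x^{t},y^{t}\right)}}_{\text{efficiency change (catching up)}}
  \times
  \underbrace{\left[
    \frac{D^{t}\!\left(x^{t+1},y^{t+1}\right)}{D^{t+1}\!\left(x^{t+1},y^{t+1}\right)}
    \cdot
    \frac{D^{t}\!\left(x^{t},y^{t}\right)}{D^{t+1}\!\left(x^{t},y^{t}\right)}
  \right]^{1/2}}_{\text{frontier shift (technological change)}}
```

A value of $M > 1$ indicates total factor productivity growth between periods $t$ and $t+1$, and the two factors attribute it to the unit moving closer to the frontier versus the frontier itself shifting.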
Procedia PDF Downloads 609
15851 Seismic Performance Evaluation of Existing Building Using Structural Information Modeling
Authors: Byungmin Cho, Dongchul Lee, Taejin Kim, Minhee Lee
Abstract:
The procedure for the seismic retrofit of existing buildings includes a seismic evaluation step, in which it is assessed whether the buildings have satisfactory performance against seismic load; based on the results, the buildings are upgraded. To evaluate the seismic performance of a building, the model usually has to be transformed from an elastic analysis model to an inelastic one. However, when the data are not transferred automatically between the two models, engineers must input them manually; since this leads to inaccuracy and loss of information, the results of the analysis become less accurate. Therefore, in this study, a process for the seismic evaluation of existing buildings using structural information modeling is suggested, which makes the work economical and accurate. To this end, it is determined which parts of the process can be computerized through an investigation of the seismic evaluation process based on ASCE 41. The structural information modeling process is developed for seismic evaluation using the Perform 3D program, which is commonly used for nonlinear response history analysis. To validate the process, the seismic performance of an existing building is investigated.
Keywords: existing building, nonlinear analysis, seismic performance, structural information modeling
Procedia PDF Downloads 384
15850 Understanding Tacit Knowledge and DIKW
Authors: Bahadir Aydin
Abstract:
Today it is difficult to reach accurate knowledge because of the mass of data, which makes the environment more and more chaotic. Data is a main pillar of intelligence, and there is a close tie between knowledge and intelligence. Information gathered from different sources can be modified, interpreted and classified by using a knowledge development process, which is applied in order to attain intelligence. Within this process, the effect of knowledge is crucial. Knowledge is classified as explicit or tacit knowledge. Tacit knowledge can be seen as only the tip of the iceberg, and it accounts for much more of the intelligence cycle than we tend to assume. If the concept of intelligence is scrutinized, it can be seen that it involves risks and threats as well as success. The main purpose of every organization is to be successful by eliminating risks and threats. Therefore, there is a need to connect, or fuse, existing information with the processes that can be used to develop it. With the help of such a process, the decision-maker can be presented with a clear holistic understanding as early as possible in the decision-making process. Planning, execution and assessment are the key functions that connect information to knowledge. Moving from the current traditional reactive approach to a proactive knowledge development approach would reduce extensive duplication of work in the organization, and with this new approach knowledge can be used more effectively.
Keywords: knowledge, intelligence cycle, tacit knowledge, DIKW
Procedia PDF Downloads 519
15849 The Extended Skew Gaussian Process for Regression
Authors: M. T. Alodat
Abstract:
In this paper, we propose a generalization of the Gaussian process regression (GPR) model called the extended skew Gaussian process regression (ESGPR) model. The ESGPR model works better than the GPR model when the errors are skewed. We derive the predictive distribution of the ESGPR model at a new input. We also apply the ESGPR model to FOREX data and find that it fits the FOREX data better than the GPR model.
Keywords: extended skew normal distribution, Gaussian process for regression, predictive distribution, ESGPR model
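For context, the predictive distribution of the baseline GPR model that the ESGPR generalizes is the standard Gaussian one below (the skew-extended form itself is not given in the abstract). With training inputs $X$, targets $\mathbf{y}$, kernel matrix $K$, noise variance $\sigma_n^2$, and $\mathbf{k}_* = [k(x_*,x_1),\dots,k(x_*,x_n)]^{\top}$:

```latex
f_* \mid X, \mathbf{y}, x_* \sim \mathcal{N}\!\big(\bar{f}_*,\; \mathbb{V}[f_*]\big), \qquad
\bar{f}_* = \mathbf{k}_*^{\top}\left(K + \sigma_n^{2} I\right)^{-1}\mathbf{y}, \qquad
\mathbb{V}[f_*] = k(x_*, x_*) - \mathbf{k}_*^{\top}\left(K + \sigma_n^{2} I\right)^{-1}\mathbf{k}_*
```

Replacing the Gaussian error assumption with an extended skew normal one changes this predictive law, which is what the paper derives.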
Procedia PDF Downloads 553
15848 Simulation of the Reactive Rotational Molding Using Smoothed Particle Hydrodynamics
Authors: A. Hamidi, S. Khelladi, L. Illoul, A. Tcharkhtchi
Abstract:
Reactive rotational molding (RRM) is a process to manufacture hollow plastic parts from a reactive material. It has several advantages compared to conventional rotomolding of thermoplastic powders: the process cycle time is shorter; the raw material is less expensive, because polymerization occurs during processing; and high-performance polymers may be used, such as thermosets, thermoplastics or blends. However, several phenomena occur during this process which make its optimization quite complex. In this study, we have used a mixture of isocyanate and polyol as a reactive system. The chemical transformation of this system into polyurethane has been studied by thermal analysis and rheology tests. From these results on the curing process and the rheological measurements, the kinetics and rheokinetics of the polyurethane were identified. Smoothed Particle Hydrodynamics, a Lagrangian meshless method, was chosen to simulate the reactive fluid flow of the polyurethane in 2D and 3D configurations of the process, taking into account the chemical and chemio-rheological results obtained experimentally in this study.
Keywords: reactive rotational molding, simulation, smoothed particle hydrodynamics, surface tension, rheology, free surface flows, viscoelastic, interpolation
Procedia PDF Downloads 288
15847 Buoyancy Effects in Pressure Retarded Osmosis with Extremely High Draw Solution Concentration
Authors: Ivonne Tshuma, Ralf Cord-Ruwisch, Wendell Ela
Abstract:
The water crisis is a worldwide problem because of population growth and climate change, and desalination is one solution to the water scarcity that threatens the world. Reverse osmosis (RO) is the most widely used technique for desalination; unfortunately, this process has a high pressure requirement and therefore consumes a lot of energy, about 3-5.5 kWh of electrical energy per m³. The pressure requirements of RO can be alleviated by using pressure retarded osmosis (PRO) to drive the RO process. This paper proposes a process that utilizes the energy from PRO directly to drive an RO process. The paper mostly analyses PRO process parameters such as cross-flow velocity, density, and buoyancy, and how these affect PRO and hence, ultimately, the RO process. An experimental study of PRO with various feed solution concentrations and cross-flow velocities at fixed applied pressure, and with different orientations of the PRO cell, was performed. The study revealed that buoyancy effects were observed without cross-flow velocity, but not with cross-flow velocity.
Keywords: cross-flow velocity, pressure retarded osmosis, density, buoyancy
Procedia PDF Downloads 137
15846 Linkages between Postponement Strategies and Flexibility in Organizations
Authors: Polycarpe Feussi
Abstract:
Globalization and increasing technological and customer changes, amongst other drivers, result in higher levels of uncertainty and unpredictability for organizations. In order to cope with this uncertain and fast-changing economic and business environment, organizations need to innovate in order to achieve flexibility. In simple terms, they must develop strategies that give them the ability to provide horizontal information connections across the supply chain, so as to create and deliver products that meet customer needs by synchronizing customer demands with product creation. The generated information will create efficiency and effectiveness throughout the whole supply chain regarding production, storage, and distribution, as well as eliminating redundant activities and reducing response time. In an integrated supply chain, spanning activities include coordination with distributors and suppliers. This paper explains how flexibility can be achieved in an organization through postponement strategies. To achieve the above, a thorough literature review was conducted through searches of online scientific journal databases, articles, and textbooks on the subject of postponement and flexibility. The first part of the paper introduces the concept of postponement and its importance in supply chain management, the second part provides the methodology used in writing the paper, and the findings of the research are presented in the last part.
Keywords: postponement strategies, supply chain management, flexibility, logistics
Procedia PDF Downloads 193
15845 Stereo Camera Based Speed-Hump Detection Process for Real Time Driving Assistance System in the Daytime
Authors: Hyun-Koo Kim, Yong-Hun Kim, Soo-Young Suk, Ju H. Park, Ho-Youl Jung
Abstract:
This paper presents an effective speed hump detection process for the daytime; we focus only on round types of speed humps in dynamic daytime road environments. The proposed scheme consists mainly of two processes, stereo matching and speed hump detection, and this paper concentrates on the latter. The speed hump detection process consists of a noise reduction step, a data fusion step, and a speed hump detection step. The proposed system was tested on an Intel Core CPU at 2.80 GHz with 4 GB RAM in urban road environments. The frame rate of the test videos is 30 frames per second, and each frame of the grabbed image sequences is 1280 by 670 pixels. Using object-marked sequences acquired with an on-vehicle camera, we recorded speed hump and non-speed hump samples. In the tests, the proposed method has a computation time of 13 ms, so it can be applied in real-time systems, and it reaches a detection rate of 96.1%.
Keywords: data fusion, round-type speed hump, speed hump detection, surface filter
Procedia PDF Downloads 510
15844 Performance Improvement of Piston Engine in Aeronautics by Means of Additive Manufacturing Technologies
Authors: G. Andreutti, G. Saccone, D. Lucariello, C. Pirozzi, S. Franchitti, R. Borrelli, C. Toscano, P. Caso, G. Ferraro, C. Pascarella
Abstract:
The reduction of greenhouse gases and pollutant emissions is a worldwide environmental issue. The amount of CO₂ released by an aircraft is associated with the amount of fuel burned, so the improvement of engine thermo-mechanical efficiency and specific fuel consumption is a significant technological driver for aviation. Moreover, with the prospect that avgas will be phased out, an engine able to use more widely available and cheaper fuels is an evident advantage. An advanced aeronautical Diesel engine, because of its high efficiency and ability to use widely available and low-cost jet and diesel fuels, is a promising solution for a more fuel-efficient aircraft. On the other hand, a Diesel engine generally has a higher overall weight than a gasoline engine of the same power. For a fixed maximum take-off weight (MTOW) and operational payload, this extra weight reduces the aircraft fuel fraction, partially offsetting the associated benefits. Therefore, an effort in weight-saving manufacturing technologies is clearly desirable. In this work, in order to achieve the mentioned goals, innovative Electron Beam Melting (EBM) additive manufacturing (AM) technologies were applied to a two-stroke, common rail, GF56 Diesel engine, developed by the CMD Company for aeronautic applications. For this purpose, a consortium of academic, research and industrial partners, including the CMD Company, the Italian Aerospace Research Centre (CIRA), the University of Naples Federico II and the University of Salerno, carried out a technological project funded by the Italian Ministry of Education and Research (MIUR). The project aimed to optimize the baseline engine in order to improve its performance and increase its airworthiness features, and was focused on the definition, design, development, and application of enabling technologies for the performance improvement of the GF56.
Weight saving was pursued through the application of EBM AM technologies, in particular using the Arcam AB A2X machine available at CIRA. This 3D printer processes titanium alloy micro-powders, and it was employed to realize new connecting rods for the GF56 engine with an additive-oriented design approach. After a preliminary investigation of the EBM process parameters and a thermo-mechanical characterization of additively manufactured titanium alloy samples, innovative connecting rods were fabricated. These engine elements were structurally verified, topologically optimized, 3D printed and suitably post-processed. Finally, the overall performance improvement on a typical general aviation aircraft was estimated, substituting the conventional engine with the optimized GF56 propulsion system.
Keywords: aeronautic propulsion, additive manufacturing, performance improvement, weight saving, piston engine
Procedia PDF Downloads 142
15843 Optimizing the Passenger Throughput at an Airport Security Checkpoint
Authors: Kun Li, Yuzheng Liu, Xiuqi Fan
Abstract:
High security standards and high screening efficiency seem to contradict each other in the airport security check process; improving efficiency as far as possible while maintaining the same security standard is therefore highly meaningful. This paper utilizes knowledge of operations research and stochastic processes to establish mathematical models to explore this problem. We analyze the current airport security check process and use the M/G/1 and M/G/k models from queuing theory to describe it. We find that the least efficient part, the bottleneck of the queuing system, is the pre-check lane. To improve passenger throughput and reduce the variance of passengers' waiting time, we adjust our models, apply the Monte Carlo method, and put forward three modifications: adjust the ratio of pre-check lanes to regular lanes flexibly, determine the optimal number of security screening lines based on cost analysis, and adjust the distribution of arrival and service times based on Monte Carlo simulation results. We also analyze the impact of cultural differences as a sensitivity analysis. Finally, we give recommendations for the current airport security check process.
Keywords: queuing theory, security check, stochastic process, Monte Carlo simulation
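The abstract does not include the simulation itself; a minimal sketch of how the mean wait in one M/G/1 lane can be estimated by Monte Carlo (the Lindley recursion) and checked against the Pollaczek-Khinchine formula might look like this. The arrival rate and service distribution below are illustrative assumptions, not the paper's data:

```python
import random

def mg1_wait_mc(lam, service_sampler, n=200_000, seed=1):
    """Monte Carlo estimate of the mean waiting time in an M/G/1 queue
    via the Lindley recursion: W_{k+1} = max(0, W_k + S_k - A_{k+1})."""
    rng = random.Random(seed)
    w, total = 0.0, 0.0
    for _ in range(n):
        total += w
        s = service_sampler(rng)      # service time of customer k
        a = rng.expovariate(lam)      # Poisson arrivals -> exponential gaps
        w = max(0.0, w + s - a)
    return total / n

def mg1_wait_pk(lam, mean_s, second_moment_s):
    """Pollaczek-Khinchine mean wait: Wq = lam * E[S^2] / (2 * (1 - rho))."""
    rho = lam * mean_s
    return lam * second_moment_s / (2.0 * (1.0 - rho))
```

With a deterministic 1-minute service time and an arrival rate of 0.5 passengers per minute, both estimates agree on a mean wait of about 0.5 minutes; swapping in other service distributions lets one explore the third proposed modification.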
Procedia PDF Downloads 200
15842 Optimizing Data Integration and Management Strategies for Upstream Oil and Gas Operations
Authors: Deepak Singh, Rail Kuliev
Abstract:
The abstract highlights the critical importance of optimizing data integration and management strategies in the upstream oil and gas industry. With its complex and dynamic nature generating vast volumes of data, efficient data integration and management are essential for informed decision-making, cost reduction, and maximizing operational performance. Challenges such as data silos, heterogeneity, real-time data management, and data quality issues are addressed, prompting the proposal of several strategies. These strategies include implementing a centralized data repository, adopting industry-wide data standards, employing master data management (MDM), utilizing real-time data integration technologies, and ensuring data quality assurance. Training and developing the workforce, "reskilling and upskilling" the employees, and establishing robust data management training programs play an essential and integral part in this strategy. The article also emphasizes the significance of data governance and best practices, as well as the role of technological advancements such as big data analytics, cloud computing, the Internet of Things (IoT), and artificial intelligence (AI) and machine learning (ML). To illustrate the practicality of these strategies, real-world case studies are presented, showcasing successful implementations that improve operational efficiency and decision-making. By embracing the proposed optimization strategies, leveraging technological advancements, and adhering to best practices, upstream oil and gas companies can harness the full potential of data-driven decision-making, ultimately achieving increased profitability and a competitive edge in the ever-evolving industry.
Keywords: master data management, IoT, AI&ML, cloud computing, data optimization
Procedia PDF Downloads 70
15841 Performance Evaluation of Routing Protocol in Cognitive Radio with Multi Technological Environment
Authors: M. Yosra, A. Mohamed, T. Sami
Abstract:
Over the past few years, mobile communication technologies have seen significant evolution. This fact has promoted the implementation of many systems in a multi-technological setting, and from one system to another, the Quality of Service (QoS) provided to mobile consumers gets better. The growing number of normalized standards extends the services available to each consumer; moreover, most of the available radio frequencies have already been allocated, for example to 3G, Wi-Fi, WiMAX, and LTE. A study by the Federal Communications Commission (FCC) found that certain frequency bands are only partially occupied in particular locations and at particular times. The idea of Cognitive Radio (CR) is therefore to share the spectrum between a primary user (PU) and a secondary user (SU). The main objective of this spectrum management is to achieve a maximum rate of exploitation of the radio spectrum. In general, CR can greatly improve the quality of service (QoS) and the reliability of the link. The problem addressed here is to propose a technique for improving the reliability of the wireless link by using CR together with routing protocols, since users have reported that links were unreliable and incompatible with the required QoS. In our case, we choose the QoS parameter 'bandwidth' to perform a supervised classification. In this paper, we propose a comparative study between some routing protocols, taking into account the variation of different technologies on the existing spectral bandwidth, namely 3G, Wi-Fi, WiMAX, and LTE. From the simulation results, we observe that LTE has significantly higher available bandwidth compared with the other technologies. The performance of the OLSR protocol is better than that of the other routing protocols studied (DSR, AODV and DSDV) in LTE technology, owing to proper packet reception, fewer packet drops and higher throughput.
Numerous simulations of routing protocols have been made using simulators such as NS3.
Keywords: cognitive radio, multi technology, network simulator (NS3), routing protocol
Procedia PDF Downloads 63
15840 Telemedicine in Physician Assistant Education: A Partnership with Community Agency
Authors: Martina I. Reinhold, Theresa Bacon-Baguley
Abstract:
A core challenge of physician assistant education is preparing professionals for lifelong learning. While this conventionally has encompassed scientific advances, students must also embrace new care delivery models and technologies. Telemedicine, the provision of care via two-way audio and video, is an example of a technological advance reforming health care. During a three-semester sequence of Hospital Community Experiences, physician assistant students were assigned experiences with Answer Health on Demand, a telemedicine collaborative. Preceding the experiences, the agency lectured on the application of telemedicine. Students were then introduced to the technology and partnered with a provider. Prior to observing the patient-provider interaction, patient consent was obtained. Afterwards, students completed a reflection paper on lessons learned and the potential impact of telemedicine on their careers. Thematic analysis was completed on the students' reflection papers (n=13). Preceding the lecture and experience, over 75% of students (10/13) were unaware of telemedicine. Several stated they were 'skeptical' about the effectiveness of 'impersonal' health care appointments. After the experience, all students remarked that telemedicine will play a large role in the future of healthcare and will provide benefits by improving access in rural areas, decreasing wait time, and saving cost. More importantly, 30% of students (4/13) commented that telemedicine is a technology they can see themselves using in their future practice. Initial results indicate that collaborative interaction between students and telemedicine providers enhanced student learning and exposed students to technological advances in the delivery of care. Further, results indicate that students perceived telemedicine more favorably as a viable delivery method after the experience.
Keywords: collaboration, physician assistant education, teaching innovative health care delivery method, telemedicine
Procedia PDF Downloads 197
15839 A Review on Building Information Modelling in Nigeria and Its Potentials
Authors: Mansur Hamma-Adama, Tahar Kouider
Abstract:
The construction industry has been evolving since the development of Building Information Modelling (BIM). This technological process is unstoppable; it is on the market with remarkable case studies addressing the industry's long history of fragmentation. The industry has been changing over time: the United States has recorded the most significant development in construction digitalization, and Australia, the United Kingdom and some other developed nations are also amongst the promoters of the BIM process and its development. Recently, developing countries like China and Malaysia have been keying into the industry's digital shift, while very little movement is seen in South Africa, whose development is considered the highest, and perhaps the leader in the digital transition, amongst the African countries. To the authors' best knowledge, the Nigerian construction industry has never engaged in BIM discussions, and the topic has received no attention at the national level; consequently, Nigeria has no noteworthy BIM publications. Decision makers and key stakeholders need to be informed about the current trend of the industry's development (BIM in particular) and the opportunities of adopting this digitalization trend in relation to the identified challenges. The BIM concept can be traced mostly in architectural rather than engineering practices in Nigeria, and only a superficial BIM practice is found, at the organisational level and operating model-based working - 'BIM stage 1'. Research on adopting this innovation has received very little attention. This piece of work is based on a literature review, aimed at exploring BIM in Nigeria and its prospects. The exploration reveals limited literature availability and little extensive research on the development of BIM in the country. Numerous challenges were noticed, including building collapse, inefficiencies, cost overrun and late project delivery. BIM has the potential to overcome the above challenges and more. A low level of BIM adoption with a reasonable level of awareness is noticed.
However, the lack of policies and guidelines, as well as a serious lack of experts in the field, are amongst the major barriers to BIM adoption. The industry needs to embrace BIM to be able to compete with its global counterparts.
Keywords: adoption, BIM, CAD, construction industry, Nigeria, opportunities
Procedia PDF Downloads 154
15838 Automated CNC Part Programming and Process Planning for Turned Components
Authors: Radhey Sham Rajoria
Abstract:
Pressure to increase competitiveness in the manufacturing sector and to survive in the market has led to the development of machining centres, which enhance productivity, improve quality, shorten the lead time, and reduce the manufacturing cost. With this innovation, production lines have been replaced by machining centres that can perform various machining processes on the same part with multiple tools and an automatic tool changer (ATC). Process plans can also be easily generated for complex components, but some means are required to utilize the machining centre at its best. The present work concentrates on automated part program generation, and in turn automated process plan generation, for turned components on a Denford 'MIRAC' 8-station ATC lathe machining centre. A package written in C++ on the DOS platform is developed which generates the complete CNC part program, process plan and process sequence for turned components. The input to this system is a blueprint in graphical format with machining parameters and variables, and the output is the CNC part program, which is stored in a .mir file ready for execution on the machining centre.
Keywords: CNC, MIRAC, ATC, process planning
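The abstract does not show the generated output; as a rough illustration of what automated part-program generation involves, the sketch below emits a minimal rough-turning G-code sequence from a few blueprint parameters. The word addresses and single-pass strategy are simplified assumptions, not the package's actual output:

```python
def rough_turning_gcode(stock_dia, target_dia, cut_len, doc, feed, rpm):
    """Emit a minimal rough-turning program: one pass per depth-of-cut
    increment, reducing the bar from stock_dia to target_dia (mm).
    `doc` is the radial depth of cut, `feed` is in mm/rev. The word
    addresses below are illustrative, not a specific controller's dialect."""
    lines = ["G21 G95",              # metric units, feed per revolution
             f"G97 S{rpm} M03"]      # constant spindle speed, spindle on
    dia = stock_dia
    while dia > target_dia:
        dia = max(target_dia, dia - 2 * doc)   # radial DOC removes 2*doc on diameter
        lines.append(f"G00 X{dia:.3f} Z2.000")        # rapid to pass diameter
        lines.append(f"G01 Z-{cut_len:.3f} F{feed}")  # turning pass
        lines.append(f"G00 X{dia + 1.0:.3f} Z2.000")  # retract and return
    lines.append("M05")              # spindle stop
    lines.append("M30")              # program end
    return lines
```

For a 50 mm bar turned down to 46 mm over 30 mm with a 1 mm depth of cut, this produces two roughing passes; the real package additionally derives the geometry from the graphical blueprint and sequences the tooling.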
Procedia PDF Downloads 269
15837 The Use of Voice in Online Public Access Catalog as Faster Searching Device
Authors: Maisyatus Suadaa Irfana, Nove Eka Variant Anna, Dyah Puspitasari Sri Rahayu
Abstract:
Technological developments provide convenience to everyone. Nowadays, human communication with computers is mostly done via text. With the development of technology, human-computer communication can also be conducted by voice, much like communication between human beings. This provides a convenient facility for many people, especially those with special needs. In this work, voice search technology is applied to searching the book collection in an OPAC (Online Public Access Catalog), so that library visitors can find the books they need faster and more easily. Integration with Google is needed to convert the voice into text. To optimize search time and results, the server downloads all the book data available in the server database and converts it into JSON format. In addition, several algorithms are combined: decomposition (parsing) of the JSON format into an array, index building, and analysis of the results. The aim is to make searching much faster than the usual OPAC search, because the data does not have to be fetched from the database for every search request. A data update menu is provided so that users can perform their own data updates and obtain the latest information.
Keywords: OPAC, voice, searching, faster
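A minimal sketch of the parse-and-index step described above (the field names and toy records below are assumptions, not the actual OPAC schema): parse a JSON array of book records, build an inverted index, and answer a text query of the kind a speech-to-text service would return:

```python
import json
from collections import defaultdict

# Toy JSON payload standing in for the data downloaded from the server;
# the "title"/"author" fields are assumed, not the real OPAC schema.
payload = json.dumps([
    {"id": 1, "title": "Introduction to Information Retrieval", "author": "Manning"},
    {"id": 2, "title": "Digital Libraries", "author": "Arms"},
    {"id": 3, "title": "Library Cataloging Basics", "author": "Chan"},
])

def build_index(json_text):
    """Parse the JSON array and build an inverted index: token -> set of ids."""
    index = defaultdict(set)
    for record in json.loads(json_text):
        for field in ("title", "author"):
            for token in record[field].lower().split():
                index[token].add(record["id"])
    return index

def search(index, query):
    """AND-search: return ids matching every token of the (speech-to-text) query."""
    sets = [index.get(tok, set()) for tok in query.lower().split()]
    return sorted(set.intersection(*sets)) if sets else []

index = build_index(payload)
print(search(index, "library"))   # query text as returned by speech recognition
```

Because the index lives in memory, each query is answered without a round trip to the database, which is the speed-up the abstract describes.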
Procedia PDF Downloads 344
15836 Application of Lean Six Sigma Tools to Minimize Time and Cost in Furniture Packaging
Authors: Suleiman Obeidat, Nabeel Mandahawi
Abstract:
In this work, the packaging process for a household move is improved. The customers of such a move need their household goods transported from the current house to the new one with minimum damage, in an organized manner, on time and at minimum cost. Our goal was to improve the process by 10% to 20% in time efficiency, achieve a 90% reduction in damaged parts, and attain an acceptable improvement in the cost of the total move. The expected ROI was 833%. Many improvement techniques were applied to the way the boxes are prepared, their preparation cost, packing the goods, labeling them, and moving them to a staging area for loading. The DMAIC methodology is used in this work: a SIPOC diagram, a value stream map of the "As Is" process, root cause analysis, maps of the "Future State" and "Ideal State", and an improvement plan. A value of ROI = 624% was obtained, which is lower than the expected value of 833%. The work explains the improvement techniques used and the deficiencies in the old process.
Keywords: packaging, lean tools, six sigma, DMAIC methodology, SIPOC
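For reference, ROI figures such as the 624% and 833% quoted above follow the standard definition of return on investment (the cost and benefit values below are illustrative placeholders only, not data from the study):

```python
def roi_percent(net_benefit, cost):
    """Return on investment as a percentage: 100 * net benefit / cost."""
    return 100.0 * net_benefit / cost

# Placeholder values for illustration: an ROI of 624% means the project
# returned 6.24 units of net benefit per unit invested.
print(roi_percent(net_benefit=6240.0, cost=1000.0))
```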
Procedia PDF Downloads 428
15835 Accelerated Aging of Photopolymeric Material Used in Flexography
Authors: S. Mahovic Poljacek, T. Tomasegovic, T. Cigula, D. Donevski, R. Szentgyörgyvölgyi, S. Jakovljevic
Abstract:
In this paper, the degradation of a photopolymeric material (PhPM), used as a printing plate in the flexographic reproduction technique, caused by accelerated aging is observed. Since the basic process for producing printing plates from the PhPM is radical cross-linking caused by exposure to UV wavelengths, the assumption was that improper storage or irregular handling of the PhPM plate can change the surface and structural characteristics of the plate. Results have shown that the aging process causes degradation of the structure and changes in the surface of the PhPM printing plate.
Keywords: aging process, artificial treatment, flexography, photopolymeric material (PhPM)
Procedia PDF Downloads 349
15834 A Detailed Experimental Study and Evaluation of Springback under Stretch Bending Process
Authors: A. Soualem
Abstract:
The design of multi-stage deep drawing processes requires the evaluation of many process parameters, such as the intermediate die geometry, the blank shape, the sheet thickness, the blank holder force, friction and lubrication. These process parameters have to be determined for the optimum forming conditions before the process design. In general, sheet metal forming may involve stretching, drawing, or various combinations of these basic modes of deformation. It is important to determine the influence of the process variables in the design of sheet metal working processes. In particular, the punch and die corner radii in deep drawing affect formability. At the same time, the prediction of sheet metal springback after deep drawing is an important issue for the control of manufacturing processes. Nowadays, the importance of this problem is increasing because of the use of high-strength steel sheet and aluminum alloys. The aim of this paper is to give a better understanding of springback and its effect in various sheet metal forming processes, such as expansion and restrained deep drawing in the cup drawing process, by varying the die radius and the lubricant for two commercially available materials, galvanized steel and aluminum sheet. To achieve these goals, experiments were carried out and compared with other results. The originality of our approach consists of tests performed by adapting a U-type stretch-bending device on a tensile testing machine, with which we studied and quantified the variation of the springback.
Keywords: springback, deep drawing, expansion, restricted deep drawing
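Springback in bending tests of this kind is commonly quantified by a springback factor comparing the bend angle before and after unloading (this is a standard textbook measure, offered here as an assumption about how the variation might be expressed, not necessarily the author's exact metric):

```python
def springback_factor(bend_angle_loaded_deg, bend_angle_unloaded_deg):
    """Springback factor K = bend angle after unloading / bend angle under load.

    With the bend angle measured as the amount of bending, the part relaxes
    on unloading, so K < 1; K = 1 would mean no springback at all.
    (A textbook convention, not necessarily the one used in this study.)
    """
    return bend_angle_unloaded_deg / bend_angle_loaded_deg

# Illustrative numbers only: a sheet bent to 90 degrees that relaxes to
# 85 degrees after the punch is removed.
print(springback_factor(90.0, 85.0))
```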
Procedia PDF Downloads 454
15833 Islamic Banking Recovery Process and Its Parameters: A Practitioner’s Viewpoints in the Light of Humanising Financial Services
Authors: Muhammad Izzam Bin Mohd Khazar, Nur Adibah Binti Zainudin
Abstract:
Islamic banks, like other financial institutions, are highly required to maintain a prudent approach to ensure that any financing given is able to generate income for their respective shareholders. As customer defaults on payments can occur in financing, a prudent approach to the recovery process is a must to keep financing losses within acceptable limits. The objective of this research is to provide best practices for recovery that are anticipated to benefit both the bank and its customers. This study addresses the issues arising in the current recovery process and then proposes humanising recovery solutions in the light of the Maqasid Shariah. The study identified the main issues in the Islamic recovery process, which can be categorized into knowledge crisis, process issues, specific treatment cases and system issues. The knowledge crisis relates to the parties directly involved, including judges, solicitors and salespersons, while the recovery process issues include the issuance of reminders, foreclosure and repossession of assets. Furthermore, special treatment for particular cases should also be observed, since different contracts in Islamic banking products require different treatment. Finally, issues in the systems used in the recovery process remain unresolved, since the existing technology is still young in this area in embracing Islamic finance requirements and its nature of calculation. In order to humanise financial services in the Islamic banking recovery process, we highlight four main recommendations to be implemented by Islamic financial institutions: 1) early deterrence by improving awareness, 2) improvement of the internal process, 3) a reward mechanism, and 4) creative penalties to raise awareness among all stakeholders.
Keywords: humanizing financial services, Islamic finance, Maqasid Shariah, recovery process
Procedia PDF Downloads 205
15832 Effect of Distance Education Students Motivation with the Turkish Language and Literature Course
Authors: Meva Apaydin, Fatih Apaydin
Abstract:
The role of education in the development of society is great. Teaching and training began at the start of history, and different methods and techniques have been applied as times changed, with the aim of raising the level of learning. In addition to traditional teaching methods, technology has been used in recent years. With the introduction of the internet into education, some problems which could not be solved before have been addressed, and it is inferred that it is possible to educate learners using contemporary methods as well as traditional ones. Thanks to technological developments, distance education is a system which paves the way for students to be educated individually, wherever and whenever they like, without the need for a physical school environment. Distance education has become prevalent because of physical inadequacies in educational institutions; as a result, disadvantageous circumstances such as social complexities, individual differences and especially geographical distance disappear. Moreover, the high speed of feedback between teachers and learners, improvement in student motivation because there is no limitation of time, low cost, and objective measurement and evaluation come to the foreground. Although distance education has teaching benefits, it also has limitations. Some of the most important problems are that issues which are highly likely to arise may not be solved in time, that there is a lack of eye contact between teacher and learner, so trustworthy feedback cannot be obtained, and that problems stem from inadequate technological infrastructure. Courses are conducted via distance education in many departments of universities in our country.
In recent years, delivering courses such as Turkish Language, English, and History in the first years of academic departments in universities has become an increasingly prevalent practice. In this study, the delivery of the Turkish Language course via the internet-based distance education system is examined by analyzing the advantages and disadvantages of that system.
Keywords: distance education, Turkish language, motivation, benefits
Procedia PDF Downloads 436
15831 Indium-Gallium-Zinc Oxide Photosynaptic Device with Alkylated Graphene Oxide for Optoelectronic Spike Processing
Authors: Seyong Oh, Jin-Hong Park
Abstract:
Recently, neuromorphic computing based on brain-inspired artificial neural networks (ANNs) has attracted a huge amount of research interest due to its technological ability to facilitate massively parallel, low-energy-consuming, and event-driven computing. In particular, research on artificial synapses that imitate the biological synapses responsible for human information processing and memory is in the spotlight. Here, we demonstrate a photosynaptic device wherein the synaptic weight is governed by a mixed spike consisting of a voltage spike and a light spike. Compared to the device operated only by the voltage spike, ΔG in the proposed photosynaptic device significantly increased from -2.32 nS to 5.95 nS with no degradation of nonlinearity (NL) (potentiation/depression values changed from 4.24/8 to 5/8). Furthermore, the Modified National Institute of Standards and Technology (MNIST) digit pattern recognition rates improved from 36% and 49% to 50% and 62% in ANNs consisting of the synaptic devices with 20 and 100 weight states, respectively. We expect that photosynaptic device technology based on optoelectronic spike processing will play an important role in implementing future neuromorphic computing systems.
Keywords: optoelectronic synapse, IGZO (indium-gallium-zinc oxide) photosynaptic device, optoelectronic spiking process, neuromorphic computing
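Potentiation/depression nonlinearity figures like those quoted above are commonly extracted by fitting the pulse-by-pulse conductance update to a phenomenological saturating-exponential model (the form below is one widely used model, assumed here for illustration; it is not necessarily the authors' exact fitting procedure, and all numbers are placeholders):

```python
import math

# A widely used phenomenological model of synaptic conductance potentiation
# (assumed for illustration; the paper's exact fitting form may differ).
def potentiation_curve(g_min, g_max, n_pulses, a):
    """Conductance after each of n_pulses identical potentiation pulses.

    'a' controls nonlinearity: small a -> updates saturate early (nonlinear),
    large a -> nearly linear, evenly spaced conductance states.
    """
    b = (g_max - g_min) / (1.0 - math.exp(-n_pulses / a))
    return [g_min + b * (1.0 - math.exp(-p / a)) for p in range(1, n_pulses + 1)]

# Placeholder device: 32 weight states between 0 and 10 nS, nonlinearity a = 8.
curve = potentiation_curve(g_min=0.0, g_max=10.0, n_pulses=32, a=8.0)
# Early pulses give large conductance steps; late pulses give small ones,
# which is exactly the nonlinearity that degrades ANN recognition accuracy.
print(round(curve[0], 3), round(curve[-1], 3))
```

A depression curve is modeled the same way with the sign reversed; the mismatch between the two branches is what the 5/8-style NL values summarize.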
Procedia PDF Downloads 173
15830 The Rite of Jihadification in ISIS Modified Video Games: Mass Deception and Dialectic of Religious Regression in Technological Progression
Authors: Venus Torabi
Abstract:
ISIS, the terrorist organization, modified two video games, ARMA III and Grand Theft Auto 5 (2013), as means of online recruitment and ideological propaganda. The urge to study the mechanism at work, whether it has been successful or not, drives (digital) humanities experts to explore how codes of terror, Islamic ideology and recruitment strategies are incorporated into the ludic mechanics of video games. Another aspect of the significance lies in the fact that this is a latent problem that has not been fully addressed in an interdisciplinary framework prior to this study, to the best of the researcher's knowledge. Therefore, due to the complexity of the subject, the present paper engages with game studies and with philosophical and religious poles to form its methodology. As a contextualized epistemology of this exploitation of video games, the core argument builds on the notion of the "Culture Industry" proposed by Theodor W. Adorno and Max Horkheimer in Dialectic of Enlightenment (2002). This article posits that the ideological underpinnings of ISIS's cause, corroborated by the action-bound mechanics of the video games, are in line with adherence to Islamic eschatology as a furnishing ground and an excuse for exercising terrorism. It is an account of ISIS's modification of video games, tools of technological progression, to practice online radicalization. Dialectically, this practice is packed in rhetoric for recognizing a religious myth (the advent of a savior), a hallmark of regression. The study puts forth that ISIS's wreaking havoc on the world, both in reality and within action video games, negotiates a process of self-assertion in the players of such games (by assuming oneself a member of the terrorists) that leads to self-annihilation. It tries to unfold how ludic mod games are misused as tools of mass deception towards ethnic cleansing in reality, in line with the distorted eschatological myth.
To conclude, this study posits video games as a new avenue of mass deception in the framework of the Culture Industry. Yet this emerges as a two-edged sword of mass deception in ISIS's modification of video games. It shows that ISIS is not only trying to hijack minds through online/ludic recruitment; it also potentially deceives Muslim communities, or those prone to radicalization, into believing that its terrorist practices are preparing the world for the advent of a religious savior based on Islamic eschatology. This is to claim that the harsh actions of the video games potentially breed minds with the seeds of terrorist propaganda and numb them to violence. The real world becomes an extension of that harsh virtual environment in a ludic/actual continuum, an extension that contributes to the mass deception mechanism of the terrorists in a clandestine trend.
Keywords: culture industry, dialectic, ISIS, Islamic eschatology, mass deception, video games
Procedia PDF Downloads 137
15829 A Comparison between Fuzzy Analytic Hierarchy Process and Fuzzy Analytic Network Process for Rationality Evaluation of Land Use Planning Locations in Vietnam
Authors: X. L. Nguyen, T. Y. Chou, F. Y. Min, F. C. Lin, T. V. Hoang, Y. M. Huang
Abstract:
In Vietnam, land use planning is utilized as an efficient tool for local governments to adjust land use. However, planned locations face disapproval from people who live near these sites because of environmental problems. The selection of these locations is normally based on the subjective opinion of decision-makers and is not supported by any scientific method. Many researchers have applied Multi-Criteria Analysis (MCA) methods, of which the Analytic Hierarchy Process (AHP) is the most popular technique, in combination with fuzzy set theory, to the rationality assessment of land use planning locations. In this research, fuzzy set theory and the Analytic Network Process (ANP) multi-criteria technique were used for the assessment process. The Fuzzy Analytic Hierarchy Process was also applied, and the output results from the two methods were compared to extract the differences. The 20 planned landfills in Hung Ha district, Thai Binh province, Vietnam were selected as a case study. The comparison indicates that the weights computed by the AHP and ANP methods differ, and the assessment outputs produced by the two methods also show slight differences. After the evaluation of the existing planned sites, some potential locations were suggested to the local government for possible land use planning adjustments.
Keywords: Analytic Hierarchy Process, Analytic Network Process, fuzzy set theory, land use planning
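As a minimal sketch of the AHP weighting step that both compared methods build on (the pairwise judgments below are invented for illustration, and the row geometric-mean approximation is used instead of the exact principal-eigenvector method):

```python
import math

def ahp_weights(matrix):
    """Approximate AHP priority weights via normalized row geometric means."""
    n = len(matrix)
    gmeans = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Invented pairwise judgments on Saaty's 1-9 scale, not data from the study:
# criterion A vs B = 3, A vs C = 5, B vs C = 2 (reciprocals below the diagonal).
pairwise = [
    [1.0,     3.0,     5.0],
    [1 / 3.0, 1.0,     2.0],
    [1 / 5.0, 1 / 2.0, 1.0],
]
weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])
```

The ANP generalizes this step by allowing dependence and feedback between criteria (a supermatrix is raised to powers until it converges), which is why the two methods produce different weights for the same judgments.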
Procedia PDF Downloads 421
15828 Devulcanization of Waste Rubber Using Thermomechanical Method Combined with Supercritical CO₂
Authors: L. Asaro, M. Gratton, S. Seghar, N. Poirot, N. Ait Hocine
Abstract:
Rubber waste disposal is an environmental problem. In particular, much research is centered on the management of discarded tires. In spite of the various ways of handling used tires, the most common is to deposit them in a landfill, creating stocks of tires. These stocks can cause fire hazards and provide a habitat for rodents, mosquitoes and other pests, causing health and environmental problems. Because of the three-dimensional structure of rubbers and their specific composition, which includes several additives, their recycling is a current technological challenge. The technique which can break down the crosslink bonds in rubber is called devulcanization. Strictly, devulcanization can be defined as a process where the poly-, di-, and mono-sulfidic bonds formed during vulcanization are totally or partially broken. In recent years, supercritical carbon dioxide (scCO₂) has been proposed as a green devulcanization atmosphere, because it is chemically inactive, nontoxic, nonflammable and inexpensive. Its critical point can be easily reached (31.1 °C and 7.38 MPa), and residual scCO₂ in the devulcanized rubber can be easily and rapidly removed by releasing the pressure. In this study, thermomechanical devulcanization of ground tire rubber (GTR) was performed in a twin screw extruder under diverse operating conditions. Supercritical CO₂ was added in different quantities to promote the devulcanization. Temperature, screw speed and quantity of CO₂ were the parameters varied during the process. The devulcanized rubber was characterized by its devulcanization percentage and by its crosslink density, determined by swelling in toluene. Infrared spectroscopy (FTIR) and gel permeation chromatography (GPC) were also performed, and the results were related to the Mooney viscosity. The results showed that the crosslink density decreases as the extruder temperature and speed increase, and, as expected, the soluble fraction increases with both parameters.
The Mooney viscosity of the devulcanized rubber decreases as the extruder temperature increases. The values reached were in good correlation (R = 0.96) with the soluble fraction. In order to analyze whether the devulcanization was caused by main-chain or crosslink scission, Horikx's theory was used. Results showed that all tests fall on the curve that corresponds to sulfur bond scission, which indicates that devulcanization happened successfully without degradation of the rubber. In the spectra obtained by FTIR, it was observed that none of the characteristic peaks of the GTR were modified by the different devulcanization conditions. This was expected because, due to the low sulfur content (~1.4 phr) and the multiphasic composition of the GTR, it is very difficult to evaluate devulcanization by this technique. The lowest crosslink density was reached with 1 cm³/min of CO₂, and the power consumed in that process was also near the minimum. These results encourage us to carry out further analyses to better understand the effect of the different conditions on the devulcanization process. The analysis is currently being extended to monophasic rubbers such as ethylene propylene diene monomer rubber (EPDM) and natural rubber (NR).
Keywords: devulcanization, recycling, rubber, waste
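Crosslink density from equilibrium swelling in toluene is conventionally obtained with the Flory-Rehner equation; the sketch below uses textbook values for the polymer-solvent interaction parameter and the solvent molar volume as assumptions, since the constants actually used in the study are not stated:

```python
import math

def flory_rehner_crosslink_density(v_r, chi=0.39, v_s=106.3):
    """Crosslink density (mol/cm^3) from the Flory-Rehner equation.

    v_r : volume fraction of rubber in the swollen gel (from swelling in toluene)
    chi : polymer-solvent interaction parameter (0.39 is a textbook value
          for natural rubber/toluene, assumed here)
    v_s : molar volume of the solvent in cm^3/mol (106.3 for toluene)
    """
    numerator = -(math.log(1.0 - v_r) + v_r + chi * v_r ** 2)
    denominator = v_s * (v_r ** (1.0 / 3.0) - v_r / 2.0)
    return numerator / denominator

# A devulcanized sample swells more (lower v_r) and therefore shows a lower
# crosslink density than a more tightly crosslinked one (illustrative values):
print(flory_rehner_crosslink_density(0.30) > flory_rehner_crosslink_density(0.20))
```

Horikx's analysis then plots the relative decrease in this crosslink density against the soluble fraction; points following the crosslink-scission curve, as reported above, indicate selective sulfur bond breakage rather than main-chain degradation.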
Procedia PDF Downloads 385
15827 Computational Linguistic Implications of Gender Bias: Machines Reflect Misogyny in Society
Authors: Irene Yi
Abstract:
Machine learning, natural language processing, and neural network models of language are becoming more and more prevalent in the fields of technology and linguistics today. Training data for machines are, at best, large corpora of human literature and, at worst, a reflection of the ugliness in society. Computational linguistics is a growing field dealing with such issues of data collection for technological development. Machines have been trained on millions of human books, only for researchers to find that, across human history, derogatory and sexist adjectives are used significantly more frequently when describing females in history and literature than when describing males. This is extremely problematic, both as training data and as an outcome of natural language processing. As machines take on more responsibilities, it is crucial to ensure that they do not carry with them historical sexist and misogynistic notions. This paper gathers data and algorithms from neural network models of language dealing with syntax, semantics, sociolinguistics, and text classification. Computational analysis of such linguistic data is used to find patterns of misogyny. The results are significant in showing the existing intentional and unintentional misogynistic notions used to train machines, as well as in developing better technologies that take the semantics and syntax of text into account to be more mindful and to reflect gender equality. Further, this paper deals with the idea of non-binary gender pronouns and how machines can process these pronouns correctly given their semantic and syntactic context. It also delves into the implications of gendered grammar and its effect, cross-linguistically, on natural language processing. Languages such as French or Spanish not only have rigid gendered grammar rules but also historically patriarchal societies.
The progression of a society goes hand in hand not only with its language, but also with how machines process that natural language. These ideas are all vital to the development of natural language models in technology, and they must be taken into account immediately.
Keywords: computational analysis, gendered grammar, misogynistic language, neural networks
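A common way to quantify the kind of adjective bias described above is to compare a word's cosine similarity to gendered anchor words in an embedding space (the tiny hand-made vectors below are assumptions purely for illustration; real analyses use embeddings trained on large corpora):

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def gender_bias(word_vec, he_vec, she_vec):
    """Positive -> word sits closer to 'she'; negative -> closer to 'he'."""
    return cosine(word_vec, she_vec) - cosine(word_vec, he_vec)

# Tiny hand-made vectors for illustration only (not trained embeddings):
vectors = {
    "he":         [1.0, 0.0, 0.1],
    "she":        [0.0, 1.0, 0.1],
    "hysterical": [0.1, 0.9, 0.2],   # toy stand-in for a female-skewed adjective
    "brilliant":  [0.9, 0.1, 0.2],   # toy stand-in for a male-skewed adjective
}
print(gender_bias(vectors["hysterical"], vectors["he"], vectors["she"]) > 0)
print(gender_bias(vectors["brilliant"], vectors["he"], vectors["she"]) < 0)
```

Aggregating such scores over a lexicon of derogatory adjectives is one way to make the frequency asymmetries described in the abstract measurable and comparable across corpora.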
Procedia PDF Downloads 119