Search results for: deep drawing process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17363

16493 A Resilience Process Model of Natural Gas Pipeline Systems

Authors: Zhaoming Yang, Qi Xiang, Qian He, Michael Havbro Faber, Enrico Zio, Huai Su, Jinjun Zhang

Abstract:

Resilience is one of the key factors in system safety assessment and optimization, and resilience studies of natural gas pipeline systems (NGPS), especially in terms of process descriptions, are still being explored. Based on the three main stages, namely the function-loss process, the recovery process, and the waiting process, the paper builds functions and models that reflect the practical characteristics of NGPS and mainly analyzes deterministic interruptions. The resilience of NGPS also takes into account a threshold on the system function or on users' satisfaction. The outcomes, which quantify the resilience of NGPS from different evaluation perspectives, can be combined with max-flow and shortest-path methods to help optimize supplementary gas supplies, gas routes, and pipeline maintenance strategies, enable quick analysis of disturbance effects, and improve the accuracy of NGPS resilience evaluation.
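A minimal numerical sketch of the kind of resilience index described above, assuming an illustrative piecewise performance curve with function-loss, waiting, and recovery stages; the stage durations, threshold, and curve shape are assumptions for demonstration, not the paper's actual NGPS model:

```python
import numpy as np

# Illustrative (assumed) piecewise performance curve for a deterministic disturbance:
# normal operation -> function-loss stage -> waiting stage at reduced level -> recovery stage.
t = np.linspace(0, 100, 1001)          # hours
P_nominal = 1.0                         # normalised pipeline throughput
threshold = 0.6                         # assumed user-satisfaction threshold

def performance(ti):
    if ti < 20:            # normal operation
        return P_nominal
    elif ti < 30:          # function-loss stage
        return P_nominal - 0.5 * (ti - 20) / 10
    elif ti < 60:          # waiting stage
        return 0.5
    elif ti < 90:          # recovery stage
        return 0.5 + 0.5 * (ti - 60) / 30
    return P_nominal

P = np.array([performance(ti) for ti in t])

# Resilience as the ratio of delivered to nominal function over the disturbance horizon.
resilience = np.trapz(P, t) / np.trapz(np.full_like(t, P_nominal), t)

# Fraction of time the system stays above the satisfaction threshold.
time_above_threshold = np.mean(P >= threshold)

print(f"resilience index: {resilience:.3f}")
print(f"time above threshold: {time_above_threshold:.1%}")
```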

Keywords: natural gas pipeline system, resilience, process modeling, deterministic disturbance

Procedia PDF Downloads 120
16492 State of Art in Software Requirement Negotiation Process Models

Authors: Shamsu Abdullahi, Nazir Yusuf, Hazrina Sofian, Abubakar Zakari, Amina Nura, Salisu Suleiman

Abstract:

Requirements negotiation process models help in resolving the conflicting requirements of heterogeneous stakeholders in the software development industry, with the goal of achieving a shared vision of the software projects to be developed. Negotiating stakeholder agreements is a serious and difficult task in the software development process, and many requirements negotiation process models that effectively negotiate stakeholder agreements have been proposed by the research community. Other issues in the requirements negotiation research domain include stakeholder communication, decision-making, lack of negotiation interoperability, and managing requirement changes and analysis. This study highlights the current state of the art in existing software requirements negotiation process models and describes their issues and limitations.

Keywords: requirements, negotiation, stakeholders, agreements

Procedia PDF Downloads 192
16491 Cantilever Shoring Piles with Prestressing Strands: An Experimental Approach

Authors: Hani Mekdash, Lina Jaber, Yehia Temsah

Abstract:

Underground space is becoming a necessity nowadays, especially in highly congested urban areas. Retaining underground excavations using shoring systems is essential in order to protect adjoining structures from potential damage or collapse. Reinforced Concrete Piles (RCP) supported by multiple rows of tie-back anchors are a commonly used type of shoring system in deep excavations. However, executing anchors can sometimes be challenging because they might trespass on neighboring properties or be obstructed by infrastructure and other underground facilities. A technique is proposed in this paper that involves the addition of eccentric high-strength steel strands to the RCP section through ducts, without providing the pile with lateral supports. The strands are then vertically stressed externally at the pile cap using a hydraulic jack, creating a compressive strengthening force in the concrete section. An experimental study of the behavior of a shoring wall made of pre-stressed piles, carried out during the execution of an open excavation in an urban area (Beirut city), is presented, followed by numerical analysis using finite element software. Based on the experimental results, this technique is proven to be cost-effective and provides flexible and sustainable construction of shoring works.

Keywords: deep excavation, prestressing, pre-stressed piles, shoring system

Procedia PDF Downloads 115
16490 Machine Learning Predictive Models for Hydroponic Systems: A Case Study Nutrient Film Technique and Deep Flow Technique

Authors: Kritiyaporn Kunsook

Abstract:

Machine learning algorithms (MLAs) such as artificial neural networks (ANNs), decision trees, support vector machines (SVMs), naïve Bayes, and ensemble classifiers by voting are powerful data-driven methods that are still relatively little used for classifying hydroponic growing techniques, and thus have not been comparatively evaluated together thoroughly in this field. The performances of a series of MLAs, namely ANNs, a decision tree, SVMs, naïve Bayes, and an ensemble classifier by voting, in prospectively modeling the hydroponic growing technique are compared based on the accuracy of each model. The classification covers only test samples from vegetables grown with the nutrient film technique (NFT) and the deep flow technique (DFT). The features, which are characteristics of the vegetables, comprise harvesting height and width, temperature, required light, and color. The results indicate that the classification accuracy of the ANNs is 98%, the decision tree 98%, the SVMs 97.33%, naïve Bayes 96.67%, and the ensemble classifier by voting 98.96%, respectively.
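A rough sketch of how such a comparison could be set up with scikit-learn, assuming synthetic stand-in data; the feature values, NFT/DFT label encoding, and model hyperparameters below are illustrative assumptions rather than the study's actual dataset or settings:

```python
import numpy as np
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the vegetable measurements described in the abstract:
# harvesting height, width, temperature, required light, colour index.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))
y = rng.integers(0, 2, size=300)        # 0 = NFT, 1 = DFT (label encoding assumed)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "ANN": MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
    "decision tree": DecisionTreeClassifier(random_state=0),
    "SVM": SVC(probability=True, random_state=0),
    "naive Bayes": GaussianNB(),
}
# Ensemble classifier by (soft) voting over the four base learners.
models["voting ensemble"] = VotingClassifier(
    estimators=list(models.items()), voting="soft"
)

for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: accuracy = {model.score(X_te, y_te):.3f}")
```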

Keywords: artificial neural networks, decision tree, support vector machines, naïve Bayes, ensemble classifier by voting

Procedia PDF Downloads 368
16489 Blue Economy and Marine Mining

Authors: Fani Sakellariadou

Abstract:

The Blue Economy includes all marine-based and marine-related activities. They correspond to established, emerging, as well as unborn ocean-based industries. Seabed mining is an emerging marine-based activity; its operations depend particularly on cutting-edge science and technology. The 21st century will face a crisis in resources as a consequence of the world’s population growth and the rising standard of living. The natural capital stored in the global ocean is decisive for its ability to provide a wide range of sustainable ecosystem services. Seabed mineral deposits were identified as having a high potential for critical elements and base metals. They have a crucial role in the fast evolution of green technologies. The major categories of marine mineral deposits are deep-sea deposits, including cobalt-rich ferromanganese crusts, polymetallic nodules, phosphorites, and deep-sea muds, as well as shallow-water deposits, including marine placers. Seabed mining operations may take place within the continental shelf areas of nation-states. In international waters, the International Seabed Authority (ISA) has entered into 15-year contracts for deep-seabed exploration with 21 contractors. These contracts are for polymetallic nodules (18 contracts), polymetallic sulfides (7 contracts), and cobalt-rich ferromanganese crusts (5 contracts). Exploration areas are located in the Clarion-Clipperton Zone, the Indian Ocean, the Mid-Atlantic Ridge, the South Atlantic Ocean, and the Pacific Ocean. Potential environmental impacts of deep-sea mining include habitat alteration, sediment disturbance, plume discharge, release of toxic compounds, light and noise generation, and air emissions. They could cause burial and smothering of benthic species, health problems for marine species, biodiversity loss, reduced photosynthesis, behavioral changes and masking of acoustic communication for mammals and fish, bioaccumulation of heavy metals up the food web, a decrease in dissolved oxygen content, and climate change. An important concern related to deep-sea mining is our knowledge gap regarding deep-sea bio-communities. The ecological consequences for the remote, unique, fragile, and little-understood deep-sea ecosystems and their inhabitants are still largely unknown. The blue economy conceptualizes oceans as developing spaces supplying socio-economic benefits for current and future generations but also protecting, supporting, and restoring biodiversity and ecological productivity. In that sense, people should apply holistic management and assess marine mining impacts on ecosystem services, including the categories of provisioning, regulating, supporting, and cultural services. The variety in environmental parameters, the range in sea depth, the diversity in the characteristics of marine species, and the possible proximity to other existing maritime industries mean that the impacts of marine mining on the ability of ecosystems to support people and nature span a wide range. In conclusion, the use of the untapped potential of the global ocean demands a responsible and sustainable attitude. Moreover, there is a need to change our lifestyle and move beyond the philosophy of single use. Living in a throw-away society based on a linear approach to resource consumption, humans are putting too much pressure on the natural environment. By applying modern, sustainable and eco-friendly approaches according to the principles of the circular economy, substantial natural resource savings can be achieved.
Acknowledgement: This work is part of the MAREE project, financially supported by the Division VI of IUPAC. This work has been partly supported by the University of Piraeus Research Center.

Keywords: blue economy, deep-sea mining, ecosystem services, environmental impacts

Procedia PDF Downloads 79
16488 Fabrication of Silicon Solar Cells Using All Sputtering Process

Authors: Ching-Hua Li, Sheng-Hui Chen

Abstract:

Sputtering is a popular technique with many advantages for thin film deposition. When a hydrogenated silicon thin film is fabricated by sputtering for solar cell applications, the ion bombardment during sputtering generates microstructures (voids and columnar structures) that form silicon dihydride bondings as defects. The properties of heterojunction silicon solar cells were studied by using boron grains and silicon-boron targets. Finally, a solar cell efficiency of 11.7% was achieved using an all-sputtering process.

Keywords: solar cell, sputtering process, PVD, alloy target

Procedia PDF Downloads 576
16487 Metal Ship and Robotic Car: A Hands-On Activity to Develop Scientific and Engineering Skills for High School Students

Authors: Jutharat Sunprasert, Ekapong Hirunsirisawat, Narongrit Waraporn, Somporn Peansukmanee

Abstract:

Metal Ship and Robotic Car is one of the hands-on activities in the course the Fundamentals of Engineering, and it can be divided into three parts. In the first part, the metal ships were made using engineering drawings and physics and mathematics knowledge. In the second part, the students learned how to construct a robotic car and control it using computer programming. In the last part, the students had to combine the workings of these two objects in the final testing. The aim of this study was to investigate the effectiveness of a hands-on activity that integrates Science, Technology, Engineering and Mathematics (STEM) concepts to develop scientific and engineering skills. The results showed that the majority of students felt this hands-on activity led to an increased confidence level in the integration of STEM. Moreover, 48% of all students engaged well with the STEM concepts. Students could obtain STEM knowledge through hands-on activities on the topics of science and mathematics, engineering drawing, engineering workshop and computer programming; most students agreed or strongly agreed with this learning process. This indicates that the hands-on activity “Metal Ship and Robotic Car” is a useful tool to integrate each aspect of STEM. Furthermore, hands-on activities positively influence students’ interest, which leads to increased learning achievement and to the development of scientific and engineering skills.

Keywords: hands-on activity, STEM education, computer programming, metal work

Procedia PDF Downloads 460
16486 An Improved Convolution Deep Learning Model for Predicting Trip Mode Scheduling

Authors: Amin Nezarat, Naeime Seifadini

Abstract:

Trip mode selection is a behavioral characteristic of passengers with immense importance for travel demand analysis, transportation planning, and traffic management. Identification of the trip mode distribution will allow transportation authorities to adopt appropriate strategies to reduce travel time, traffic and air pollution. The majority of existing trip mode inference models operate based on human-selected features and traditional machine learning algorithms. However, human-selected features are sensitive to changes in traffic and environmental conditions and susceptible to personal biases, which can make them inefficient. One way to overcome these problems is to use neural networks capable of extracting high-level features from raw input. In this study, the convolutional neural network (CNN) architecture is used to predict the trip mode distribution based on raw GPS trajectory data. The key innovation of this paper is the design of the layout of the CNN input layer, as well as the normalization operation, in a way that is not only compatible with the CNN architecture but can also represent the fundamental features of motion, including speed, acceleration, jerk, and bearing rate. The highest prediction accuracy achieved with the proposed configuration for the convolutional neural network with batch normalization is 85.26%.
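A minimal sketch of a 1-D CNN with batch normalization over GPS-derived motion channels, assuming PyTorch; the window length, channel ordering (speed, acceleration, jerk, bearing rate), layer sizes, and number of trip modes are illustrative assumptions, not the architecture reported in the paper:

```python
import torch
import torch.nn as nn

# Assumed input layout: a fixed-length window of GPS-derived motion features,
# with one channel each for speed, acceleration, jerk, and bearing rate.
N_CHANNELS, WINDOW = 4, 200
N_MODES = 5                               # e.g. walk, bike, bus, car, train (assumed)

class TripModeCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(N_CHANNELS, 32, kernel_size=5, padding=2),
            nn.BatchNorm1d(32),           # batch normalisation, as in the abstract
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(64, N_MODES)

    def forward(self, x):                 # x: (batch, channels, window)
        return self.classifier(self.features(x).flatten(1))

model = TripModeCNN()
dummy = torch.randn(8, N_CHANNELS, WINDOW)    # normalised feature windows
print(model(dummy).shape)                     # -> torch.Size([8, 5])
```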

Keywords: predicting, deep learning, neural network, urban trip

Procedia PDF Downloads 133
16485 Understanding the Importance of Participation in the City Planning Process and Its Influencing Factors

Authors: Louis Nwachi

Abstract:

Urban planning systems in most countries still rely on expert-driven, top-down technocratic plan-making processes rather than a public and people-led process. This paper set out to evaluate the need for public participation in the plan-making process and to highlight the factors that affect public participation in the plan-making process. In doing this, it adopted a qualitative approach based on document review and interviews taken from real-world phenomena. A case study strategy using the Metropolitan Area of Abuja, the capital of Nigeria, as the study sample was used in carrying out the research. The research finds that participation is an important tool in the plan-making process and that public engagement in the process contributes to the identification of key urban issues that are unique to the specific local areas, thereby contributing to the establishment of priorities and, in turn, to the mobilization of resources to meet the identified needs. It also finds that the development of a participation model by city authorities encourages public engagement and helps to develop trust between those in authority and the different key stakeholder groups involved in the plan-making process.

Keywords: plan-making, participation, urban planning, city

Procedia PDF Downloads 98
16484 Quality Based Approach for Efficient Biologics Manufacturing

Authors: Takashi Kaminagayoshi, Shigeyuki Haruyama

Abstract:

To improve the manufacturing efficiency of biologics, such as antibody drugs, a quality engineering framework was designed. Within this framework, critical steps and parameters in the manufacturing process were studied. Identification of these critical steps and critical parameters allows a deeper understanding of manufacturing capabilities and suggests process control standards, based on actual manufacturing capabilities, to the process development department as part of a PDCA (plan-do-check-act) cycle. This cycle can be applied to each manufacturing process so that it can be standardized, reducing the time needed to establish each new process.

Keywords: antibody drugs, biologics, manufacturing efficiency, PDCA cycle, quality engineering

Procedia PDF Downloads 340
16483 Process Optimization for Albanian Crude Oil Characterization

Authors: Xhaklina Cani, Ilirjan Malollari, Ismet Beqiraj, Lorina Lici

Abstract:

Oil characterization is an essential step in the design, simulation, and optimization of refining facilities. To achieve optimal crude selection and processing decisions, a refiner must have exact information regarding crude oil quality. This includes the crude oil TBP curve as the main data for correct operation of refinery crude oil atmospheric distillation plants. Crude oil is typically characterized based on a distillation assay. This procedure is reasonably well-defined and is based on the representation of the mixture of actual components that boil within a boiling point interval by hypothetical components that boil at the average boiling temperature of the interval. The crude oil assay typically includes TBP distillation according to ASTM D-2892, which can characterize the part of the oil that boils up to a 400 °C atmospheric equivalent boiling point. To model the yield curves obtained by physical distillation, it is necessary to compare the differences between the modelled and the experimental data. Most commercial simulators use a different number of components and pseudo-components to represent crude oil. Laboratory tests include distillations, vapor pressures, flash points, pour points, cetane numbers, octane numbers, densities, and viscosities. The aim of the study is to draw true boiling point curves for different crude oil resources in Albania and to compare the differences between the modeling and the experimental data for optimal characterization of crude oil.
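A small sketch of the pseudo-component idea described above, assuming an illustrative (not measured) TBP assay; each cut is represented by the average boiling temperature of its interval:

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

# Illustrative ASTM D-2892 style assay: cumulative volume % distilled vs.
# atmospheric-equivalent boiling temperature (values assumed, not measured data).
vol_pct = np.array([0, 5, 10, 20, 30, 40, 50, 60, 70])
tbp_C   = np.array([30, 70, 105, 150, 195, 240, 285, 335, 390])

tbp_curve = PchipInterpolator(vol_pct, tbp_C)   # monotone TBP curve

# Split the assay into pseudo-components, each represented by the average
# boiling temperature of its cut, as in the characterisation procedure described.
cuts = np.arange(0, 71, 10)
for lo, hi in zip(cuts[:-1], cuts[1:]):
    mid = 0.5 * (tbp_curve(lo) + tbp_curve(hi))
    print(f"pseudo-component {lo:2d}-{hi:2d} vol%:  Tb ~ {mid:6.1f} C")
```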

Keywords: TBP distillation curves, crude oil, optimization, simulation

Procedia PDF Downloads 299
16482 Flowsheet Development, Simulation and Optimization of Carbon-Di-Oxide Removal System at Natural Gas Reserves by Aspen–Hysys Process Simulator

Authors: Mohammad Ruhul Amin, Nusrat Jahan

Abstract:

Natural gas is a cleaner fuel compared to others, but it needs some treatment before it is in a state to be used, so natural gas purification is an integral part of any process where natural gas is used as a raw material or fuel. There are several impurities in natural gas that have to be removed before use, and CO2 is one of the major contaminants. In this project, we have removed CO2 by the amine process using an MEA solution. We have built up the whole amine process for removing CO2 in Aspen HYSYS and simulated the process. At the end of the simulation, we obtained very satisfactory results using the MEA solution for the removal of CO2. The simulation results show that the amine absorption process is able to reduce the CO2 content of the natural gas by 58%. The HYSYS optimizer allowed us to obtain a well-optimized plant; after optimization, the profit of the existing plant is increased by 2.34%. Simulation and optimization with the Aspen HYSYS simulator make an enormous amount of information available to us, which will help with further research in the future.

Keywords: Aspen–Hysys, CO2 removal, flowsheet development, MEA solution, natural gas optimization

Procedia PDF Downloads 493
16481 Assessing EU-China Security Interests from Contradiction to Convergence

Authors: Julia Gurol

Abstract:

Why do we observe a shift towards convergence in EU-China security interests? While contradicting attitudes towards key principles of inter-state and region-to-state relations, including state sovereignty, territorial integrity, and intervention policies, have long hindered EU-China inter-regional cooperation beyond the economic realm, collaboration in peace and security issues is now becoming a key pillar of European-Chinese relations. In addition, the Belt and Road Initiative, as the most ambitious Chinese foreign policy project, explicitly touches upon several European foreign policy and security preferences. Based on these counterintuitive findings, this paper traces the process of convergence of Sino-European security interests. Drawing on qualitative text analysis of official Chinese and European policy papers and documents from the establishment of diplomatic relations in 1975 until today, it assesses the striking change over time. On this basis, the paper uses theories of neo-functionalism, inter-regionalism, and securitization, and borrows from constructivist views in International Relations theory, to expound possible motives for the change in Chinese and, respectively, European preferences in the security realm. The results reveal interesting insights into the decisive factors and motives behind both sides' foreign policies. The paper concludes with a discussion of the further potential and difficulties of EU-China security cooperation.

Keywords: belt and road initiative, China, European Union, foreign policy, neo-functionalism, security

Procedia PDF Downloads 280
16480 Experimental Investigation on Freeze-Concentration Process Desalting for Highly Saline Brines

Authors: H. Al-Jabli

Abstract:

The aim of the paper was to assess the use of the freeze-melting process for disposing of highly saline brines by confirming the performance estimation of the treatment system. A laboratory bench-scale freezing technique test unit was designed, constructed, and tested at the Doha Research Plant (DRP) in Kuwait. The principal unit operations considered for the laboratory study are ice crystallization, separation, washing, and melting. The applied process is characterized as “secondary-refrigerant indirect freezing”, which utilizes the normal freezing concept. High-salinity brine from Kuwait desalination plants, with an average TDS of 250,000 ppm, was used as the feed water in the experimental study to measure the performance of the proposed treatment system. The experimental analysis shows that the freeze-melting process is capable of reducing the TDS of the feed water from 249,482 ppm to 56,880 ppm over the two stages of the process, while the overall recovery, salt passage, and salt rejection are 31.11%, 19.05%, and 80.95%, respectively. Therefore, the freeze-melting process is encouraging for the proposed application, as the results confirm the capability of the process to remove a major amount of the dissolved salts from the highly saline brine with a reasonable recovery. This process might compare reasonably with other brine disposal processes.
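A back-of-envelope check of the reported TDS figures, assuming the standard definitions salt passage = product TDS / feed TDS and salt rejection = 1 - salt passage; the abstract's quoted 19.05% and 80.95% values may follow a per-stage or otherwise different definition:

```python
# End-to-end figures quoted in the abstract.
feed_tds = 249_482      # ppm
product_tds = 56_880    # ppm after the two freeze-melting stages

salt_passage = product_tds / feed_tds     # assumed definition
salt_rejection = 1 - salt_passage

print(f"overall salt passage:   {salt_passage:.1%}")    # ~22.8%
print(f"overall salt rejection: {salt_rejection:.1%}")  # ~77.2%
```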

Keywords: high saline brine, freeze-melting process, ice crystallization, brine disposal process

Procedia PDF Downloads 263
16479 Feasibility of Voluntary Deep Inspiration Breath-Hold Radiotherapy Technique Implementation without Deep Inspiration Breath-Hold-Assisting Device

Authors: Auwal Abubakar, Shazril Imran Shaukat, Noor Khairiah A. Karim, Mohammed Zakir Kassim, Gokula Kumar Appalanaido, Hafiz Mohd Zin

Abstract:

Background: Voluntary deep inspiration breath-hold radiotherapy (vDIBH-RT) is an effective cardiac dose reduction technique during left breast radiotherapy. This study aimed to assess the accuracy of the implementation of the vDIBH technique among left breast cancer patients without the use of a special device such as a surface-guided imaging system. Methods: The vDIBH-RT technique was implemented among thirteen (13) left breast cancer patients at the Advanced Medical and Dental Institute (AMDI), Universiti Sains Malaysia. Breath-hold monitoring was performed based on breath-hold skin marks and laser light congruence observed on zoomed CCTV images from the control console during each delivery. The initial setup was verified using cone beam computed tomography (CBCT) during breath-hold. Each field was delivered using multiple beam segments to allow a delivery time of 20 seconds, which can be tolerated by patients in breath-hold. The data were analysed using an in-house developed MATLAB algorithm. The PTV margin was computed based on van Herk's margin recipe. Results: The setup errors analysed from CBCT show that the population systematic errors in the lateral (x), longitudinal (y), and vertical (z) axes were 2.28 mm, 3.35 mm, and 3.10 mm, respectively. Based on the CBCT image guidance, the planning target volume (PTV) margin that would be required for vDIBH-RT using the CCTV/laser monitoring technique is 7.77 mm, 10.85 mm, and 10.93 mm in the x, y, and z axes, respectively. Conclusion: It is feasible to safely implement vDIBH-RT among left breast cancer patients without special equipment. The breath-hold monitoring technique is cost-effective, radiation-free, easy to implement, and allows real-time breath-hold monitoring.
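A small sketch of van Herk's margin recipe (margin = 2.5 Sigma + 0.7 sigma) applied to the systematic errors and margins quoted above; the random errors are not given in the abstract, so they are back-solved here purely as an illustrative consistency check:

```python
# Van Herk margin recipe: margin = 2.5 * Sigma + 0.7 * sigma,
# where Sigma is the population systematic error and sigma the random error.
systematic = {"x": 2.28, "y": 3.35, "z": 3.10}          # mm, from the CBCT analysis
reported_margin = {"x": 7.77, "y": 10.85, "z": 10.93}   # mm, as quoted in the abstract

for axis in systematic:
    # The abstract does not quote the random errors; back-solve them from the
    # reported margins as a consistency check of the recipe.
    sigma = (reported_margin[axis] - 2.5 * systematic[axis]) / 0.7
    margin = 2.5 * systematic[axis] + 0.7 * sigma
    print(f"{axis}: implied sigma = {sigma:.2f} mm, margin = {margin:.2f} mm")
```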

Keywords: vDIBH, cone beam computed tomography, radiotherapy, left breast cancer

Procedia PDF Downloads 48
16478 Analysis and Design of Offshore Triceratops under Ultra-Deep Waters

Authors: Srinivasan Chandrasekaran, R. Nagavinothini

Abstract:

Offshore platforms for ultra-deep waters are form-dominant by design; hybrid systems with large flexibility in the horizontal plane and high rigidity in the vertical plane are preferred due to functional complexities. The offshore triceratops is a relatively new-generation offshore platform, whose deck is partially isolated from the supporting buoyant legs by ball joints. They allow transfer of partial displacements of the buoyant legs to the deck but restrain transfer of the rotational response. The buoyant legs are in turn taut-moored to the sea bed using pre-tensioned tethers. The present study discusses the detailed dynamic analysis and preliminary design of the chosen geometry, which is necessary as a proof of validation for such design applications. A detailed numerical analysis of the triceratops at 2400 m water depth under random waves is presented. The preliminary design confirms member-level design requirements under various modes of failure. The tether configuration proposed in the study confirms no pull-out of tethers, as the stress variation remains below the yield value. The presented study shall aid offshore engineers and contractors in understanding the suitability of the triceratops in terms of design and dynamic response behaviour.

Keywords: offshore structures, triceratops, random waves, buoyant legs, preliminary design, dynamic analysis

Procedia PDF Downloads 202
16477 Evaluation of Reliability, Availability and Maintainability for Automotive Manufacturing Process

Authors: Hamzeh Soltanali, Abbas Rohani, A. H. S. Garmabaki, Mohammad Hossein Abbaspour-Fard, Adithya Thaduri

Abstract:

With continuous innovation and the high complexity of technological systems, the automotive manufacturing industry is also under pressure to implement adequate management strategies regarding availability and productivity. In this context, evaluation of a system's performance using reliability, availability and maintainability (RAM) methodologies can support resilient operation, identification of the bottlenecks of the manufacturing process, and optimization of maintenance actions. In this paper, RAM parameters are evaluated for improving the operational performance of the fluid filling process. To evaluate the RAM factors through the behavior of the states defined for this process, a systematic decision framework was developed. The results of the RAM analysis revealed that improving the reliability and maintainability of the main bottlenecks at each filling workstation needs to be considered a priority. The results could be useful for improving the operational performance and sustainability of the production process.
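A minimal sketch of the basic RAM bookkeeping behind such an analysis, using the steady-state availability A = MTBF / (MTBF + MTTR); the workstation names and figures are assumptions for illustration, not data from the study:

```python
# Minimal RAM bookkeeping for a set of filling workstations (all figures assumed).
workstations = {
    "filling head 1": {"mtbf_h": 120.0, "mttr_h": 1.5},
    "filling head 2": {"mtbf_h": 95.0,  "mttr_h": 2.0},
    "capping unit":   {"mtbf_h": 200.0, "mttr_h": 0.8},
}

line_availability = 1.0
for name, ws in workstations.items():
    a = ws["mtbf_h"] / (ws["mtbf_h"] + ws["mttr_h"])   # station availability
    line_availability *= a                              # series line: every station must be up
    print(f"{name}: availability = {a:.4f}")

print(f"series line availability = {line_availability:.4f}")
```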

Keywords: automotive, performance, reliability, RAM, fluid filling process

Procedia PDF Downloads 350
16476 Speed Breaker/Pothole Detection Using Hidden Markov Models: A Deep Learning Approach

Authors: Surajit Chakrabarty, Piyush Chauhan, Subhasis Panda, Sujoy Bhattacharya

Abstract:

A large proportion of roads in India are not well maintained as per the laid-down public safety guidelines, leading to loss of direction control and fatal accidents. We propose a technique to detect speed breakers and potholes using mobile sensor data captured from multiple vehicles and to provide a profile of the road. This would, in turn, help in monitoring roads and revolutionize digital maps. Incorporating randomness into the model formulation for the detection of speed breakers and potholes is crucial due to the substantial heterogeneity observed in data obtained using a mobile application from multiple vehicles driven by different drivers. This is accomplished with Hidden Markov Models, whose hidden state sequence, found for each time step given the observable sequence, is then fed as input to an LSTM network with peephole connections. Precision scores of 0.96 and 0.63 are obtained for classifying bumps and potholes, respectively, a significant improvement over machine-learning-based models. Further visualization of bumps and potholes is done by converting the time series to images using Markov Transition Fields, where a significant demarcation between bumps and potholes is observed.
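A rough sketch of the HMM stage of such a pipeline, assuming the hmmlearn library and synthetic accelerometer-like data; the feature choice, number of hidden states, and injected events are illustrative assumptions, and the subsequent LSTM stage is omitted:

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM   # assumed dependency: pip install hmmlearn

# Synthetic stand-in for z-axis accelerometer readings from a phone in a vehicle:
# a smooth road with two injected bump-like bursts (real data would come from the app).
rng = np.random.default_rng(1)
signal = rng.normal(0.0, 0.05, size=600)
signal[200:215] += rng.normal(0.0, 0.8, size=15)   # speed breaker
signal[420:430] += rng.normal(0.0, 1.2, size=10)   # pothole
X = signal.reshape(-1, 1)

# Fit a Gaussian HMM; the assumed hidden states roughly correspond to
# {smooth road, speed breaker, pothole}.
hmm = GaussianHMM(n_components=3, covariance_type="diag", n_iter=100, random_state=1)
hmm.fit(X)
states = hmm.predict(X)                 # hidden state per time step

# In the paper this state sequence is then fed to an LSTM; here we only
# report where the non-dominant (event-like) states occur.
dominant = np.bincount(states).argmax()
event_steps = np.flatnonzero(states != dominant)
print(f"time steps flagged as bump/pothole-like events: {event_steps[:10]} ...")
```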

Keywords: deep learning, hidden Markov model, pothole, speed breaker

Procedia PDF Downloads 140
16475 Settlement of the Foundation on the Improved Soil: A Case Study

Authors: Morteza Karami, Soheila Dayani

Abstract:

Deep Soil Mixing (DSM) is a soil improvement technique that involves mechanically mixing the soil with a binder material to improve its strength, stiffness, and durability. This technique is typically used in geotechnical engineering applications where weak or unstable soil conditions exist, such as in building foundations, embankment support, or ground improvement projects. In this study, the settlement of a foundation on soil improved using the wet DSM technique has been analyzed for a case study. Before DSM production, the initial soil mixture was determined based on laboratory tests, and then the proper mix designs were optimized based on pilot-scale tests. The results show that the spacing and depth of the DSM columns depend on the soil properties, the intended loading conditions, and other factors such as the available space and equipment limitations. Moreover, the monitoring instruments installed in the pilot area verify that the settlement of the foundation remains within an acceptable range, ensuring that the soil mixture provides the required strength and stiffness to support the structure or load. As an important result, if the DSM columns touch or penetrate into the stiff soil layer, the settlement of the foundation can be significantly decreased. Furthermore, the DSM columns should be allowed to cure sufficiently before placing any significant loads on the structure to prevent excessive deformation or settlement.

Keywords: deep soil mixing, soil mixture, settlement, instrumentation, curing age

Procedia PDF Downloads 78
16474 The Impact of Artificial Intelligence in the Development of Textile and Fashion Industry

Authors: Basem Kamal Abasakhiroun Farag

Abstract:

Fashion, like many other areas of design, has undergone numerous developments over the centuries. The aim of the article is to recognize and evaluate the importance of advanced technologies in fashion design and to examine how they are transforming the role of contemporary fashion designers by transforming the creative process. It also discusses how contemporary culture is involved in such developments and how it influences fashion design in terms of conceptualization and production. The methodology used is based on examining various examples of the use of technology in fashion design and drawing parallels between what was feasible then and what is feasible today. By comparing case studies, examples of existing fashion designs, and experiences with craft methods, we observe patterns that help us predict the direction of future developments in this area. Discussing the technological elements in fashion design helps us understand the driving force behind the trend. The research presented in the article shows that there is a trend towards significantly increasing interest and progress in the field of fashion technology, leading to the emergence of hybrid artisanal methods. In summary, as fashion technologies advance, their role in clothing production is becoming increasingly important, extending far beyond the humble sewing machine.

Keywords: fashion, identity, textiles, ambient intelligence, proximity sensors, shape memory materials, sound sensing garments, wearable technology, bio textiles, fashion trends, nano textiles, new materials, smart textiles, techno textiles, fashion design, functional aesthetics, 3D printing

Procedia PDF Downloads 60
16473 Entrepreneurship Education Revised: Merging a Theory-Based and Action-Based Framework for Entrepreneurial Narratives' Impact as an Awareness-Raising Teaching Tool

Authors: Katharina Fellnhofer, Kaisu Puumalainen

Abstract:

Despite the current worldwide increase in interest in entrepreneurship education (EE), little attention has been paid to innovative web-based approaches such as the narrative approach, in which individual stories of entrepreneurs are told via multimedia to demonstrate their impact on individuals' attitudes towards entrepreneurship. In addition, there is no consensus in this research discipline regarding the effective content of teaching materials and tools. Therefore, a qualitative hypothesis-generating research contribution is required that aims at drawing new insights from published works in the EE field to serve future research related to multimedia entrepreneurial narratives. Based on this background, our effort will focus on finding support for the following introductory statement: multimedia success and failure stories of real entrepreneurs show potential to change perceptions towards entrepreneurship in a positive way. The proposed qualitative conceptual paper will introduce the underlying background for this research framework; as a qualitative hypothesis-generating contribution, it aims at drawing new insights from published works in the EE field related to entrepreneurial narratives to serve future research. By means of the triangulation of multiple theories, we will use this foundation for multimedia-based entrepreneurial narratives, applying a learning-through-multimedia-real-entrepreneurial-narratives pedagogical tool to facilitate entrepreneurship. Our effort will help to demystify how value-oriented entrepreneurs telling their stories via multimedia can simultaneously enhance EE. The paper will thus build new bridges between well-cited theoretical constructs to establish a robust research framework. Overall, the intended contribution seeks to emphasize future research on currently under-researched issues in the EE sphere, which are considered essential not only to academia but also to business and society, with future job-providing, growth-oriented entrepreneurs in mind. The authors would like to thank the Austrian Science Fund FWF: [J3740 – G27].

Keywords: entrepreneurship education, entrepreneurial attitudes and perceptions, entrepreneurial intention, entrepreneurial narratives

Procedia PDF Downloads 253
16472 Elevated Temperature Shot Peening for M50 Steel

Authors: Xinxin Ma, Guangze Tang, Shuxin Yang, Jinguang He, Fan Zhang, Peiling Sun, Ming Liu, Minyu Sun, Liqin Wang

Abstract:

As a traditional surface hardening technique, shot peening is widely used in industry. By using shot peening, a residual compressive stress is formed in the surface, which is beneficial for improving the fatigue life of metal materials. At the same time, very fine grains and high-density defects are generated in the surface layer, which enhances the surface hardness as well. However, most of these processes are carried out at room temperature, and for high-strength steel such as M50, the thickness of the strengthened layer is limited. In order to obtain a thick strengthened surface layer, elevated temperature shot peening was carried out in this work using Φ1 mm cast iron balls at a speed of 80 m/s. Considering that the tempering temperature of M50 steel is about 550 °C, the processing temperature was in the range from 300 to 500 °C. The effect of the processing temperature and processing time of shot peening on the distribution of residual stress and surface hardness was investigated. As is known, the working temperature of M50 steel can be as high as 315 °C. Because the defects formed by shot peening become unstable as the working temperature goes higher, it is worthwhile to understand what happens during the shot peening process and what happens when the strengthened samples are kept at a certain temperature. In our work, the shot peening time was selected from 2 to 10 min, and after the strengthening process, the samples were annealed at various temperatures from 200 to 500 °C for up to 60 h. The results show that the maximum residual compressive stress is near 900 MPa. Compared with room temperature shot peening, the strengthening depth of the 500 °C shot peening sample is about twice as deep. The surface hardness increased with the processing temperature, while the saturation peening time decreased. After annealing, the residual compressive stress decreases; however, for the 500 °C peening sample, even after annealing at 500 °C for 20 h, the residual compressive stress is still over 600 MPa. Moreover, it is clear from SEM that the grain size of the surface layers is still very small.

Keywords: shot peening, M50 steel, residual compressive stress, elevated temperature

Procedia PDF Downloads 451
16471 A Three-Step Iterative Process for Common Fixed Points of Three Contractive-Like Operators

Authors: Safeer Hussain Khan, H. Fukhar-ud-Din

Abstract:

The concept of quasi-contractive type operators was given by Berinde and extended by Imoru and Olatinwo, who named this new type contractive-like operators. On the other hand, Xu and Noor introduced a three-step one-mapping iterative process, which can be seen as a generalization of the Mann and Ishikawa iterative processes. Approximating common fixed points has its own importance, as it has a direct link with minimization problems. Motivated by this, in this paper, we first extend the iterative process of Xu and Noor to the case of three steps and three mappings and then prove a strong convergence result using contractive-like operators for this iterative process. In general, this generalizes the corresponding results using the Mann, Ishikawa and Xu-Noor iterative processes with quasi-contractive type operators. It is to be pointed out that our results can also be proved with iterative processes involving error terms.
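A minimal numerical sketch of a one-mapping three-step (Noor-type) scheme of the kind the paper generalizes to three mappings; the contraction T and the constant control parameters below are illustrative assumptions:

```python
# One-mapping three-step (Noor-type) iteration; the map T and the parameters
# are illustrative only and do not reproduce the paper's three-mapping scheme.
def T(x):
    return 0.5 * x + 1.0          # contraction with fixed point x* = 2

x = 10.0                           # arbitrary starting point
alpha = beta = gamma = 0.7         # assumed constant control sequences

for n in range(50):
    z = (1 - gamma) * x + gamma * T(x)
    y = (1 - beta) * x + beta * T(z)
    x = (1 - alpha) * x + alpha * T(y)

print(f"iterate after 50 steps: {x:.10f} (fixed point of T is 2)")
```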

Keywords: contractive-like operator, iterative process, common fixed point, strong convergence

Procedia PDF Downloads 587
16470 Requirements Management in Agile

Authors: Ravneet Kaur

Abstract:

The concept of Agile Requirements Engineering and Management is not new. However, the struggle to figure out how a traditional Requirements Management Process fits within an Agile framework remains complex. This paper describes a process that can merge an organization’s traditional Requirements Management Process into the Agile Software Development Process. This process provides traceability of the Product Backlog to the external documents on one hand and to User Stories on the other hand. It also gives sufficient evidence that the system will deliver the right functionality with good quality, in the form of various statistics and reports. In a nutshell, by overlaying a process on top of Agile, without disturbing the Agility, we are able to get synergistic benefits in terms of productivity, profitability, reporting, and end-to-end visibility for all stakeholders. The framework can be used for just-in-time requirements definition or to build a repository of requirements for future use. The goal is to make sure that the business (specifically, the product owner) can clearly articulate what needs to be built and define what is of high quality. To accomplish this, the requirements cycle follows a Scrum-like process that mirrors the development cycle but stays two to three steps ahead. The goal is to create a process by which requirements can be thoroughly vetted, organized, and communicated in a manner that is iterative, timely, and quality-focused. Agile is quickly becoming the most popular way of developing software because it fosters continuous improvement, time-boxed development cycles, and faster delivery of value to end users. That value will be driven to a large extent by the quality and clarity of the requirements that feed the software development process. An agile, lean, and timely approach to requirements as the starting point will help to ensure that the process is optimized.

Keywords: requirements management, Agile

Procedia PDF Downloads 365
16469 Embedded Visual Perception for Autonomous Agricultural Machines Using Lightweight Convolutional Neural Networks

Authors: René A. Sørensen, Søren Skovsen, Peter Christiansen, Henrik Karstoft

Abstract:

Autonomous agricultural machines act in stochastic surroundings and therefore must be able to perceive the surroundings in real time. This perception can be achieved using image sensors combined with advanced machine learning, in particular deep learning. Deep convolutional neural networks excel in labeling and perceiving color images, and since the cost of high-quality RGB cameras is low, the hardware cost of good perception depends heavily on memory and computation power. This paper investigates the possibility of designing lightweight convolutional neural networks for semantic segmentation (pixel-wise classification) with reduced hardware requirements, to allow for embedded usage in autonomous agricultural machines. Using compression techniques, a lightweight convolutional neural network is designed to perform real-time semantic segmentation on an embedded platform. The network is trained on two large datasets, ImageNet and Pascal Context, to recognize up to 400 individual classes. The 400 classes are remapped into agricultural superclasses (e.g. human, animal, sky, road, field, shelterbelt and obstacle), and the ability to provide accurate real-time perception of agricultural surroundings is studied. The network is applied to the case of autonomous grass mowing using the NVIDIA Tegra X1 embedded platform. Feeding case-specific images to the network results in a fully segmented map of the superclasses in the image. As the network is still being designed and optimized, only a qualitative analysis of the method is complete at the abstract submission deadline. Following this deadline, the finalized design is quantitatively evaluated on 20 annotated grass mowing images. Lightweight convolutional neural networks for semantic segmentation can be implemented on an embedded platform and show competitive performance with regard to accuracy and speed. It is feasible to provide cost-efficient perceptive capabilities related to semantic segmentation for autonomous agricultural machines.
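A minimal sketch of a lightweight segmentation network built from depthwise-separable convolutions, assuming PyTorch; the layer sizes, input resolution, and the seven superclasses are illustrative assumptions and do not reproduce the compressed network described in the paper:

```python
import torch
import torch.nn as nn

# Depthwise-separable convolution block, a common building block for
# lightweight segmentation backbones; layer sizes here are illustrative.
class SeparableBlock(nn.Module):
    def __init__(self, c_in, c_out):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(c_in, c_in, 3, padding=1, groups=c_in, bias=False),  # depthwise
            nn.Conv2d(c_in, c_out, 1, bias=False),                          # pointwise
            nn.BatchNorm2d(c_out),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.block(x)

class TinySegNet(nn.Module):
    """Tiny encoder-decoder producing per-pixel logits for the superclasses."""
    def __init__(self, n_classes=7):   # e.g. human, animal, sky, road, field, shelterbelt, obstacle
        super().__init__()
        self.enc1 = SeparableBlock(3, 16)
        self.enc2 = SeparableBlock(16, 32)
        self.pool = nn.MaxPool2d(2)
        self.dec = SeparableBlock(32, 16)
        self.head = nn.Conv2d(16, n_classes, 1)
        self.up = nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)

    def forward(self, x):
        x = self.pool(self.enc1(x))
        x = self.enc2(x)
        x = self.up(self.dec(x))
        return self.head(x)            # (batch, n_classes, H, W)

logits = TinySegNet()(torch.randn(1, 3, 240, 320))
print(logits.shape)                    # -> torch.Size([1, 7, 240, 320])
```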

Keywords: autonomous agricultural machines, deep learning, safety, visual perception

Procedia PDF Downloads 390
16468 From Poverty to Progress: A Comparative Analysis of Mongolia with PEER Countries

Authors: Yude Wu

Abstract:

Mongolia, grappling with significant socio-economic challenges, faces pressing issues of inequality and poverty, as evidenced by a high Gini coefficient and the highest poverty rate among the top 20 largest Asian countries. Despite government efforts, Mongolia's poverty rate experienced only a slight reduction from 29.6 percent in 2016 to 27.8 percent in 2020. PEER countries, such as South Africa, Botswana, Kazakhstan, and Peru, share characteristics with Mongolia, including reliance on the mining industry and classification as lower middle-income countries. Successful transitions of these countries to upper middle-income status between 1994 and the 2010s provide valuable insights. Drawing on secondary analyses of existing research and PEER country profiles, the study evaluates past policies, identifies gaps in current approaches, and proposes recommendations to combat poverty sustainably. The hypothesis includes a reliance on the mining industry and a transition from lower to upper middle-income status. Policies from these countries, such as the GEAR policy in South Africa and economic diversification in Botswana, offer insights into Mongolia's development. This essay aims to illuminate the multidimensional nature of underdevelopment in Mongolia through a secondary analysis of existing research and PEER country profiles, evaluating past policies, identifying gaps in current approaches, and providing recommendations for sustainable progress. Drawing inspiration from PEER countries, Mongolia can implement policies such as economic diversification to reduce vulnerability and create stable job opportunities. Emphasis on infrastructure, human capital, and strategic partnerships for Foreign Direct Investment (FDI) aligns with successful strategies implemented by PEER countries, providing a roadmap for Mongolia's development objectives.

Keywords: inequality, PEER countries, comparative analysis, nomadic animal husbandry, sustainable growth

Procedia PDF Downloads 61
16467 A Study on Stochastic Integral Associated with Catastrophes

Authors: M. Reni Sagayaraj, S. Anand Gnana Selvam, R. Reynald Susainathan

Abstract:

We analyze stochastic integrals associated with a mutation process. To be specific, we describe the cell population process and derive the differential equations for the joint generating functions of the number of mutants and their integrals, together with their applications. We obtain the first-order moment structure of X(t) and Y(t) for the two-way mutation process and the second-order moments for a one-way mutation process. In this paper, we also obtain the limiting behaviour of the integrals and the limiting distributions of X(t) and Y(t).

Keywords: stochastic integrals, single–server queue model, catastrophes, busy period

Procedia PDF Downloads 639
16466 Innovation in PhD Training in the Interdisciplinary Research Institute

Authors: B. Shaw, K. Doherty

Abstract:

The Cultural Communication and Computing Research Institute (C3RI) is a diverse multidisciplinary research institute including art, design, media production, communication studies, computing and engineering. Across these disciplines it can seem like there are enormous differences of research practice and convention, including differing positions on objectivity and subjectivity, certainty and evidence, and different political and ethical parameters. These differences sit within, often unacknowledged, histories, codes, and communication styles of specific disciplines, and it is all these aspects that can make understanding of research practice across disciplines difficult. To explore this, a one-day event was orchestrated, testing how a PhD community might communicate and share research in progress in a multi-disciplinary context. Instead of presenting results at a conference, research students were tasked with articulating their method of inquiry. A working party of students from across disciplines had to design a conference call, visual identity and an event framework that would work for students across all disciplines. The process of establishing the shape and identity of the conference was revealing. Even finding a linguistic frame that would meet the expectations of different disciplines for the conference call was challenging. The first abstracts submitted either resorted to reporting findings, or only described method briefly. It took several weeks of supported intervention for research students to get ‘inside’ their method and to understand their research practice as a process rich with philosophical and practical decisions and implications. In response to the abstracts the conference committee generated key methodological categories for conference sessions, including sampling, capturing ‘experience’, ‘making models’, researcher identities, and ‘constructing data’. Each session involved presentations by visual artists, communications students and computing researchers with inter-disciplinary dialogue, facilitated by alumni Chairs. The apparently simple focus on method illuminated research process as a site of creativity, innovation and discovery, and also built epistemological awareness, drawing attention to what is being researched and how it can be known. It was surprisingly difficult to limit students to discussing method, and it was apparent that the vocabulary available for method is sometimes limited. However, by focusing on method rather than results, the genuine process of research, rather than one constructed for approval, could be captured. In unlocking the twists and turns of planning and implementing research, and the impact of circumstance and contingency, students had to reflect frankly on successes and failures. This level of self- and public critique emphasised the degree of critical thinking and rigour required in executing research and demonstrated that honest reportage of research, faults and all, is good valid research. The process also revealed the degree to which disciplines can learn from each other: the computing students gained insights from the sensitive social contextualizing generated by communications and art and design students, and art and design students gained understanding from the greater ‘distance’ and emphasis on application that computing students applied to their subjects.
Finding the means to develop dialogue across disciplines makes researchers better equipped to devise and tackle research problems across disciplines, potentially laying the ground for more effective collaboration.

Keywords: interdisciplinary, method, research student, training

Procedia PDF Downloads 203
16465 Process Mining as an Ecosystem Platform to Mitigate a Deficiency of Processes Modelling

Authors: Yusra Abdulsalam Alqamati, Ahmed Alkilany

Abstract:

The teaching staff is a distinct group whose impact on the educational process is significant and which plays an important role in enhancing the quality of academic education. To improve the management effectiveness of the academy, the Teaching Staff Management System (TSMS) proposes that all teacher processes be digitized. While the BPMN approach can accurately describe processes, it lacks a clear picture of the process flow map, something that the process mining approach provides by extracting information from event logs for discovery, monitoring, and model enhancement. Therefore, these two methodologies were combined to create the most accurate representation of system operations: the ability to extract data records, mine the processes, recreate them in the form of a Petri net, and then generate a BPMN model for a more in-depth view of the process flow. Additionally, the TSMS processes will be orchestrated to handle all requests within a guaranteed short time thanks to the integration of the Google Cloud Platform (GCP) and the BPM engine, allowing business owners to take part throughout the entire TSMS project development lifecycle.
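A short sketch of the discovery step described above, assuming the pm4py library's simplified interface; the function names reflect recent pm4py releases and the log path is a placeholder, both of which are assumptions:

```python
import pm4py

# Hypothetical event log exported from the TSMS (path is a placeholder).
log = pm4py.read_xes("tsms_event_log.xes")

# Discover a Petri net from the log, then convert it to a BPMN model for
# the more familiar process-flow view described in the abstract.
net, initial_marking, final_marking = pm4py.discover_petri_net_inductive(log)
bpmn_model = pm4py.convert_to_bpmn(net, initial_marking, final_marking)

pm4py.view_bpmn(bpmn_model)   # render the discovered BPMN diagram
```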

Keywords: process mining, BPM, business process model and notation, Petri net, teaching staff, Google Cloud Platform

Procedia PDF Downloads 136
16464 The Effect of Oxidation Stability Improvement in Calophyllum Inophyllum Palm Oil Methyl Ester Production

Authors: Natalina, Hwai Chyuan Onga, W. T. Chonga

Abstract:

Oxidation stability of biodiesel is very important in fuel handling, especially for remote locations of biodiesel application. The variety of feedstocks and biodiesel production processes results in a wide variation of biodiesel oxidation stability. The current study investigates the impact of the fatty acid composition, as determined by the feedstock and the production process of calophyllum inophyllum palm oil methyl ester, on the improvement of biodiesel oxidation stability. Firstly, biodiesel was produced from crude palm oil, calophyllum inophyllum oil, and a mixture of calophyllum inophyllum and palm oil. The production of calophyllum inophyllum palm oil methyl ester (CIPOME) was carried out both with and without a washing step. Secondly, the oxidation stability was measured for the palm oil methyl ester (POME), the calophyllum inophyllum methyl ester (CIME), the CIPOME produced with the washing process, and the CIPOME produced without the washing process. Then, in order to find the differences in fatty acid composition, all of the biodiesels were analysed by gas chromatography. It was found that mixing calophyllum inophyllum into palm oil increased the oxidation stability. The washing process influenced the CIPOME fatty acid composition, and omitting the washing step during production gave a significant increase in the oxidation stability of CIPOME (from 38 h to 114 h).

Keywords: biodiesel, oxidation stability, calophyllum inophyllum, water content

Procedia PDF Downloads 267