Search results for: welding process selection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17044

16114 A Conceptual Design of Freeze Desalination Using Low Cost Refrigeration

Authors: Parul Sahu

Abstract:

In recent years, seawater desalination has emerged as a potential resource to circumvent water scarcity, especially in coastal regions. Among the various methods, thermal evaporation or distillation and membrane operations like Reverse Osmosis (RO) have been exploited at commercial scale. However, the energy cost and maintenance expenses associated with these processes remain high. In this context, Freeze Desalination (FD), subject to the availability of low-cost refrigeration, offers an exciting alternative. Liquefied Natural Gas (LNG) regasification terminals provide an opportunity to utilize the refrigeration available from the regasification of LNG. This work presents the conceptualization and development of a process scheme integrating ice- and hydrate-based FD with the LNG regasification process. This integration overcomes the high energy demand associated with FD processes by utilizing the refrigeration associated with LNG regasification. An optimal process scheme was obtained by performing process simulation using the ASPEN PLUS simulator. The results indicated that the proposed process requires only 1 kWh/m³ of energy when the available refrigeration is fully utilized. In addition, a sensitivity analysis was performed to study the effect of various process parameters on water recovery and energy consumption for the proposed process. The results show that the energy consumption decreases by 30% with an increase in water recovery from 30% to 60%. However, due to operational limitations associated with ice and hydrate handling in seawater, the water recovery cannot simply be maximized but must be optimized. The proposed process can potentially be used to desalinate seawater in integration with an LNG regasification terminal.

Keywords: freeze desalination, liquefied natural gas regasification, process simulation, refrigeration

Procedia PDF Downloads 131
16113 Centralizing the Teaching Process in Intelligent Tutoring System Architectures

Authors: Nikolaj Troels Graf Von Malotky, Robin Nicolay, Alke Martens

Abstract:

There exists a plethora of architectures for ITSs (Intelligent Tutoring Systems). A thorough analysis and comparison of these architectures revealed that in most cases the architecture extensions have grown evolutionarily, reflecting the state-of-the-art trends of each decade. However, from the perspective of software engineering, the main aspect of an ITS has not yet been reflected in any of these architectures. From the perspective of cognitive research, the construction of the teaching process is what makes an ITS 'intelligent' regarding the spectrum of interaction with the students. Thus, in our approach, we focus on a behavior-based architecture, which is built around the main teaching processes. To create a new general architecture for ITSs, we have to define the prerequisites. This paper analyzes the current state of the existing architectures and derives rules for the behavior of ITSs. It also presents a teaching process for ITSs to be used together with the architecture.

Keywords: intelligent tutoring, ITS, tutoring process, system architecture, interaction process

Procedia PDF Downloads 384
16112 Time and Energy Saving Kitchen Layout

Authors: Poonam Magu, Kumud Khanna, Premavathy Seetharaman

Abstract:

The two important resources of any worker performing any type of work at any workplace are time and energy. These are important inputs of the worker and need to be utilised in the best possible manner. The kitchen is an important workplace where the homemaker performs many essential activities. Its layout should be so designed that optimum use of her resources can be achieved. Ideally, the shape of the kitchen, as determined by the physical space enclosed by the four walls, can be square, rectangular or irregular. But it is the shape of the arrangement of the counter that one normally refers to while talking of the layout of the kitchen. The arrangement can be along a single wall, along two opposite walls, L-shaped, U-shaped or even an island. A study was conducted in 50 kitchens belonging to middle-income group families. These were DDA-built kitchens located in North, South, East and West Delhi. The study was conducted in three phases. In the first phase, 510 non-working homemakers were interviewed. Data related to the personal characteristics of the homemakers were collected. Additional information was also collected regarding the kitchens: their size, shape, etc. The homemakers were also questioned about various aspects related to meal preparation: the people performing the task, the number of items cooked, the areas used for meal preparation, etc. In the second phase, a suitable technique was designed for conducting a time and motion study in the kitchen while the meal was being prepared. This technique was called the Path Process Chart. The final phase was carried out in 50 kitchens. The criterion for selection was that all items for a meal should be cooked at the same time. All the meals were cooked by the homemakers in their own kitchens. The meal preparation was studied using the Path Process Chart technique. The data collected were analysed and conclusions drawn. It was found that of all the shapes, it was the kitchen with the L-shaped arrangement in which, on average, a homemaker spent the minimum time on meal preparation and also travelled the minimum distance. The average distance travelled in an L-shaped layout was 131.1 m as compared to 181.2 m in a U-shaped layout. Similarly, 48 minutes was the average time spent on meal preparation in an L-shaped layout as compared to 53 minutes in a U-shaped layout. Thus, the L-shaped layout was more time- and energy-saving than the U-shaped one.

Keywords: kitchen layout, meal preparation, path process chart technique, workplace

Procedia PDF Downloads 206
16111 Value in Exchange: The Importance of Users Interaction as the Center of User Experiences

Authors: Ramlan Jantan, Norfadilah Kamaruddin, Shahriman Zainal Abidin

Abstract:

In this era of technology, the co-creation method has become a new development trend. In this light, most design businesses have transformed their development strategy from goods-dominant to service-dominant, where more attention is given to end-users and their roles in the development process. As a result, the conventional development process has been replaced with a more cooperative one. Consequently, numerous studies have been conducted to explore the extension of the co-creation method in the design development process, and most have focused on issues found during the production process. In the meantime, this study aims to investigate potential values established during the pre-production process, which is also known as 'circumstances value creation'. User involvement at the entry level of the pre-production process, where value is exchanged in joint spheres and user experiences take place, is questioned and critically debated. Thus, this paper proposes a potential framework of the co-creation method for Malaysian interactive product development. The framework is formulated from both parties involved: the users and the designers. The framework will clearly explain the value of the co-creation method, and it could assist relevant design industries and companies in developing a blueprint for the design process. This paper further contributes to the literature on the co-creation of value and digital ecosystems.

Keywords: co-creation method, co-creation framework, co-creation, co-production

Procedia PDF Downloads 178
16110 Automatic Detection of Defects in Ornamental Limestone Using Wavelets

Authors: Maria C. Proença, Marco Aniceto, Pedro N. Santos, José C. Freitas

Abstract:

A methodology based on wavelets is proposed for the automatic location and delimitation of defects in limestone plates. Natural defects include dark colored spots, crystal zones trapped in the stone, areas of abnormal contrast colors, cracks or fracture lines, and fossil patterns. Although some of these may or may not be considered defects according to the intended use of the plate, the goal is to pair each stone with a map of defects that can be overlaid on a computer display. These layers of defects constitute a database that allows the preliminary selection of matching tiles of a particular variety, with specific dimensions, for a requirement of N square meters, to be done on a desktop computer rather than by a two-hour search in the storage park, with human operators manipulating stone plates as large as 3 m x 2 m and weighing about one ton. Accident risks and work times are reduced, with a consequent increase in productivity. The basis of the algorithm is wavelet decomposition executed on two instances of the original image to detect both hypotheses – dark and clear defects. The existence and/or size of these defects is the gauge used to classify the quality grade of the stone products. The tuning of parameters that is possible within the wavelet framework corresponds to different levels of accuracy in the drawing of the contours and in the selection of defect size, which allows the map of defects to be used to cut a selected stone into tiles with minimum waste, according to the size of defects allowed.
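
The core of the method is a wavelet decomposition whose approximation and detail coefficients separate slow brightness anomalies (dark or light zones) from sharp local contrast (cracks, fracture lines). A minimal Python sketch of this general idea is given below, using PyWavelets with an assumed Haar wavelet and a simple mean-plus-k-sigma threshold; these choices are illustrative, not the authors' tuned pipeline.

```python
# Minimal sketch of wavelet-based defect flagging (illustrative parameters,
# not the authors' tuned pipeline).
import numpy as np
import pywt

def defect_mask(gray, wavelet="haar", k=3.0):
    """Flag pixels whose brightness or local contrast deviates strongly from the plate texture."""
    cA, (cH, cV, cD) = pywt.dwt2(gray.astype(float), wavelet)
    detail = np.abs(cH) + np.abs(cV) + np.abs(cD)          # local contrast: cracks, fracture lines
    dark  = cA < cA.mean() - k * cA.std()                  # dark spots
    light = cA > cA.mean() + k * cA.std()                  # abnormally clear zones
    lines = detail > detail.mean() + k * detail.std()
    mask_small = dark | light | lines
    # upsample the half-resolution mask back to image size
    mask = np.kron(mask_small, np.ones((2, 2), dtype=bool))
    return mask[:gray.shape[0], :gray.shape[1]]

if __name__ == "__main__":
    plate = np.random.rand(256, 256) * 20 + 120            # stand-in for a scanned plate
    plate[100:110, 120:140] -= 60                          # synthetic dark defect
    print("flagged pixels:", defect_mask(plate).sum())
```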

Keywords: automatic detection, defects, fracture lines, wavelets

Procedia PDF Downloads 247
16109 Reduce, Reuse and Recycle: Grand Challenges in Construction Recovery Process

Authors: Abioye A. Oyenuga, Rao Bhamidiarri

Abstract:

Running a successful Construction and Demolition Waste (C&DW) recycling operation around the globe is a challenge today, predominantly because secondary materials markets are yet to be integrated. Reducing, reusing and recycling of C&DW have been employed over the years, and various techniques have been investigated. However, the economic and environmental viability of their application seems limited. This paper discusses the costs and benefits of using secondary materials and focuses on investigating the reuse and recycling process for five major types of construction materials: concrete, metal, wood, cardboard/paper, and plasterboard. Data obtained from demolition specialists and contractors are considered and evaluated. With this data source, the research found that the construction material recovery process fully incorporates the 3R process and shows how energy recovery by means of the 3R principles can be evaluated. This scrutiny leads to the identification of grand challenges in the construction material recovery process. Recommendations to deepen the material recovery process are also discussed.

Keywords: construction and demolition waste (C&DW), 3R concept, recycling, reuse, waste management, UK

Procedia PDF Downloads 428
16108 Extremophilic Amylases of Mycelial Fungi Strains Isolated in South Caucasus for Starch Processing

Authors: T. Urushadze, R. Khvedelidze, L. Kutateladze, M. Jobava, T. Burduli, T. Alexidze

Abstract:

There is increasing interest in reliable, wasteless, ecologically friendly technologies. About 40% of the enzymes produced all over the world are used for the production of syrups with a high concentration of glucose-fructose. One such technology involves obtaining the fermentable sugar glucose from raw materials containing starch by means of amylases. In modern alcohol-producing factories this process runs in two steps, involving two enzymes of different origin: bacterial α-amylase and fungal glucoamylase, as fungal amylases are generally less thermostable than bacterial amylases. Selection of an enzyme preparation with both α- and glucoamylase activities that is stable and operable at 70°C and higher temperatures will allow this process to be conducted in one step. The S. Durmishidze Institute of Biochemistry and Biotechnology owns a unique collection of mycelial fungi isolated from different ecological niches of the Caucasus. As a result of screening our collection, 39 amylase-producing strains were revealed. Most of them belong to the genus Aspergillus. The optimum temperatures of action of selected amylases from three producers were established to be within the range 67-80°C. A. niger B-6 showed higher α-amylase activity at 67°C and glucoamylase activity at 62°C; A. niger 6-12 showed higher α-amylase activity at 72°C and glucoamylase activity at 65°C; Aspergillus niger p8-3 showed higher activities at 82°C and 70°C for α-amylase and glucoamylase, respectively. The exhaustive hydrolysis of starch solutions of different concentrations (3, 5, 15, and 30%) with the cultural liquid and technical preparation of the Aspergillus niger p8-3 enzyme was studied. At low concentrations, exhaustive hydrolysis of starch lasts 40-60 minutes; at high concentrations, hydrolysis takes longer. A 98.6% yield of glucose can be reached with 12 hours of incubation with the enzyme cultural liquid and 8 hours of incubation with the technical preparation of the enzyme, with a gradual increase of temperature from 50°C to 82°C during the first 20 minutes and a further decrease of temperature to 70°C. Setting a temperature profile that gives a high yield of glucose and high hydrolysis (pasteurizing), optimal for the activity of these strains, is the prerequisite for carrying out the hydrolysis of starch to glucose in one step and, consequently, using one strain, which will be economically justified.

Keywords: amylase, glucose hydrolysis, stability, starch

Procedia PDF Downloads 350
16107 Input-Output Analysis in Laptop Computer Manufacturing

Authors: H. Z. Ulukan, E. Demircioğlu, M. Erol Genevois

Abstract:

The scope of this paper and the aim of the proposed model are to apply monetary Input-Output (I-O) analysis to point out the importance of reusing know-how and other requirements in order to reduce the production costs in a manufacturing process for a laptop computer. The I-O approach using the monetary input-output model is employed to demonstrate the impacts of different factors in a manufacturing process. A sensitivity analysis showing the correlation between these different factors is also presented. It is expected that the recommended model would have an advantageous effect on the cost minimization process.
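
For readers unfamiliar with the calculation behind a monetary I-O analysis, the sketch below applies the standard Leontief model, x = (I − A)⁻¹ d, to a hypothetical three-sector breakdown of laptop production; the sectors, coefficients, and the 1% perturbation used for the sensitivity check are assumptions for illustration, not the paper's data.

```python
import numpy as np

# Hypothetical technical coefficients A[i, j]: money of input i needed per unit output of sector j
A = np.array([
    [0.10, 0.05, 0.20],   # components
    [0.15, 0.10, 0.05],   # assembly / know-how reuse
    [0.05, 0.20, 0.10],   # logistics
])
final_demand = np.array([100.0, 50.0, 30.0])   # assumed final demand per sector

# Leontief model: total output x satisfies x = A x + d, hence x = (I - A)^-1 d
leontief_inverse = np.linalg.inv(np.eye(3) - A)
total_output = leontief_inverse @ final_demand
print("total output per sector:", np.round(total_output, 2))

# Sensitivity: how total output reacts to a 1% increase in each sector's input coefficients
for j in range(3):
    A_pert = A.copy()
    A_pert[:, j] *= 1.01
    x_pert = np.linalg.inv(np.eye(3) - A_pert) @ final_demand
    print(f"sector {j}: output change {np.round(x_pert - total_output, 2)}")
```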

Keywords: input-output analysis, monetary input-output model, manufacturing process, laptop computer

Procedia PDF Downloads 391
16106 The Effects of Transformational Leadership on Process Innovation through Knowledge Sharing

Authors: Sawsan J. Al-Husseini, Talib A. Dosa

Abstract:

Transformational leadership has been identified as the most important factor affecting innovation and knowledge sharing; it leads to increased goal-directed behavior exhibited by followers and thus to enhanced performance and innovation for the organization. However, there is a lack of models linking transformational leadership, knowledge sharing, and process innovation within higher education (HE) institutions in developing countries in general, and in Iraq in particular. This research aims to examine the mediating role of knowledge sharing in the relationship between transformational leadership and process innovation. A quantitative approach was taken and 254 usable questionnaires were collected from public HE institutions in Iraq. Structural equation modelling with AMOS 22 was used to analyze the causal relationships among the factors. The research found that knowledge sharing plays a pivotal role in the relationship between transformational leadership and process innovation, and that transformational leadership would be ideal in an educational context, promoting knowledge sharing activities and influencing process innovation in public HE in Iraq. The research has developed some guidelines for researchers as well as leaders and provides evidence to support the use of transformational leadership to increase process innovation within the HE environment in developing countries, particularly Iraq.
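
The mediation logic being tested can be made concrete with a small simulation. The sketch below is not the AMOS structural equation model itself; it approximates the same question in regression form, estimating the indirect effect of transformational leadership (TL) on process innovation (PI) through knowledge sharing (KS) with a bootstrap confidence interval. The variable names and simulated data are assumptions.

```python
# Hedged sketch: bootstrap test of the indirect effect TL -> KS -> PI on simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 254
TL = rng.normal(size=n)
KS = 0.6 * TL + rng.normal(scale=0.8, size=n)                     # knowledge sharing driven by leadership
PI = 0.5 * KS + 0.1 * TL + rng.normal(scale=0.8, size=n)          # innovation driven mostly via KS

def indirect_effect(tl, ks, pi):
    a = sm.OLS(ks, sm.add_constant(tl)).fit().params[1]                           # TL -> KS
    b = sm.OLS(pi, sm.add_constant(np.column_stack([ks, tl]))).fit().params[1]    # KS -> PI, controlling for TL
    return a * b

boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)                                   # resample respondents with replacement
    boot.append(indirect_effect(TL[idx], KS[idx], PI[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect {indirect_effect(TL, KS, PI):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```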

Keywords: transformational leadership, knowledge sharing, process innovation, structural equation modelling, developing countries

Procedia PDF Downloads 335
16105 Enhancing Higher Education Teaching and Learning Processes: Examining How Lecturer Evaluation Make a Difference

Authors: Daniel Asiamah Ameyaw

Abstract:

This research investigates how lecturer evaluation makes a difference in enhancing higher education teaching and learning processes. The research questions guiding this work are, first, "What are the perspectives on the difference made by evaluating academic teachers in order to enhance higher education teaching and learning processes?" and, second, "What are the implications of the findings for policy and practice?" Data for this research were collected mainly through interviews and partly through document review. Data analysis was conducted within the framework of grounded theory. The findings showed that at the individual lecturer level, lecturer evaluation provides continuous improvement of teaching strategies and serves as a source of data for research on teaching. At the individual student level, it enhances the students' learning process, serves as a source of information for course selection by students, and makes students feel recognised in the educational process. At the institutional level, lecturer evaluation is useful in personnel and management decision making; it assures stakeholders of quality teaching and learning by setting up standards for lecturers; and it enables institutions to identify skill requirements and needs as a basis for organising workshops. Lecturer evaluation is useful at the national level in terms of guaranteeing the competencies of graduates, who then provide the needed manpower of the nation. Besides, resource allocation to higher education institutions is based largely on the quality of the programmes being run by the institution. The researcher concluded that the findings have implications for policy and practice; therefore, higher education managers are expected to ensure that policy is implemented as planned by policy-makers so that the objectives can successfully be achieved.

Keywords: academic quality, higher education, lecturer evaluation, teaching and learning processes

Procedia PDF Downloads 143
16104 Determining the Most Efficient Test Available in Software Testing

Authors: Qasim Zafar, Matthew Anderson, Esteban Garcia, Steven Drager

Abstract:

Software failures can present an enormous detriment to people's lives and cost millions of dollars to repair when they are unexpectedly encountered in the wild. Although a significant portion of the software development lifecycle and its resources is dedicated to testing, software failures remain a relatively frequent occurrence. Nevertheless, the evaluation of testing effectiveness remains at the forefront of ensuring high-quality software, and software metrics play a critical role in providing valuable insights into quantifiable objectives to assess the level of assurance and confidence in the system. As the selection of appropriate metrics can be an arduous process, the goal of this paper is to shed light on the significance of software metrics by examining a range of testing techniques and metrics as well as identifying key areas for improvement. Additionally, through this investigation, readers will gain a deeper understanding of how metrics can help to drive informed decision-making on delivering high-quality software and facilitate continuous improvement in testing practices.

Keywords: software testing, software metrics, testing effectiveness, black box testing, random testing, adaptive random testing, combinatorial testing, fuzz testing, equivalence partition, boundary value analysis, white box testing

Procedia PDF Downloads 89
16103 Developing a Machine Learning-based Cost Prediction Model for Construction Projects using Particle Swarm Optimization

Authors: Soheila Sadeghi

Abstract:

Accurate cost prediction is essential for effective project management and decision-making in the construction industry. This study aims to develop a cost prediction model for construction projects using Machine Learning techniques and Particle Swarm Optimization (PSO). The research utilizes a comprehensive dataset containing project cost estimates, actual costs, resource details, and project performance metrics from a road reconstruction project. The methodology involves data preprocessing, feature selection, and the development of an Artificial Neural Network (ANN) model optimized using PSO. The study investigates the impact of various input features, including cost estimates, resource allocation, and project progress, on the accuracy of cost predictions. The performance of the optimized ANN model is evaluated using metrics such as Mean Squared Error (MSE), Root Mean Squared Error (RMSE), Mean Absolute Error (MAE), and R-squared. The results demonstrate the effectiveness of the proposed approach in predicting project costs, outperforming traditional benchmark models. The feature selection process identifies the most influential variables contributing to cost variations, providing valuable insights for project managers. However, this study has several limitations. Firstly, the model's performance may be influenced by the quality and quantity of the dataset used. A larger and more diverse dataset covering different types of construction projects would enhance the model's generalizability. Secondly, the study focuses on a specific optimization technique (PSO) and a single Machine Learning algorithm (ANN). Exploring other optimization methods and comparing the performance of various ML algorithms could provide a more comprehensive understanding of the cost prediction problem. Future research should focus on several key areas. Firstly, expanding the dataset to include a wider range of construction projects, such as residential buildings, commercial complexes, and infrastructure projects, would improve the model's applicability. Secondly, investigating the integration of additional data sources, such as economic indicators, weather data, and supplier information, could enhance the predictive power of the model. Thirdly, exploring the potential of ensemble learning techniques, which combine multiple ML algorithms, may further improve cost prediction accuracy. Additionally, developing user-friendly interfaces and tools to facilitate the adoption of the proposed cost prediction model in real-world construction projects would be a valuable contribution to the industry. The findings of this study have significant implications for construction project management, enabling proactive cost estimation, resource allocation, budget planning, and risk assessment, ultimately leading to improved project performance and cost control. This research contributes to the advancement of cost prediction techniques in the construction industry and highlights the potential of Machine Learning and PSO in addressing this critical challenge. However, further research is needed to address the limitations and explore the identified future research directions to fully realize the potential of ML-based cost prediction models in the construction domain.
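
A minimal sketch of the ANN-plus-PSO idea follows: a global-best PSO searches over two hyperparameters (hidden units and learning rate) of a scikit-learn neural network on synthetic project data. The swarm settings, features, and network architecture are illustrative assumptions, not the configuration used in the study.

```python
# Hedged sketch: PSO searching ANN hyperparameters on synthetic cost data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(42)
X = rng.normal(size=(300, 5))                    # stand-ins: estimates, resources, progress, ...
y = X @ np.array([3.0, -2.0, 1.5, 0.5, 2.0]) + rng.normal(scale=0.5, size=300)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

def fitness(particle):
    hidden = int(np.clip(round(particle[0]), 2, 64))
    lr = float(np.clip(particle[1], 1e-4, 1e-1))
    model = MLPRegressor(hidden_layer_sizes=(hidden,), learning_rate_init=lr,
                         max_iter=500, random_state=0)
    model.fit(X_tr, y_tr)
    return mean_squared_error(y_te, model.predict(X_te))

# Basic global-best PSO over (hidden units, learning rate)
n_particles, n_iter, w, c1, c2 = 8, 10, 0.7, 1.5, 1.5
pos = np.column_stack([rng.uniform(2, 64, n_particles), rng.uniform(1e-4, 1e-1, n_particles)])
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos += vel
    vals = np.array([fitness(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best (hidden units, learning rate):", gbest, " test MSE:", pbest_val.min())
```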

Keywords: cost prediction, construction projects, machine learning, artificial neural networks, particle swarm optimization, project management, feature selection, road reconstruction

Procedia PDF Downloads 59
16102 A Pedagogical Case Study on Consumer Decision Making Models: A Selection of Smart Phone Apps

Authors: Yong Bum Shin

Abstract:

This case focuses on the weighted additive difference, conjunctive, disjunctive, and elimination-by-aspects methodologies in consumer decision-making models and the Simple Additive Weighting (SAW) approach in the multi-criteria decision-making (MCDM) area. Most decision-making models illustrate that the rank reversal phenomenon is unpreventable. This paper shows that rank reversal occurs in popular managerial methods such as Weighted Additive Difference (WAD), the conjunctive method, the disjunctive method, and Elimination by Aspects (EBA), as well as in MCDM methods such as Simple Additive Weighting (SAW), and finally presents the Unified Commensurate Multiple (UCM) model, which successfully addresses these rank reversal problems in the most popular MCDM methods in the decision-making area.
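
Rank reversal under SAW is easy to demonstrate numerically: with max-normalisation, adding a new alternative changes the normalisation constants and can flip the order of the existing alternatives. The sketch below uses made-up scores and weights purely to illustrate the phenomenon discussed in the case.

```python
# Hedged sketch: SAW with max-normalisation, showing rank reversal between A and B
# once an additional alternative C enters the choice set (illustrative numbers).
import numpy as np

weights = np.array([0.5, 0.5])                 # two benefit criteria, e.g. camera and battery

def saw_rank(matrix, names):
    norm = matrix / matrix.max(axis=0)         # normalise each criterion by its column maximum
    scores = norm @ weights
    order = np.argsort(-scores)
    return [(names[i], round(float(scores[i]), 3)) for i in order]

phones = np.array([[8.0, 3.0],                 # phone A
                   [5.0, 6.0]])                # phone B
print("two alternatives:", saw_rank(phones, ["A", "B"]))           # B ranks above A

phones3 = np.vstack([phones, [3.0, 9.0]])      # add alternative C
print("after adding C:  ", saw_rank(phones3, ["A", "B", "C"]))     # A now ranks above B
```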

Keywords: multiple criteria decision making, rank inconsistency, unified commensurate multiple, analytic hierarchy process

Procedia PDF Downloads 81
16101 A Clustering Algorithm for Massive Texts

Authors: Ming Liu, Chong Wu, Bingquan Liu, Lei Chen

Abstract:

Internet users face a massive amount of textual data every day. Organizing texts into categories can help users dig useful information out of large-scale text collections. Clustering, in fact, is one of the most promising tools for categorizing texts due to its unsupervised nature. Unfortunately, most traditional clustering algorithms lose their high quality on large-scale text collections. This is mainly attributable to the high-dimensional vectors generated from texts. To effectively and efficiently cluster large-scale text collections, this paper proposes a vector-reconstruction-based clustering algorithm. Only the features that can represent the cluster are preserved in the cluster's representative vector. The algorithm alternately repeats two sub-processes until it converges. One is the partial tuning sub-process, where feature weights are fine-tuned iteratively. To accelerate clustering, an intersection-based similarity measurement and its corresponding neuron adjustment function are proposed and implemented in this sub-process. The other is the overall tuning sub-process, where the features are reallocated among different clusters. In this sub-process, the features that are useless for representing the cluster are removed from the cluster's representative vector. Experimental results on three text collections (two small-scale and one large-scale) demonstrate that our algorithm obtains high quality on both small-scale and large-scale text collections.
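
The two ideas at the heart of the algorithm, sparse representative vectors that keep only cluster-relevant features and a similarity computed over the intersection of non-zero features, can be illustrated roughly as follows. This is a loose sketch of those ideas on toy documents, not the paper's full two-sub-process algorithm or its neuron adjustment function.

```python
# Hedged sketch: pruned representative vectors and an intersection-based similarity.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

docs = ["grid computing replica placement", "data grid replica selection",
        "wind power forecasting model", "gaussian process wind forecast"]
X = TfidfVectorizer().fit_transform(docs).toarray()

def intersection_sim(doc_vec, rep_vec):
    shared = (doc_vec > 0) & (rep_vec > 0)                 # only features present in both
    return float(doc_vec[shared] @ rep_vec[shared])

def representative(cluster_rows, keep=5):
    mean_vec = cluster_rows.mean(axis=0)
    rep = np.zeros_like(mean_vec)
    top = np.argsort(-mean_vec)[:keep]                     # keep only the strongest features
    rep[top] = mean_vec[top]
    return rep

labels = np.array([0, 0, 1, 1])                            # assumed initial assignment
reps = [representative(X[labels == c]) for c in (0, 1)]
for i, doc in enumerate(docs):
    sims = [intersection_sim(X[i], r) for r in reps]
    print(doc, "-> cluster", int(np.argmax(sims)))
```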

Keywords: vector reconstruction, large-scale text clustering, partial tuning sub-process, overall tuning sub-process

Procedia PDF Downloads 435
16100 Application Potential of Forward Osmosis-Nanofiltration Hybrid Process for the Treatment of Mining Waste Water

Authors: Ketan Mahawer, Abeer Mutto, S. K. Gupta

Abstract:

Mining wastewater contains inorganic metal salts, which make it saline and additionally contribute to contaminating the surface and underground freshwater reserves that exist near mineral processing industries. Therefore, treatment of the wastewater and water recovery are obligatory by any available technology before it is disposed of into the environment. Currently, reverse osmosis (RO) is the commercially accepted conventional membrane process for saline wastewater treatment, but it consumes an enormous amount of energy and makes the process expensive. To solve this industrial problem with minimum energy consumption, we tested the feasibility of a forward osmosis-nanofiltration (FO-NF) hybrid process for mining wastewater treatment. Experimental results for 0.029 M saline wastewater treated with a 0.42 M sodium-sulfate-based draw solution show that the specific energy consumption of the FO-NF process was slightly above that of standalone NF (by 0.5-1 kWh/m³). However, the average freshwater recovery was 30% higher than with standalone NF under the same feed and operating conditions. Hence, the FO-NF process in place of RO/NF offers a huge possibility for treating mining industry wastewater and concentrating the metals as by-products without consuming an excessive amount of energy; in addition, it mitigates fouling over long periods of treatment, which also decreases the maintenance and replacement cost of the separation process.

Keywords: forward osmosis, nanofiltration, mining, draw solution, divalent solute

Procedia PDF Downloads 118
16099 The Lethal Autonomy and Military Targeting Process

Authors: Serdal Akyüz, Halit Turan, Mehmet Öztürk

Abstract:

The future security environment will have new battlefields and enemies. The boundaries of the battlefield and the identity of enemies cannot easily be discerned. Politicians may not want to lose their soldiers in very risky operations. This approach will pave the way for smart machines like war robots and new drones. These machines will have decision-making ability and act simultaneously. This ability can change the military targeting process. The military targeting process (MTP) benefits from a wide scope of lethal and non-lethal weapons to reach an intended end-state. This process is now managed by people, but in the future smart machines could do it by themselves. At first sight, this development seems useful for humanity owing to the decrease in casualties in war. However, using robots - which can decide, detect, deliver and assess without human support - for homeland security and against terrorists carries crucial risks and threats. Besides, it can decrease the havoc but also increase the collateral damage. This paper examines the current use of smart war machines and the military targeting process, and presents a new approach to MTP from the point of view of the lethal autonomy concept.

Keywords: the autonomous weapon systems, the lethal autonomy, military targeting process (MTP)

Procedia PDF Downloads 428
16098 Using Gaussian Process in Wind Power Forecasting

Authors: Hacene Benkhoula, Mohamed Badreddine Benabdella, Hamid Bouzeboudja, Abderrahmane Asraoui

Abstract:

Wind is a random variable that is difficult to master; for this reason, we developed mathematical and statistical methods to model and forecast wind power. Gaussian Processes (GP) are one of the most widely used families of stochastic processes for modeling dependent data observed over time, space, or time and space. A GP describes an underlying process, formed by unobserved operations, that is used to solve a problem. The purpose of this paper is to present how to forecast wind power by using a GP. The Gaussian process method for forecasting is presented. To validate the presented approach, a simulation in the MATLAB environment is given.
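
The forecasting step can be illustrated with a small Gaussian-process regression of turbine power against wind speed; the sketch below is a Python analogue (scikit-learn) of the paper's MATLAB simulation, with an assumed RBF-plus-noise kernel and synthetic data.

```python
# Hedged sketch: GP regression of wind power vs. wind speed with predictive uncertainty.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
speed = np.sort(rng.uniform(3, 14, 60))[:, None]              # measured wind speed (m/s)
power = np.tanh((speed.ravel() - 8) / 2) * 500 + 500 \
        + rng.normal(scale=40, size=60)                       # noisy turbine power (kW)

kernel = 1.0 * RBF(length_scale=2.0) + WhiteKernel(noise_level=1.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(speed, power)

grid = np.linspace(3, 14, 5)[:, None]
mean, std = gp.predict(grid, return_std=True)                 # forecast with uncertainty band
for s, m, sd in zip(grid.ravel(), mean, std):
    print(f"{s:5.1f} m/s -> {m:7.1f} kW ± {2 * sd:.1f}")
```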

Keywords: wind power, Gaussian process, modelling, forecasting

Procedia PDF Downloads 417
16097 Random Forest Classification for Population Segmentation

Authors: Regina Chua

Abstract:

To reduce the costs of re-fielding a large survey, a Random Forest classifier was applied to measure the accuracy of classifying individuals into their assigned segments with the fewest possible questions. Given a long survey, one needed to determine the most predictive ten or fewer questions that would accurately assign new individuals to custom segments. Furthermore, the solution needed to be quick in its classification and usable in non-Python environments. In this paper, a supervised Random Forest classifier was modeled on a dataset with 7,000 individuals, 60 questions, and 254 features. The Random Forest consisted of an iterative collection of individual decision trees that result in a predicted segment with robust precision and recall scores compared to a single tree. A random 70-30 stratified sampling for training the algorithm was used, and accuracy trade-offs at different depths for each segment were identified. Ultimately, the Random Forest classifier performed at 87% accuracy at a depth of 10 with 20 instead of 254 features and 10 instead of 60 questions. With an acceptable accuracy in prioritizing feature selection, new tools were developed for non-Python environments: a worksheet with a formulaic version of the algorithm and an embedded function to predict the segment of an individual in real-time. Random Forest was determined to be an optimal classification model by its feature selection, performance, processing speed, and flexible application in other environments.
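
A compact sketch of this workflow, a stratified 70/30 split, a depth-10 random forest, and importance-based pruning to the 20 strongest features, is shown below on synthetic data; the dataset and forest size are stand-ins, not the survey data or the exact model settings.

```python
# Hedged sketch of the described workflow on synthetic data (not the survey dataset).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=7000, n_features=254, n_informative=30,
                           n_classes=5, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

full = RandomForestClassifier(n_estimators=300, max_depth=10, random_state=0).fit(X_tr, y_tr)
top20 = np.argsort(full.feature_importances_)[-20:]            # keep the 20 strongest features

slim = RandomForestClassifier(n_estimators=300, max_depth=10, random_state=0)
slim.fit(X_tr[:, top20], y_tr)
print("full-feature accuracy:", round(accuracy_score(y_te, full.predict(X_te)), 3))
print("20-feature accuracy:  ", round(accuracy_score(y_te, slim.predict(X_te[:, top20])), 3))
```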

Keywords: machine learning, supervised learning, data science, random forest, classification, prediction, predictive modeling

Procedia PDF Downloads 94
16096 3D Modeling Approach for Cultural Heritage Structures: The Case of Virgin of Loreto Chapel in Cusco, Peru

Authors: Rony Reátegui, Cesar Chácara, Benjamin Castañeda, Rafael Aguilar

Abstract:

Nowadays, heritage building information modeling (HBIM) is considered an efficient tool to represent and manage information of cultural heritage (CH). The basis of this tool relies on a 3D model generally obtained from a cloud-to-BIM procedure. There are different methods to create an HBIM model that goes from manual modeling based on the point cloud to the automatic detection of shapes and the creation of objects. The selection of these methods depends on the desired level of development (LOD), level of information (LOI), grade of generation (GOG), as well as on the availability of commercial software. This paper presents the 3D modeling of a stone masonry chapel using Recap Pro, Revit, and Dynamo interface following a three-step methodology. The first step consists of the manual modeling of simple structural (e.g., regular walls, columns, floors, wall openings, etc.) and architectural (e.g., cornices, moldings, and other minor details) elements using the point cloud as reference. Then, Dynamo is used for generative modeling of complex structural elements such as vaults, infills, and domes. Finally, semantic information (e.g., materials, typology, state of conservation, etc.) and pathologies are added within the HBIM model as text parameters and generic models families, respectively. The application of this methodology allows the documentation of CH following a relatively simple to apply process that ensures adequate LOD, LOI, and GOG levels. In addition, the easy implementation of the method as well as the fact of using only one BIM software with its respective plugin for the scan-to-BIM modeling process means that this methodology can be adopted by a larger number of users with intermediate knowledge and limited resources since the BIM software used has a free student license.

Keywords: cloud-to-BIM, cultural heritage, generative modeling, HBIM, parametric modeling, Revit

Procedia PDF Downloads 142
16095 Transformer-Driven Multi-Category Classification for an Automated Academic Strand Recommendation Framework

Authors: Ma Cecilia Siva

Abstract:

This study introduces a Bidirectional Encoder Representations from Transformers (BERT)-based machine learning model aimed at improving educational counseling by automating the process of recommending academic strands for students. The framework is designed to streamline and enhance the strand selection process by analyzing students' profiles and suggesting suitable academic paths based on their interests, strengths, and goals. Data was gathered from a sample of 200 grade 10 students, which included personal essays and survey responses relevant to strand alignment. After thorough preprocessing, the text data was tokenized, label-encoded, and input into a fine-tuned BERT model set up for multi-label classification. The model was optimized for balanced accuracy and computational efficiency, featuring a multi-category classification layer with sigmoid activation for independent strand predictions. Performance metrics showed an F1 score of 88%, indicating a well-balanced model with precision at 80% and recall at 100%, demonstrating its effectiveness in providing reliable recommendations while reducing irrelevant strand suggestions. To facilitate practical use, the final deployment phase created a recommendation framework that processes new student data through the trained model and generates personalized academic strand suggestions. This automated recommendation system presents a scalable solution for academic guidance, potentially enhancing student satisfaction and alignment with educational objectives. The study's findings indicate that expanding the data set, integrating additional features, and refining the model iteratively could improve the framework's accuracy and broaden its applicability in various educational contexts.
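
The model setup described above corresponds to a standard multi-label fine-tuning configuration in the Hugging Face transformers library, sketched below. The strand labels and the example essay are assumptions, and the training loop (optimiser, epochs, loss monitoring) is omitted; only the sigmoid scoring of independent strand probabilities is shown.

```python
# Hedged sketch: BERT configured for multi-label strand prediction with sigmoid outputs.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

strands = ["STEM", "ABM", "HUMSS", "GAS", "TVL"]               # assumed strand labels
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased",
    num_labels=len(strands),
    problem_type="multi_label_classification",                 # BCE-with-logits loss, sigmoid scoring
)

essay = "I enjoy building small electronics projects and solving math puzzles."
inputs = tokenizer(essay, return_tensors="pt", truncation=True, padding=True)
with torch.no_grad():
    probs = torch.sigmoid(model(**inputs).logits)[0]           # independent per-strand probabilities

for strand, p in zip(strands, probs):
    flag = "<-- recommend" if p.item() > 0.5 else ""
    print(f"{strand}: {p.item():.2f}  {flag}")
```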

Keywords: tokenization, sigmoid activation, transformer, multi-category classification

Procedia PDF Downloads 8
16094 Energy Efficiency Analysis of Crossover Technologies in Industrial Applications

Authors: W. Schellong

Abstract:

Industry accounts for one-third of global final energy demand. Crossover technologies (e.g. motors, pumps, process heat, and air conditioning) play an important role in improving energy efficiency. These technologies are used in many applications independent of the production branch. Electrical power, in particular, is used by drives, pumps, compressors, and lighting. The paper demonstrates the algorithm of the energy analysis through selected case studies for typical industrial processes. The energy analysis represents an essential part of energy management systems (EMS). Generally, process control systems (PCS) can support an EMS: they provide information about the production process, and they organize maintenance actions. Combining these tools into an integrated process allows the development of an energy-critical-equipment strategy. Thus, asset and energy management can use the same common data to improve energy efficiency.

Keywords: crossover technologies, data management, energy analysis, energy efficiency, process control

Procedia PDF Downloads 210
16093 Fault Detection and Isolation in Sensors and Actuators of Wind Turbines

Authors: Shahrokh Barati, Reza Ramezani

Abstract:

Due to countries' growing attention to renewable energy production, the demand for energy from renewable sources has gone up; among renewable energy sources, wind energy has shown the fastest growth in recent years. In this regard, in order to increase the availability of wind turbines, the use of a Fault Detection and Isolation (FDI) system is necessary. Wind turbines are subject to various faults, such as sensor faults, actuator faults, network connection faults, mechanical faults, and faults in the generator subsystem. Although sensors and actuators account for a large number of faults in wind turbines, they have been discussed less in the literature. Therefore, in this work, we focus our attention on designing a sensor and actuator fault detection and isolation algorithm and fault-tolerant control systems (FTCS) for wind turbines. The aim of this research is to propose a comprehensive fault detection and isolation system for the sensors and actuators of a wind turbine based on data-driven approaches. To achieve this goal, features of the measurable signals of a real wind turbine are extracted under all conditions. The next step is feature selection among the extracted features. Features are selected that lead to maximum separation; the resulting networks are implemented in parallel, and the results of the classifiers are fused together. In order to maximize the reliability of the decision on a fault, the property of fault repeatability is used.
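
The final fusion step, parallel classifiers whose outputs are combined for a more reliable fault decision, can be sketched with generic scikit-learn models and soft voting, as below; the synthetic data and the particular classifiers are assumptions, not the turbine signal features or networks of the study.

```python
# Hedged sketch: fusing parallel classifiers for a fault decision on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# 0 = healthy, 1 = sensor fault, 2 = actuator fault (assumed class layout)
X, y = make_classification(n_samples=2000, n_features=15, n_informative=8,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

fused = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=0)),
                ("lr", LogisticRegression(max_iter=1000)),
                ("svm", SVC(probability=True, random_state=0))],
    voting="soft",                                              # fuse per-class probabilities
)
fused.fit(X_tr, y_tr)
print("fused accuracy:", round(fused.score(X_te, y_te), 3))
```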

Keywords: FDI, wind turbines, sensors and actuators faults, renewable energy

Procedia PDF Downloads 400
16092 Comparison of Yb and Tm-Fiber Laser Cutting Processes of Fiber Reinforced Plastics

Authors: Oktay Celenk, Ugur Karanfil, Iskender Demir, Samir Lamrini, Jorg Neumann, Arif Demir

Abstract:

Due to their favourable material characteristics, fiber-reinforced plastics are among the main topics of all current lightweight construction megatrends. Especially in transportation trends ranging from aeronautics over the automotive industry to naval transportation (yachts, cruise liners), the expected economic and environmental impact is huge. In naval transportation, components like yacht bodies, antenna masts, and decorative structures like deck lamps, light houses and pool areas represent cheap and robust solutions. Commercially available laser tools like carbon dioxide gas (CO₂) lasers, frequency-tripled solid-state UV lasers, and Neodymium-YAG (Nd:YAG) lasers can be used. These tools have emission wavelengths of 10 µm, 0.355 µm, and 1.064 µm, respectively. The scientific goal is first of all the generation of a parameter matrix for laser processing of each material used with a Tm-fiber laser system (wavelength 2 µm). These parameters are the heat affected zone, process gas pressure, work piece feed velocity, intensity, irradiation time, etc. The results are compared with results obtained with well-known material processing lasers, such as Yb-fiber lasers (wavelength 1 µm). Compared to the CO₂ laser, the Tm laser offers essential advantages for future laser processes like cutting, welding, ablating for repair, and drilling in composite part manufacturing (components of cruise liners, marine pipelines). Some of these are the possibility of beam delivery in a standard fused silica fiber, which enables hand-guided processing; eye safety, which results from the wavelength; and excellent beam quality and brilliance due to the fiber nature. One more feature that is economically important for boat, automotive and military manufacturing projects is that the wavelength of 2 µm is highly absorbed by the plastic matrix, which enables selective removal of the matrix for repair procedures.

Keywords: Thulium (Tm) fiber laser, laser processing of fiber-reinforced plastics (FRP), composite, heat affected zone

Procedia PDF Downloads 193
16091 Canned Sealless Pumps for Hazardous Applications

Authors: Shuja Alharbi

Abstract:

The oil and gas industry has many applications considered toxic or hazardous, where process fluid leakage is not permitted because it leads to health, safety, and environmental impacts. Caustic/acidic applications, high benzene concentrations, hydrogen-sulfide-rich oil/gas, as well as liquids operating above their auto-ignition temperatures, are examples of such liquids that pose a risk to industry operations, and for those, special arrangements are in place to allow for a safe operating environment. Pumps in the industry require special attention, specifically at the interface between the fluid and the environment, where the potential for leakages is foreseen. Mechanical seals are used to contain the fluid within the equipment, but the prices of such seals are ever increasing, along with their maintenance, design, and operating requirements. Several alternatives to seals are being employed nowadays, such as sealless systems, which are hermetically sealed from the atmosphere and do not require sealing. This technology is considered relatively new and requires more studies to understand the limitations and factors involved from an owner and design perspective. Financial factors, maintenance factors, and design limitations should be studied further in order to have a mature and reliable technical solution available to end users.

Keywords: pump, sealless, selection, failure

Procedia PDF Downloads 100
16090 A Review of Run-to-Run (R2R) Control in Manufacturing Processes

Authors: Khalil Aghapouramin, Mostafa Ranjbar

Abstract:

Run-to-Run (R2R) control was developed in order to monitor and control different semiconductor manufacturing processes based upon fundamental engineering frameworks. This technology allows rectification in the optimum direction. This type of control has always had significant potency, which has appeared in a variety of processes. The term run-to-run refers to the case where control actions are taken between runs, with the aim of improving the batches of silicon wafers produced in a manufacturing process. The present work gives a brief review of run-to-run control, which is mainly effective in manufacturing processes.
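
A concrete way to see what run-to-run control does is the classic EWMA controller, in which the recipe for the next run is re-solved after each run from an exponentially weighted estimate of the process offset. The sketch below uses assumed gains, drift, and noise; it illustrates the textbook R2R scheme rather than any specific implementation reviewed here.

```python
# Hedged sketch of an EWMA run-to-run controller (assumed gains and disturbance).
import numpy as np

target = 100.0               # desired output, e.g. film thickness (nm)
beta_true = 1.8              # true but unknown process gain
drift = -0.4                 # per-run drift disturbance
beta_model, lam = 2.0, 0.4   # controller's gain estimate and EWMA weight

intercept_est = 0.0          # EWMA estimate of the process offset
recipe = target / beta_model
rng = np.random.default_rng(0)

for run in range(1, 11):
    # process: output depends on the recipe, a drifting offset, and measurement noise
    output = beta_true * recipe + drift * run + rng.normal(scale=0.5)
    # run-to-run update: re-estimate the offset, then re-solve the recipe for the next run
    intercept_est = lam * (output - beta_model * recipe) + (1 - lam) * intercept_est
    recipe = (target - intercept_est) / beta_model
    print(f"run {run:2d}: output {output:6.2f}, next recipe {recipe:6.2f}")
```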

Keywords: Run-to-Run (R2R) control, manufacturing, process in engineering, manufacturing controls

Procedia PDF Downloads 493
16089 Weighted Data Replication Strategy for Data Grid Considering Economic Approach

Authors: N. Mansouri, A. Asadi

Abstract:

A Data Grid is a geographically distributed environment that deals with data-intensive applications in scientific and enterprise computing. Data replication is a common method used to achieve efficient and fault-tolerant data access in Grids. In this paper, a dynamic data replication strategy, called Enhanced Latest Access Largest Weight (ELALW), is proposed. This strategy is an enhanced version of the Latest Access Largest Weight strategy. However, replication should be used wisely because the storage capacity of each Grid site is limited. Thus, it is important to design an effective strategy for the replica replacement task. ELALW replaces replicas based on the number of future requests, the size of the replica, and the number of copies of the file. It also improves access latency by selecting the best replica when various sites hold replicas. The proposed replica selection chooses the best replica location from among the many replicas based on response time, which can be determined by considering the data transfer time, the storage access latency, the replica requests waiting in the storage queue, and the distance between nodes. Simulation results using OptorSim show that our replication strategy achieves better performance overall than other strategies in terms of job execution time, effective network usage and storage resource usage.
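
The replica-selection step can be sketched as a simple scoring function over candidate sites, estimating a response time from transfer time, storage latency, queued requests, and distance, and picking the minimum. The weighting and the site data below are illustrative assumptions, not the ELALW parameters.

```python
# Hedged sketch of response-time-based replica selection (illustrative weights and sites).
from dataclasses import dataclass

@dataclass
class ReplicaSite:
    name: str
    bandwidth_mbps: float     # link bandwidth to the requesting node
    storage_latency_s: float  # storage access latency
    queued_requests: int      # requests waiting in the storage queue
    hops: int                 # network distance to the requester

def estimated_response_time(site: ReplicaSite, file_size_mb: float) -> float:
    transfer = file_size_mb * 8 / site.bandwidth_mbps        # data transfer time (s)
    queue_wait = site.queued_requests * site.storage_latency_s
    return transfer + site.storage_latency_s + queue_wait + 0.01 * site.hops

sites = [ReplicaSite("A", 100, 0.02, 5, 3),
         ReplicaSite("B", 40, 0.01, 0, 1),
         ReplicaSite("C", 200, 0.05, 20, 6)]
best = min(sites, key=lambda s: estimated_response_time(s, file_size_mb=500))
print("best replica location:", best.name)
```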

Keywords: data grid, data replication, simulation, replica selection, replica placement

Procedia PDF Downloads 260
16088 The Impact of Access to Microcredit Programme on Women Empowerment: A Case Study of Cowries Microfinance Bank in Lagos State, Nigeria

Authors: Adijat Olubukola Olateju

Abstract:

Women empowerment is an essential developmental tool in every economy, especially in less developed countries, as it helps to enhance women's socio-economic well-being. Some empirical evidence has shown that microcredit has been an effective tool in enhancing women empowerment, especially in developing countries. This paper, therefore, investigates the impact of a microcredit programme on women empowerment in Lagos State, Nigeria. The study used Cowries Microfinance Bank (CMB) as a case study bank, and a total of 359 women entrepreneurs were selected by simple random sampling from the list of Cowries Microfinance Bank. Selection bias, which could arise from non-random selection of participants or non-random placement of the programme, was adjusted for by dividing the data into participant women entrepreneurs and non-participant women entrepreneurs. The data were analyzed with a Propensity Score Matching (PSM) technique. The Average Treatment Effect on the Treated (ATT) obtained from the PSM indicates that the credit programme has a significant effect on the empowerment of women in the study area. It is therefore recommended that microfinance banks be encouraged to give loans to women, and for the impact of the loans to be felt more strongly by the beneficiaries, the loan programme should be complemented with other programmes such as training, grants, and periodic monitoring.
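
The estimation strategy, logistic-regression propensity scores, nearest-neighbour matching, and an ATT computed over the matched pairs, can be sketched as follows on simulated data; the covariates and outcome are assumptions standing in for the Cowries Microfinance Bank survey variables.

```python
# Hedged sketch: propensity score matching and ATT on simulated data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 359
age = rng.normal(40, 8, n)
education = rng.integers(0, 4, n).astype(float)
# participation is more likely for more educated women: the source of selection bias
participate = (rng.random(n) < 1 / (1 + np.exp(-(education - 1.5)))).astype(int)
income = 2.0 * education + 0.05 * age + 1.5 * participate + rng.normal(0, 1, n)

X = np.column_stack([age, education])
pscore = LogisticRegression().fit(X, participate).predict_proba(X)[:, 1]

treated = np.where(participate == 1)[0]
control = np.where(participate == 0)[0]
# match each participant to the non-participant with the closest propensity score
matches = control[np.argmin(np.abs(pscore[treated][:, None] - pscore[control][None, :]), axis=1)]
att = (income[treated] - income[matches]).mean()
print(f"ATT (effect of participation on the treated): {att:.2f}")
```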

Keywords: empowerment, microcredit, socio-economic wellbeing, development

Procedia PDF Downloads 304
16087 Uncovering the Complex Structure of Building Design Process Based on Royal Institute of British Architects Plan of Work

Authors: Fawaz A. Binsarra, Halim Boussabaine

Abstract:

The notion of complexity science has been attracting the interest of researchers and professionals due to the need to enhance the efficiency of understanding the dynamics and interaction structure of complex systems. In addition, complexity analysis has been used as an approach to investigate complex systems that contain a large number of components interacting with each other to accomplish specific outcomes and produce emergent behavior. The design process is considered a complex activity that involves a large number of interacting components, which are grouped as design tasks, the design team, and the components of the design process. These three main aspects of the building design process consist of several components that interact with each other as a dynamic system with complex information flow. In this paper, the goal is to uncover the complex structure of information interactions in the building design process. The Royal Institute of British Architects Plan of Work 2013 is investigated as a case study: modelling its information interactions with network analysis software to uncover the structure and complexity of the building design process will significantly enhance the efficiency of the building design process outcomes.
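
The network-analysis step can be illustrated by encoding a handful of Plan of Work information flows as a directed graph and computing simple structural metrics, as in the sketch below; the edges shown are assumed examples, not the full RIBA 2013 interaction set analysed in the paper.

```python
# Hedged sketch: a few assumed information flows modelled as a directed graph.
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("Stage 1: Preparation and Brief", "Stage 2: Concept Design"),
    ("Stage 2: Concept Design", "Stage 3: Developed Design"),
    ("Stage 3: Developed Design", "Stage 4: Technical Design"),
    ("Client", "Stage 1: Preparation and Brief"),
    ("Design Team", "Stage 2: Concept Design"),
    ("Design Team", "Stage 3: Developed Design"),
    ("Stage 3: Developed Design", "Stage 2: Concept Design"),   # iteration / feedback loop
])

print("nodes:", G.number_of_nodes(), " information flows:", G.number_of_edges())
print("degree centrality:", {n: round(c, 2) for n, c in nx.degree_centrality(G).items()})
print("feedback cycles:", list(nx.simple_cycles(G)))
```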

Keywords: complexity, process, building design, RIBA, design complexity, network, network analysis

Procedia PDF Downloads 527
16086 The Process of Crisis: Model of Its Development in the Organization

Authors: M. Mikušová

Abstract:

The main aim of this paper is to present a clear and comprehensive picture of the process of a crisis in the organization, which will help to better understand its possible developments. For a description of the sequence of individual steps and an indication of their causation and possible variants of development, a detailed flow diagram with verbal commentary is applied. For simplicity, the process of the crisis is observed in four basic phases: symptoms of the crisis, diagnosis, action and prevention. The model highlights the complexity of the phenomenon of the crisis and the fact that the various phases of the crisis are interwoven.

Keywords: crisis, management, model, organization

Procedia PDF Downloads 291
16085 Deterministic Random Number Generator Algorithm for Cryptosystem Keys

Authors: Adi A. Maaita, Hamza A. A. Al Sewadi

Abstract:

One of the crucial parameters of digital cryptographic systems is the selection of the keys used and their distribution. The randomness of the keys has a strong impact on the system's security strength, as random keys are difficult to predict, guess, reproduce or discover by a cryptanalyst. Therefore, adequate key randomness generation is still sought for the benefit of stronger cryptosystems. This paper suggests an algorithm designed to generate and test pseudo-random number sequences intended for cryptographic applications. The algorithm is based on mathematically manipulating publicly agreed-upon information shared between sender and receiver over a public channel. This information is used as a seed for performing some mathematical functions in order to generate a sequence of pseudo-random numbers that will be used for encryption/decryption purposes. The manipulation involves permutations and substitutions that fulfill Shannon's principle of "confusion and diffusion". ASCII code characters were utilized in the generation process instead of bit strings initially, which adds more flexibility in testing different seed values. Finally, the obtained results indicate that it would be soundly difficult for attackers to guess the keys.

Keywords: cryptosystems, information security agreement, key distribution, random numbers

Procedia PDF Downloads 268