Search results for: dimensional accuracy
2584 Framework for Socio-Technical Issues in Requirements Engineering for Developing Resilient Machine Vision Systems Using Levels of Automation through the Lifecycle
Authors: Ryan Messina, Mehedi Hasan
Abstract:
This research examines the impact of using data to generate performance requirements for automation in visual inspections using machine vision. These situations concern design, and how projects can smooth the transfer of tacit knowledge to an algorithm. We propose a framework for specifying machine vision systems. The framework utilizes varying levels of automation as contingency planning to reduce data processing complexity. Using data assists in extracting tacit knowledge from those who can perform the manual tasks, which helps in designing the system; this means that real data from the system is always referenced, minimizing errors between participating parties. We propose three indicators for judging whether the project has a high risk of failing to meet requirements related to accuracy and reliability. All systems tested achieved better integration into operations after applying the framework.
Keywords: automation, contingency planning, continuous engineering, control theory, machine vision, system requirements, system thinking
Procedia PDF Downloads 211
2583 Using Machine Learning to Predict Answers to Big-Five Personality Questions
Authors: Aadityaa Singla
Abstract:
The big five personality traits are as follows: openness, conscientiousness, extraversion, agreeableness, and neuroticism. Many people turn to these categories, each with its own meaning and characteristics, to gain insight into their personality. This information is important not only to individuals but also to career professionals and psychologists, who can use it for candidate assessment or job recruitment. The links between AI and psychology have been well studied in cognitive science, but this is still a rather novel development. It is possible for various AI classification models to accurately predict the answer to a personality question from ten input questions. This contrasts with the roughly one hundred questions people normally have to answer to gain a complete picture of their five personality traits. To approach this problem, various AI classification models were trained on a dataset to predict what a user might answer, and each model's prediction was compared to the actual response. Normally there are five answer choices (a 20% chance of a correct guess), and the models exceed that value to different degrees, demonstrating their significance. The MLP classifier, decision tree, linear model, and K-nearest neighbors obtained test accuracies of 86.643%, 54.625%, 47.875%, and 52.125%, respectively. These results show that there is potential for more nuanced personality predictions in the future.
Keywords: machine learning, personality, big five personality traits, cognitive science
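A minimal sketch of the classifier comparison described in the abstract, assuming a scikit-learn workflow; the CSV file, column names and train/test split below are hypothetical stand-ins for the study's data.

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical dataset: ten answered questions as inputs, one held-out question as the target.
data = pd.read_csv("big_five_responses.csv")
X = data.drop(columns=["target_question"])
y = data["target_question"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "MLP": MLPClassifier(max_iter=1000, random_state=0),
    "Decision tree": DecisionTreeClassifier(random_state=0),
    "Linear (logistic)": LogisticRegression(max_iter=1000),
    "K-nearest neighbors": KNeighborsClassifier(n_neighbors=5),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: test accuracy = {model.score(X_test, y_test):.3f}")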
Procedia PDF Downloads 149
2582 Geophysical Methods and Machine Learning Algorithms for Stuck Pipe Prediction and Avoidance
Authors: Ammar Alali, Mahmoud Abughaban
Abstract:
Cost reduction and drilling optimization is the goal of many drilling operators. Historically, stuck pipe incidents were a major segment of non-productive time (NPT) associated costs. Traditionally, stuck pipe problems are treated as part of operations and solved post-sticking. However, the real key to savings and success is in predicting stuck pipe incidents and avoiding the conditions leading to their occurrence. Previous attempts at stuck-pipe prediction have neglected the local geology of the problem. The proposed predictive tool utilizes geophysical data processing techniques and Machine Learning (ML) algorithms to predict drilling events in real time using surface drilling data with minimum computational power. The method combines two types of analysis: (1) real-time prediction and (2) cause analysis. Real-time prediction aggregates the input data, including historical drilling surface data, geological formation tops, and petrophysical data, from wells within the same field. The input data are then flattened per geological formation and stacked per stuck-pipe incident. The algorithm uses these two physical methods (stacking and flattening) to filter any noise in the signature and create a robust pre-determined pilot that adheres to the local geology. Once the drilling operation starts, the Wellsite Information Transfer Standard Markup Language (WITSML) live surface data are fed into a matrix and aggregated at a similar frequency as the pre-determined signature. Then, the matrix is correlated with the pre-determined stuck-pipe signature for the field, in real time. The correlation uses a machine learning Correlation-based Feature Selection (CFS) algorithm, which selects relevant features from the class and identifies redundant features. The correlation output is interpreted as a probability curve for real-time stuck pipe incident prediction. Once this probability passes a fixed threshold defined by the user, the other component, cause analysis, alerts the user of the expected incident based on the set of pre-determined signatures, and a set of recommendations is provided to reduce the associated risk. The validation process involved feeding historical drilling data from an onshore oil field as a live stream, mimicking actual drilling conditions. Pre-determined signatures were created beforehand for three problematic geological formations in this field. Three wells were processed as case studies, and the stuck-pipe incidents were predicted successfully, with an accuracy of 76%. This accuracy of detection could have resulted in around a 50% reduction in NPT, equivalent to a 9% cost saving in comparison with offset wells. Predicting the stuck pipe problem requires a method to capture geological, geophysical and drilling data and to recognize the indicators of this issue at the field and geological formation level. This paper illustrates the efficiency and robustness of the proposed cross-disciplinary approach in its ability to produce such signatures and predict this NPT event.
Keywords: drilling optimization, hazard prediction, machine learning, stuck pipe
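The following is an illustrative sketch of the real-time matching step only: a window of live surface data is correlated against a pre-determined stuck-pipe signature and an alert is raised when the output passes a user-defined threshold. The signature file, window handling and the use of a plain Pearson correlation (rather than the paper's CFS algorithm) are assumptions for illustration.

import numpy as np

signature = np.loadtxt("stuck_pipe_signature.csv", delimiter=",")  # hypothetical pre-determined pilot
THRESHOLD = 0.8                                                    # hypothetical user-defined threshold

def stuck_pipe_probability(live_window: np.ndarray) -> float:
    # Correlate a live data window (same length as the signature) with the signature
    # and use the positive correlation as a simple probability proxy.
    r = np.corrcoef(live_window, signature)[0, 1]
    return max(float(r), 0.0)

def check_alert(live_window: np.ndarray) -> bool:
    p = stuck_pipe_probability(live_window)
    if p > THRESHOLD:
        print(f"Stuck-pipe risk {p:.2f} exceeds threshold - review recommendations")
        return True
    return False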
Procedia PDF Downloads 237
2581 Filtering and Reconstruction System for Grey-Level Forensic Images
Authors: Ahd Aljarf, Saad Amin
Abstract:
Images are an important source of information used as evidence during any investigation process. Their clarity and accuracy are essential and of the utmost importance for any investigation. Images are vulnerable to losing blocks and having noise added to them, either after alteration or when the image was taken initially; therefore, a high-performance image processing system and its implementation are very important from a forensic point of view. This paper focuses on improving the quality of forensic images. For different reasons, packets that store data can be affected, harmed or even lost because of noise. For example, sending the image through a wireless channel can cause loss of bits. These types of errors generally degrade the visual display quality of forensic images. Two image problems are covered: noise and lost blocks. Information transmitted through any means of communication may suffer alteration from its original state or even lose important data due to channel noise. Therefore, a system is developed and introduced to improve the quality and clarity of forensic images.
Keywords: image filtering, image reconstruction, image processing, forensic images
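As an illustration of the two repairs discussed above (noise and lost blocks), the sketch below applies a median filter and OpenCV inpainting to a grey-level image; the paper's actual filtering and reconstruction algorithms are not specified here, and the file name and lost-block mask are assumptions.

import cv2
import numpy as np

img = cv2.imread("forensic_image.png", cv2.IMREAD_GRAYSCALE)   # hypothetical input image

# 1) Noise filtering: a small median kernel suppresses impulse (salt-and-pepper) noise.
denoised = cv2.medianBlur(img, 3)

# 2) Block-loss reconstruction: mark an assumed lost 16x16 block and inpaint it.
mask = np.zeros_like(denoised)
mask[64:80, 64:80] = 255
restored = cv2.inpaint(denoised, mask, 3, cv2.INPAINT_TELEA)

cv2.imwrite("forensic_image_restored.png", restored)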
Procedia PDF Downloads 368
2580 A Similar Image Retrieval System for Auroral All-Sky Images Based on Local Features and Color Filtering
Authors: Takanori Tanaka, Daisuke Kitao, Daisuke Ikeda
Abstract:
The aurora is an attractive phenomenon, but it is difficult to understand its whole mechanism. A data-intensive science approach might be effective for elucidating such a difficult phenomenon. To do that, we need labeled data showing when and what types of auroras have appeared. In this paper, we propose an image retrieval system for auroral all-sky images, some of which include discrete and diffuse auroras while others contain no aurora. The proposed system retrieves images that are similar to a query image using a popular image recognition method. Using 300 all-sky images obtained at Tromso, Norway, we evaluate two image recognition methods with and without our original color filtering method. The best performance is achieved when SIFT with the color filtering is used; its accuracy is 81.7% for discrete auroras and 86.7% for diffuse auroras.
Keywords: data-intensive science, image classification, content-based image retrieval, aurora
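A rough sketch of the retrieval idea, assuming OpenCV: a colour filter keeps aurora-like (green-dominant) pixels, SIFT descriptors are computed inside that mask, and candidate images are ranked by the number of good matches. The green-dominance rule and the 0.75 ratio-test value are assumptions, not the paper's settings.

import cv2
import numpy as np

sift = cv2.SIFT_create()
matcher = cv2.BFMatcher()

def filtered_descriptors(path):
    img = cv2.imread(path)
    b, g, r = cv2.split(img)
    # Assumed colour filter: keep pixels where green dominates the other channels.
    mask = ((g.astype(int) > r.astype(int)) & (g.astype(int) > b.astype(int))).astype(np.uint8) * 255
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    _, descriptors = sift.detectAndCompute(gray, mask)
    return descriptors

def similarity(query_path, candidate_path):
    des_q, des_c = filtered_descriptors(query_path), filtered_descriptors(candidate_path)
    if des_q is None or des_c is None:
        return 0
    matches = matcher.knnMatch(des_q, des_c, k=2)
    good = [m for m, n in matches if m.distance < 0.75 * n.distance]   # Lowe's ratio test
    return len(good)   # higher means more similar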
Procedia PDF Downloads 452
2579 Determination of Water Pollution and Water Quality with Decision Trees
Authors: Çiğdem Bakır, Mecit Yüzkat
Abstract:
With the increasing emphasis on water quality worldwide, the search for new and intelligent monitoring systems, and the market for them, has grown. The current method is a laboratory process, in which samples are taken from bodies of water and tests are carried out in laboratories. This method is time-consuming, wasteful of manpower, and uneconomical. To solve this problem, we used machine learning methods to detect water pollution in our study. We created decision trees with the Orange3 software and tried to determine all the factors that cause water pollution. An automatic prediction model based on water quality was developed with machine learning methods, taking many model inputs such as water temperature, pH, transparency, conductivity, dissolved oxygen, and ammonia nitrogen. The proposed approach consists of three stages: preprocessing of the data, feature detection, and classification. We evaluated the study with different accuracy metrics and present the results comparatively. In addition, we achieved approximately 98% success with the decision tree.
Keywords: decision tree, water quality, water pollution, machine learning
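A minimal sketch of the classification stage with a scikit-learn decision tree (the study itself used Orange3); the CSV path, column names and tree depth are hypothetical.

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

df = pd.read_csv("water_samples.csv")   # hypothetical preprocessed measurements
features = ["temperature", "pH", "transparency", "conductivity",
            "dissolved_oxygen", "ammonia_nitrogen"]
X, y = df[features], df["quality_class"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)
tree = DecisionTreeClassifier(max_depth=5, random_state=42).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, tree.predict(X_test)))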
Procedia PDF Downloads 87
2578 Utility Assessment Model for Wireless Technology in Construction
Authors: Yassir AbdelRazig, Amine Ghanem
Abstract:
Construction projects are information intensive in nature and involve many interrelated activities. Wireless technologies can be used to improve the accuracy and timeliness of data collected from construction sites and share it with the appropriate parties. Nonetheless, the construction industry tends to be conservative and hesitant to adopt new technologies. A main concern for owners, contractors or anyone in charge of a job site is the cost of the technology in question. Wireless technologies are not cheap: there are many expenses to take into consideration, and a study should be completed to make sure that the benefits and savings resulting from the use of the technology are worth the expense. This research attempts to assess the effectiveness of using appropriate wireless technologies based on criteria such as performance, reliability, and risk. The assessment is based on a utility function model that breaks the selection issue down into alternative attributes. The attributes are then assigned weights and single attributes are measured. Finally, the single attributes are combined to develop one aggregate utility index for each alternative.
Keywords: analytic hierarchy process, decision theory, utility function, wireless technologies
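A minimal sketch of the final aggregation step, assuming a weighted additive utility function; the attribute names, weights (for example, as they might come from an AHP pairwise comparison) and single-attribute scores are illustrative only.

# Assumed attribute weights summing to 1.
WEIGHTS = {"performance": 0.5, "reliability": 0.3, "risk": 0.2}

def aggregate_utility(scores):
    # scores: single-attribute utilities already scaled to [0, 1].
    return sum(WEIGHTS[attr] * scores[attr] for attr in WEIGHTS)

alternatives = {
    "RFID":      {"performance": 0.8, "reliability": 0.7, "risk": 0.6},
    "Wi-Fi tag": {"performance": 0.6, "reliability": 0.8, "risk": 0.7},
}
for name, scores in alternatives.items():
    print(name, round(aggregate_utility(scores), 3))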
Procedia PDF Downloads 345
2577 An Exploration of the Dimensions of Place-Making: A South African Case Study
Authors: W. J. Strydom, K. Puren
Abstract:
Place-making is viewed here as an empowering process in which people represent, improve and maintain their spatial (natural or built) environment. With this in mind, place-making is multi-dimensional and includes a spatial dimension (the visual properties of the end product/plan), a procedural dimension (the negotiation/discussion of ideas with all relevant stakeholders regarding the end product/plan) and a psychological dimension (the inclusion of intrinsic values and meanings related to a place in the end product/plan). These three represent the dimensions of place-making. The purpose of this paper is to explore these dimensions in a case study of a local community in Ikageng, Potchefstroom, North-West Province, South Africa. The case study represents an inclusive process that strives to empower a local community that was forcefully relocated under Apartheid legislation in South Africa, and it focused on the inclusion of participants in decision-making regarding their daily environment. By means of focus group discussions and a collaborative design workshop, data is generated that ultimately links to the theoretical dimensions of place-making. This paper contributes to the field of spatial planning through its exploration of the dimensions of place-making and the relevance of this process to spatial planning, especially in a South African setting.
Keywords: community engagement, place-making, planning theory, spatial planning
Procedia PDF Downloads 400
2576 Kinetics of Growth Rate of Microalga: The Effect of Carbon Dioxide Concentration
Authors: Retno Ambarwati Sigit Lestari
Abstract:
Microalga is one of the organisms that can be considered ideal and potential raw material for bioenergy production, because the lipid content of microalga is relatively high. Microalga is an aquatic organism that produces complex organic compounds from inorganic molecules, using carbon dioxide as a carbon source and sunlight for energy supply. Microalga-CO₂ fixation has potential advantages over other carbon capture and storage approaches, such as wide distribution, high photosynthetic rate, good environmental adaptability, and ease of operation. The rates of growth and CO₂ capture of microalga are influenced by CO₂ concentration and light intensity. This study quantitatively investigates the effects of CO₂ concentration on the rates of growth and CO₂ capture of a type of microalga cultivated in bioreactors. The work includes laboratory experiments as well as mathematical modelling. The mathematical models were solved numerically, and the accuracy of the model was tested against the experimental data. The proposed mathematical model quantitatively describes the growth and CO₂ capture of microalga well, and the effects of CO₂ concentration can be observed from it.
Keywords: microalga, CO₂ concentration, photobioreactor, mathematical model
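One plausible form of such a model is sketched below under the assumption of a Monod-type dependence of the specific growth rate on CO₂ concentration, integrated numerically with SciPy; the parameter values and the model form itself are illustrative, not the fitted results of the study.

import numpy as np
from scipy.integrate import solve_ivp

MU_MAX = 0.05   # 1/h, assumed maximum specific growth rate
K_C    = 0.5    # g/L, assumed half-saturation constant for dissolved CO2

def growth(t, X, c_co2):
    # dX/dt = mu(c_co2) * X with a Monod-type specific growth rate.
    mu = MU_MAX * c_co2 / (K_C + c_co2)
    return mu * X

sol = solve_ivp(growth, (0.0, 120.0), [0.1], args=(1.0,))   # 120 h from X0 = 0.1 g/L at 1.0 g/L CO2
print("biomass after 120 h:", float(sol.y[0, -1]))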
Procedia PDF Downloads 130
2575 Correlation between Dynamic Knee Valgus with Isometric Hip External Rotators Strength during Single Leg Landing
Authors: Ahmed Fawzy, Khaled Ayad, Gh. M. Koura, W. Reda
Abstract:
Excessive frontal plane motion of the lower extremity during sports activities is thought to be a contributing factor to many traumatic and overuse injuries of the knee joint, yet little is known about the biomechanical factors that contribute to this loading pattern. Objectives: The purpose of this study was to investigate whether there is a relationship between isometric hip external rotator strength and the value of the frontal plane projection angle (FPPA) during single leg landing tasks in normal male subjects. Methods: One hundred male subjects, free from lower extremity injuries for at least the preceding six months, participated in this study. Their mean age was 23.25 ± 2.88 years, mean weight was 74.76 ± 13.54 kg, and mean height was 174.23 ± 6.56 cm. The knee frontal plane projection angle was measured with a digital video camera during a single leg landing task. Isometric hip external rotator strength was assessed with a portable hand-held dynamometer. Muscle strength was normalized to body weight to obtain more accurate measurements. Results: There was no significant relationship between isometric hip external rotator strength and the value of the FPPA during single leg landing tasks in normal male subjects. Conclusion: There is no relationship between isometric hip external rotator strength and the value of the FPPA during functional activities in normal male subjects.
Keywords: 2-dimensional motion analysis, hip strength, kinematics, knee injuries
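A small sketch of the association test implied above: a Pearson correlation between normalised isometric hip external rotator strength and the FPPA. The arrays are placeholders, not the study's measurements.

import numpy as np
from scipy.stats import pearsonr

hip_strength = np.array([0.21, 0.18, 0.25, 0.19, 0.23])   # strength / body weight (placeholder values)
fppa_deg     = np.array([8.5, 12.0, 7.2, 10.4, 9.1])      # frontal plane projection angle in degrees (placeholder)

r, p = pearsonr(hip_strength, fppa_deg)
print(f"r = {r:.2f}, p = {p:.3f}")   # the study found no significant relationship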
Procedia PDF Downloads 229
2574 Cooling-Rate Induced Fiber Birefringence Variation in Regenerated High Birefringent Fiber
Authors: Man-Hong Lai, Dinusha S. Gunawardena, Kok-Sing Lim, Harith Ahmad
Abstract:
In this paper, we report birefringence manipulation in a regenerated high-birefringent fiber Bragg grating (RPMG) using a CO2 laser annealing method. The results indicate that the birefringence of the RPMG remains unchanged after CO2 laser annealing followed by a slow cooling process, but is reduced after a fast cooling process (~5.6×10⁻⁵). After a series of annealing procedures with different cooling rates, the results show that the slower the cooling rate, the higher the birefringence of the RPMG. Changes in the volume, thermal expansion coefficient (TEC) and glass transition temperature (Tg) of the stress-applying part of the RPMG during the cooling process are responsible for the birefringence change. These findings are therefore important for RPMG sensors in high-temperature and dynamic-temperature environments, since the measuring accuracy, range and sensitivity of an RPMG sensor are greatly affected by its birefringence value. This work also opens up a new application of the CO2 laser for fiber annealing and birefringence modification.
Keywords: birefringence, CO2 laser annealing, regenerated gratings, thermal stress
Procedia PDF Downloads 463
2573 Healthcare Service Quality in Indian Context
Authors: Ganesh Nivrutti Akhade
Abstract:
This paper attempts to develop a reliable and valid instrument for measuring healthcare service quality in India and analyses the impact of respondents' demographic factors on healthcare service quality. An extant literature survey, discussions with stakeholders of the healthcare system (patients, patients' relatives, hospital and clinic administrators, and professionals) and expert interviews were used to develop the attributes of the healthcare service quality dimensions. A pilot study was conducted with a sample of 31 healthcare patients of private sector, public sector and trust hospitals, primary health care centers and clinics, surveyed in the Nagpur Metropolitan Area. This yielded fifteen dimensions: reliability, assurance, responsiveness, tangibility, empathy, affordability, respect and caring, attitude of staff, technical competence, appropriateness, safety, continuity, effectiveness, availability, and financial support. This fifteen-dimensional model was validated through content validity and construct validity, and the proposed research model shows acceptable fit indices. The impact of these dimensions on overall healthcare service quality and customer satisfaction is analyzed using multiple regression. Findings indicate that all dimensions have a significant impact on perceptions of overall healthcare service quality and on customer satisfaction; however, the availability and effectiveness dimensions carry the maximum impact on overall healthcare service quality.
Keywords: healthcare, service quality, factor analysis (CFA), India, service quality dimensions
Procedia PDF Downloads 279
2572 Demystifying the Power of Machine Learning in Detecting Alzheimer’s Disease through Speech Analysis: A Systematic Review
Authors: Dalia Elleuch
Abstract:
The use of machine learning in the field of healthcare has gained tremendous momentum in recent years, with the potential to revolutionize the way diseases are diagnosed and treated. In particular, machine learning for the detection of degenerative diseases through language performance analysis has shown great promise and has been the subject of a growing body of research. As Alzheimer’s Disease (AD) is among the most prevalent neurodegenerative diseases, this review investigates the effectiveness of machine learning combined with speech analysis techniques in analyzing linguistic data from patients with AD, with the goal of detecting early signs of the disease. A corpus comprising seven comparative studies with a total of 1054 patients was analyzed. The findings reveal a high degree of accuracy, ranging between 83.32% and 97.18%. This systematic review sheds light on the potential of speech analysis and machine learning in the detection of AD, highlighting the need for further development and integration into clinical practice for improved patient outcomes.
Keywords: machine learning, detection, neurodegenerative diseases, Alzheimer’s disease, speech analysis
Procedia PDF Downloads 7
2571 Counterfeit Drugs Prevention in Pharmaceutical Industry with RFID: A Framework Based On Literature Review
Authors: Zeeshan Hamid, Asher Ramish
Abstract:
The purpose of this paper is to focus on the security and safety issues faced by the pharmaceutical industry globally when counterfeit drugs are in question; hence, there is an intense need to secure and authenticate pharmaceutical products in the emerging counterfeit product market. This paper elaborates on the application of radio frequency identification (RFID) in the pharmaceutical industry and identifies its key benefits for patient care. The benefits are: helping to coordinate the stream of supplies, accuracy in supply chains, maintaining trustworthy information, managing operations in an appropriate and timely manner, and finally delivering the genuine drug to the patient. It is discussed how RFID-supported supply chain information sharing (SCIS) helps to combat counterfeit drugs, along with a solution for how to tag pharmaceutical products, since some products hinder RFID implementation in this industry. In this paper, a model for pharmaceutical industry distribution is proposed to combat counterfeit drugs while they are in the supply chain.
Keywords: supply chain, RFID, pharmaceutical industry, counterfeit drugs, patient care
Procedia PDF Downloads 316
2570 Transport Related Air Pollution Modeling Using Artificial Neural Network
Authors: K. D. Sharma, M. Parida, S. S. Jain, Anju Saini, V. K. Katiyar
Abstract:
Air quality models form one of the most important components of an urban air quality management plan. Various statistical modeling techniques (regression, multiple regression and time series analysis) have been used to predict air pollution concentrations in the urban environment. These models calculate pollution concentrations from observed traffic, meteorological and pollution data after an appropriate relationship has been obtained empirically between these parameters. The artificial neural network (ANN) is increasingly used as an alternative tool for modeling pollutants from vehicular traffic, particularly in urban areas. In the present paper, an attempt has been made to model traffic air pollution, specifically CO concentration, using neural networks. Two scenarios were considered for CO concentration: first, with only classified traffic volume as input, and second, with both classified traffic volume and meteorological variables. The results showed that CO concentration can be predicted with good accuracy using an artificial neural network (ANN).
Keywords: air quality management, artificial neural network, meteorological variables, statistical modeling
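A sketch of the second scenario (classified traffic volume plus meteorological variables as inputs) with a scikit-learn MLP regressor; the file and column names are assumptions about how such monitoring data might be organised.

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor

df = pd.read_csv("co_observations.csv")                 # hypothetical monitoring data
X = df[["cars", "buses", "trucks", "two_wheelers",      # classified traffic volume
        "wind_speed", "temperature", "humidity"]]       # meteorological variables
y = df["co_concentration"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=1)
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=1))
ann.fit(X_train, y_train)
print("R^2 on test data:", ann.score(X_test, y_test))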
Procedia PDF Downloads 529
2569 Discrete Element Modeling of the Effect of Particle Shape on Creep Behavior of Rockfills
Authors: Yunjia Wang, Zhihong Zhao, Erxiang Song
Abstract:
Rockfills are widely used in civil engineering, for example in dams, railways, and airport foundations in mountain areas. Significant long-term post-construction settlement may affect the serviceability or even the safety of rockfill infrastructure. The creep behavior of rockfills is influenced by a number of factors, such as particle size, strength and shape, water condition and stress level. However, the effect of particle shape on rockfill creep remains poorly understood and deserves careful investigation. The particle-based discrete element method (DEM) was used to simulate the creep behavior of rockfills under different boundary conditions. Both angular and rounded particles were considered in this numerical study in order to investigate the influence of particle shape. The preliminary results showed that angular particles experience more breakage and larger creep strains under one-dimensional compression than rounded particles. On the contrary, larger creep strains were observed in the rounded specimens in the direct shear test. The mechanism responsible for this difference is that the possibility of key particles existing is higher among rounded particles than among angular particles. These simulations demonstrate that the influence of particle shape on the creep behavior of rockfills can be properly simulated by DEM. DEM simulation may thus facilitate our understanding of the deformation properties of rockfill materials.
Keywords: rockfills, creep behavior, particle crushing, discrete element method, boundary conditions
Procedia PDF Downloads 314
2568 3D Point Cloud Model Color Adjustment by Combining Terrestrial Laser Scanner and Close Range Photogrammetry Datasets
Authors: M. Pepe, S. Ackermann, L. Fregonese, C. Achille
Abstract:
3D models obtained with advanced survey techniques such as close-range photogrammetry and laser scanning are nowadays particularly appreciated in the Cultural Heritage and Archaeology fields. In order to produce high-quality models representing archaeological evidence and anthropological artifacts, the appearance of the model (i.e., its color), beyond the geometric accuracy, is not a negligible aspect. The integration of close-range photogrammetry survey techniques with the laser scanner is still a topic of study and research. Combining point cloud data sets of the same object generated with both technologies, or with the same technology but registered at different moments and/or under different natural light conditions, can produce a final point cloud with accentuated color dissimilarities. In this paper, a methodology to unify the different data sets, improve the chromatic quality and highlight further details by balancing the point colors is presented.
Keywords: color models, cultural heritage, laser scanner, photogrammetry
Procedia PDF Downloads 282
2567 Deployed Confidence: The Testing in Production
Authors: Shreya Asthana
Abstract:
Testers only know that the feature they tested on staging is working perfectly in production after the release has gone live. Sometimes something breaks in production, and testers learn about it through a bug raised by an end user. Panic sets in when staging test results do not reflect current production behavior, and testers start doubting their skills when a user finally reports a bug. Testers can deploy their confidence on release day by testing in production. Once testing in production is adopted, test results become more accurate because tests run on real-time data, and execution is a little faster than on staging due to the elimination of bad data. Feature flagging, canary releases, and data cleanup can help achieve this testing technique. This paper makes it easier to understand the steps needed to achieve production testing before making a feature live and to modify an IT company's testing procedure so that testers can provide a bug-free experience to end users. This study is beneficial because too many people think that testing should be done only on staging and not in production, and it is high time to move from that old mindset into a new testing world. At the end of the day, all that matters is whether the features work in production or not.
Keywords: bug free production, new testing mindset, testing strategy, testing approach
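An illustrative sketch of one of the techniques named above, feature flagging: the new feature is exposed only to an internal test account until a production smoke test passes. The flag source, account and endpoint are hypothetical.

import os
import requests

TEST_ACCOUNTS = {"qa-internal@example.com"}
FLAG_ENABLED = os.getenv("NEW_CHECKOUT_FLAG", "off") == "on"   # assumed flag source

def feature_enabled_for(user_email):
    # Canary-style rollout: internal testers first, everyone else once the flag is switched on.
    return FLAG_ENABLED or user_email in TEST_ACCOUNTS

def smoke_test_in_production():
    # Exercise the hidden feature with the test account and fail loudly if it is unhealthy.
    resp = requests.get("https://example.com/api/new-checkout/health",   # hypothetical endpoint
                        headers={"X-Test-User": "qa-internal@example.com"}, timeout=10)
    assert resp.status_code == 200, "new feature unhealthy in production"

if __name__ == "__main__":
    smoke_test_in_production()
    print("production smoke test passed")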
Procedia PDF Downloads 80
2566 Evaluation of Alternative Approaches for Additional Damping in Dynamic Calculations of Railway Bridges under High-Speed Traffic
Authors: Lara Bettinelli, Bernhard Glatz, Josef Fink
Abstract:
Planning engineers and researchers use various calculation models with different levels of complexity, calculation efficiency and accuracy in dynamic calculations of railway bridges under high-speed traffic. When choosing a vehicle model to depict the dynamic loading on the bridge structure caused by passing high-speed trains, different goals are pursued: On the one hand, the selected vehicle models should allow the calculation of a bridge’s vibrations as realistic as possible. On the other hand, the computational efficiency and manageability of the models should be preferably high to enable a wide range of applications. The commonly adopted and straightforward vehicle model is the moving load model (MLM), which simplifies the train to a sequence of static axle loads moving at a constant speed over the structure. However, the MLM can significantly overestimate the structure vibrations, especially when resonance events occur. More complex vehicle models, which depict the train as a system of oscillating and coupled masses, can reproduce the interaction dynamics between the vehicle and the bridge superstructure to some extent and enable the calculation of more realistic bridge accelerations. At the same time, such multi-body models require significantly greater processing capacities and precise knowledge of various vehicle properties. The European standards allow for applying the so-called additional damping method when simple load models, such as the MLM, are used in dynamic calculations. An additional damping factor depending on the bridge span, which should take into account the vibration-reducing benefits of the vehicle-bridge interaction, is assigned to the supporting structure in the calculations. However, numerous studies show that when the current standard specifications are applied, the calculation results for the bridge accelerations are in many cases still too high compared to the measured bridge accelerations, while in other cases, they are not on the safe side. A proposal to calculate the additional damping based on extensive dynamic calculations for a parametric field of simply supported bridges with a ballasted track was developed to address this issue. In this contribution, several different approaches to determine the additional damping of the supporting structure considering the vehicle-bridge interaction when using the MLM are compared with one another. Besides the standard specifications, this includes the approach mentioned above and two additional recently published alternative formulations derived from analytical approaches. For a bridge catalogue of 65 existing bridges in Austria in steel, concrete or composite construction, calculations are carried out with the MLM for two different high-speed trains and the different approaches for additional damping. The results are compared with the calculation results obtained by applying a more sophisticated multi-body model of the trains used. The evaluation and comparison of the results allow assessing the benefits of different calculation concepts for the additional damping regarding their accuracy and possible applications. 
The evaluation shows that by applying one of the recently published redesigned additional damping methods, the calculation results can reflect the influence of the vehicle-bridge interaction on the design-relevant structural accelerations considerably more reliably than by using the normative specifications.
Keywords: additional damping method, bridge dynamics, high-speed railway traffic, vehicle-bridge interaction
Procedia PDF Downloads 162
2565 Influence of Annealing on the Mechanical Properties of Polyester-Cotton Friction Spun Yarn
Authors: Sujit Kumar Sinha, R. Chattopadhyay
Abstract:
In the course of processing and use, fibres, yarns, and fabrics are subjected to a variety of stresses and strains, which cause the development of internal stresses. Given an opportunity, these inherent stresses try to bring the structure back to its original state. As an example, a twisted yarn always shows a tendency to untwist whenever one end is made free. If the yarn is not held under tension, it may form snarls due to the presence of excessive torque. The running performance of such a yarn or thread may therefore be negatively affected, as a snarl may not pass through the knitting or sewing needle smoothly, leading to an end break. A fabric shows a tendency to form wrinkles whenever squeezed, and it may also shrink when brought to a relaxed state. In order to improve performance (i.e., dimensional stability or appearance), stabilization of the structure is needed. Stabilization can be attained through the release of internal stresses, which can be brought about by annealing and/or other finishing treatments. When a fabric is subjected to heat, a change in the properties of the fibres, yarns, and fabric is expected. The degree to which the properties are affected depends on the conditions of the heat treatment and on the properties and structure of the fibres, yarns, and fabric. In the present study, an attempt has been made to investigate the effect of annealing treatment on the properties of polyester-cotton yarns with varying sheath structures.
Keywords: friction spun yarn, annealing, tenacity, structural integrity, decay
Procedia PDF Downloads 67
2564 Prediction of the Thermal Parameters of a High-Temperature Metallurgical Reactor Using Inverse Heat Transfer
Authors: Mohamed Hafid, Marcel Lacroix
Abstract:
This study presents an inverse analysis for predicting the thermal conductivities and the heat flux of a high-temperature metallurgical reactor simultaneously. Once these thermal parameters are predicted, the time-varying thickness of the protective phase-change bank that covers the inside surface of the brick walls of a metallurgical reactor can be calculated. The enthalpy method is used to solve the melting/solidification process of the protective bank. The inverse model rests on the Levenberg-Marquardt Method (LMM) combined with the Broyden method (BM). A statistical analysis for the thermal parameter estimation is carried out. The effect of the position of the temperature sensors, total number of measurements and measurement noise on the accuracy of inverse predictions is investigated. Recommendations are made concerning the location of temperature sensors.
Keywords: inverse heat transfer, phase change, metallurgical reactor, Levenberg–Marquardt method, Broyden method, bank thickness
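A deliberately simplified sketch of the inverse step: unknown parameters (two conductivities and a heat flux) are estimated with the Levenberg-Marquardt algorithm so that simulated sensor temperatures match measurements. The steady-state two-layer forward model, sensor depths, boundary conditions and numbers below are illustrative stand-ins for the enthalpy-based model used in the study.

import numpy as np
from scipy.optimize import least_squares

x_sensors = np.array([0.02, 0.05, 0.08])   # m, assumed sensor depths from the hot face
T_meas = np.array([652.0, 498.0, 441.0])   # K, assumed measured temperatures
L, x_interface = 0.10, 0.05                # m, assumed wall thickness and layer interface
h, T_amb = 200.0, 300.0                    # assumed outer convection coefficient (W/m2K) and ambient temperature (K)

def forward_model(params):
    k1, k2, q = params                     # layer conductivities (W/m K) and heat flux (W/m2)
    T = []
    for x in x_sensors:
        if x <= x_interface:
            R = (x_interface - x) / k1 + (L - x_interface) / k2
        else:
            R = (L - x) / k2
        T.append(T_amb + q * (1.0 / h + R))   # 1-D steady conduction plus outer convection
    return np.array(T)

result = least_squares(lambda p: forward_model(p) - T_meas, x0=[2.0, 5.0, 1.0e4], method="lm")
print("estimated [k1, k2, q]:", result.x)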
Procedia PDF Downloads 336
2563 A Comprehensive Study of Camouflaged Object Detection Using Deep Learning
Authors: Khalak Bin Khair, Saqib Jahir, Mohammed Ibrahim, Fahad Bin, Debajyoti Karmaker
Abstract:
Object detection is a computer technology that deals with searching through digital images and videos for occurrences of semantic elements of a particular class. It is associated with image processing and computer vision. On top of object detection, we detect camouflaged objects within an image using deep learning techniques. Deep learning is a subset of machine learning that is essentially a neural network with three or more layers. Over 6500 images that possess camouflage properties were gathered from various internet sources and divided into 4 categories to compare the results. The images were labeled and then used to train and test a VGG16 architecture in a Jupyter notebook on the TensorFlow platform. The architecture was further customized using transfer learning, which creates methods for transferring information from one or more source tasks to increase learning in a related target task. The purpose of these transfer learning methodologies is to aid in the evolution of machine learning to the point where it is as efficient as human learning.
Keywords: deep learning, transfer learning, TensorFlow, camouflage, object detection, architecture, accuracy, model, VGG16
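A compact sketch of the transfer-learning setup described above: a frozen VGG16 base with a small classification head for the four camouflage categories. The directory layout, image size and hyper-parameters are assumptions, not the paper's settings.

import tensorflow as tf

IMG_SIZE, NUM_CLASSES = (224, 224), 4

train_ds = tf.keras.utils.image_dataset_from_directory(
    "camouflage_dataset/train", image_size=IMG_SIZE, batch_size=32)   # hypothetical folder layout
preprocess = tf.keras.applications.vgg16.preprocess_input
train_ds = train_ds.map(lambda x, y: (preprocess(x), y))

base = tf.keras.applications.VGG16(include_top=False, weights="imagenet",
                                   input_shape=IMG_SIZE + (3,))
base.trainable = False   # transfer learning: reuse the ImageNet features

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(train_ds, epochs=5)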
Procedia PDF Downloads 157
2562 Effect of Constant and Variable Temperature on the Morphology of TiO₂ Nanotubes Prepared by Two-Step Anodization Method
Authors: Tayyaba Ghani, Mazhar Mehmood, Mohammad Mujahid
Abstract:
TiO₂ nanotubes are receiving immense attention in the field of dye-sensitized solar cells due to their well-defined nanostructures, efficient electron transport and large surface area compared to other one-dimensional structures. In the present work, we have investigated the influence of temperature on the morphology of anodically produced self-organized titanium oxide nanotubes (TiNTs). TiNTs are synthesized by a two-step anodization method in an ethylene glycol based electrolyte containing ammonium fluoride. Experiments are performed at constant anodization voltage for two hours. Investigation of the SEM images reveals that if the temperature is kept constant during the anodizing experiment, variation in the average tube diameter is significantly reduced. However, if the temperature is not controlled, the exothermic nature of the reactions forming the TiNTs causes the temperature of the electrolyte to keep increasing. This variation in electrolyte bath temperature introduces strong variations in tube diameter (20 nm to 160 nm) along the length of the tubes. Current profiles recorded during the anodization experiments reflect the effect of constant and varying experimental temperatures as well. In both cases, XRD results show a complete anatase crystal structure of the nanotubes upon annealing at 450 °C. The present work highlights the importance of constant temperature during anodization experiments in order to develop an ordered array of nanotubes with a uniform tube diameter.
Keywords: anodization, ordering, temperature, TiO₂ nanotubes
Procedia PDF Downloads 174
2561 Reminiscence Therapy for Alzheimer’s Disease Restrained on Logistic Regression Based Linear Bootstrap Aggregating
Authors: P. S. Jagadeesh Kumar, Mingmin Pan, Xianpei Li, Yanmin Yuan, Tracy Lin Huan
Abstract:
Researchers are conducting fascinating research into the inherited features of Alzheimer’s disease and possible consistent therapies. In Alzheimer’s, memories are lost in reverse order: memories formed recently are more transitory than older ones. Reminiscence therapy involves discussing past activities, events and experiences with another person or group of people, frequently with the help of tangible prompts such as photographs, household and other familiar items from the past, music and archived recordings. In this manuscript, the effectiveness of reminiscence therapy for Alzheimer’s disease is measured using logistic regression based linear bootstrap aggregating. Logistic regression is used to model the experiential features of the patient’s memory under various therapies. Linear bootstrap aggregating shows better stability and accuracy in the statistical classification and regression of memories related to validation therapy, supportive psychotherapy, sensory integration and simulated presence therapy.
Keywords: Alzheimer’s disease, linear bootstrap aggregating, logistic regression, reminiscence therapy
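A minimal sketch of one reading of "logistic regression based linear bootstrap aggregating": a scikit-learn bagging ensemble with logistic-regression base learners. The feature matrix of therapy/memory indicators and the outcome labels are placeholders, and the estimator= keyword assumes scikit-learn 1.2 or newer.

import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 6))        # placeholder therapy/memory features
y = rng.integers(0, 2, size=120)     # placeholder response (improved / not improved)

bagged_lr = BaggingClassifier(estimator=LogisticRegression(max_iter=1000),
                              n_estimators=25, bootstrap=True, random_state=0)
print("cross-validated accuracy:", cross_val_score(bagged_lr, X, y, cv=5).mean())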
Procedia PDF Downloads 313
2560 Electrocatalytic Enhancement Mechanism of Dual-Atom and Single-Atom MXenes-Based Catalyst in Oxygen and Hydrogen Evolution Reactions
Authors: Xin Zhao, Xuerong Zheng, Andrey L. Rogach
Abstract:
Using single metal atoms has been considered an efficient way to develop new HER and OER catalysts. MXenes, a class of two-dimensional materials, have attracted tremendous interest as promising substrates for single-atom metal catalysts. However, there is still a lack of systematic investigation of the interaction mechanisms between various MXene substrates and single atoms. Moreover, because the poor interaction between metal atoms and substrates results in low loading and stability, dual-atom MXene-based catalysts have not been successfully synthesized. We summarize the electrocatalytic enhancement mechanism of three MXene-based single-atom catalysts through experimental and theoretical results, which demonstrate stronger hybridization between the Co 3d and surface-terminating O 2p orbitals that optimizes the electronic structure of the Co single atoms in the composite. This, in turn, lowers the OER and HER energy barriers and accelerates the catalytic kinetics in the case of the Co@V2CTx composite. The poor interaction between single atoms and substrates can be improved by surface modification to synthesize dual-atom catalysts, whose synergistic electronic structure enhances the stability and electrocatalytic activity of the catalyst. Our study provides guidelines for designing single-atom and dual-atom MXene-based electrocatalysts and sheds light on the origins of the catalytic activity of single atoms on MXene substrates.
Keywords: dual-atom catalyst, single-atom catalyst, MXene substrates, water splitting
Procedia PDF Downloads 73
2559 The Role of Artificial Intelligence in Concrete Constructions
Authors: Ardalan Tofighi Soleimandarabi
Abstract:
Artificial intelligence has revolutionized the concrete construction industry and improved processes by increasing efficiency, accuracy, and sustainability. This article examines the applications of artificial intelligence in predicting the compressive strength of concrete, optimizing mixing plans, and improving structural health monitoring systems. Artificial intelligence-based models, such as artificial neural networks (ANN) and combined machine learning techniques, have shown better performance than traditional methods in predicting concrete properties. In addition, artificial intelligence systems have made it possible to improve quality control and real-time monitoring of structures, which helps in preventive maintenance and increases the life of infrastructure. Also, the use of artificial intelligence plays an effective role in sustainable construction by optimizing material consumption and reducing waste. Although the implementation of artificial intelligence is associated with challenges such as high initial costs and the need for specialized training, it will create a smarter, more sustainable, and more affordable future for concrete structures.
Keywords: artificial intelligence, concrete construction, compressive strength prediction, structural health monitoring, stability
Procedia PDF Downloads 26
2558 Annotation Ontology for Semantic Web Development
Authors: Hadeel Al Obaidy, Amani Al Heela
Abstract:
The main purpose of this paper is to examine the concept of the semantic web and the role that ontology and semantic annotation play in the development of semantic web services. The paper focuses on the semantic web infrastructure, illustrating how ontology and annotation work to provide the learning capabilities for building content semantically. To improve the productivity and quality of software, the paper applies approaches, notations and techniques offered by software engineering. It proposes a conceptual model to develop semantic web services for the infrastructure of a web information retrieval system for digital libraries. The developed system uses ontology and annotation to build a knowledge-based system that defines and links the meaning of web content in order to retrieve information for users’ queries. Keyword and ontology rule expansion makes the results more relevant and better able to satisfy the requested information, and the accuracy of the results is enhanced because the queries are semantically analyzed against the conceptual architecture of the proposed system.
Keywords: semantic web services, software engineering, semantic library, knowledge representation, ontology
Procedia PDF Downloads 177
2557 Review Architectural Standards in Design and Development Children's Educational Centers
Authors: Ahmad Torkaman, Suogol Shomtob, Hadi Akbari Seddigh
Abstract:
This paper attempts to investigate the lack of attention paid to the specific spatial characteristics required by children, beyond existing facilities such as nurseries. In order to achieve a standard center for fostering children, the first issue that must be studied is understanding their mentality; exploring the spiritual characteristics and complexities of children is only possible in accordance with the different aspects and backgrounds of their growth in various age periods. According to previous research, play is the most important activity that helps children to communicate and learn, and it is sometimes used as therapy in specific fields. Investigating play as a proper way to train, the variety of games, the various kinds of play environments and how some abnormalities can be treated through play are the issues discussed in this research. Another consideration concerns the importance of artistic activities among children, which is very evident when identifying their abnormalities. At the end of this study, after investigating how to understand the child and communicate with him or her, and aiming to recognize the specific spatial characteristics for better fostering of children, the physical and physiological criteria and characteristics are reviewed, ending with a list of required spaces, the dimensional characteristics of those spaces and the equipment children need.
Keywords: children, space, interior design, development, growth
Procedia PDF Downloads 335
2556 Exploring the Impact of AI Tools in Microsoft PowerPoint
Authors: Budoor Bujeir, Noor Alaidaros, Sultana Alsolami
Abstract:
This study investigates how AI tools in Microsoft PowerPoint, such as Designer and Translation, might improve the process of creating presentations. Thanks to its sophisticated AI features, PowerPoint has become a powerful tool for effectively creating high-quality presentations. Designed to maximize user experience, key features include multilingual translation, real-time collaboration, and design ideas. A mixed-method approach was used, combining hands-on demos of particular AI technologies with a questionnaire given to both inexperienced and seasoned users. The survey examined how often individuals used these features, how helpful they thought they were, and how much time they could save. The results show that although tools like Designer are not widely used, they are recognized for improving aesthetics and saving time. The accuracy and usefulness of translation technologies in multilingual environments received high ratings, emphasizing how they promote inclusive communication. The importance of incorporating AI into productivity software is highlighted by this study, opening the door to more approachable, effective, and captivating presentation workflows.
Keywords: Microsoft PowerPoint, AI features, designer, translation, presentation tools, NLP
Procedia PDF Downloads 16
2555 UWB Open Spectrum Access for a Smart Software Radio
Authors: Hemalatha Rallapalli, K. Lal Kishore
Abstract:
In comparison to systems that are typically designed to provide capabilities over a narrow frequency range through hardware elements, the next generation of cognitive radios is intended to implement a broader range of capabilities through efficient spectrum exploitation. This offers the user the promise of greater flexibility and seamless roaming across different networks, countries, frequencies, etc. It requires a true paradigm shift, i.e., liberalization over a wide band of spectrum as well as a growth path to more and greater capability. This work contributes towards the design and implementation of an open spectrum access (OSA) feature for unlicensed users, thus offering a frequency-agile radio platform that is capable of performing spectrum sensing over a wide band. An ultra-wideband (UWB) radio that has the intelligence of spectrum sensing only, unlike a cognitive radio with complete intelligence, is thus named a Smart Software Radio (SSR). The spectrum sensing mechanism is implemented based on energy detection. Simulation results show the accuracy and validity of this method.
Keywords: cognitive radio, energy detection, software radio, spectrum sensing
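A minimal sketch of the energy-detection mechanism mentioned above: the energy of the received samples is compared with a threshold derived from a target false-alarm probability. The Gaussian approximation of the test statistic and the parameter values are textbook assumptions, not the paper's simulation setup.

import numpy as np
from scipy.stats import norm

def energy_detect(samples, noise_var, pfa=0.05):
    # Return True if the sensed band is judged occupied by a primary user.
    n = samples.size
    energy = np.sum(samples ** 2)
    # Threshold from the Gaussian approximation of the chi-square statistic under noise only.
    threshold = noise_var * (n + norm.isf(pfa) * np.sqrt(2.0 * n))
    return energy > threshold

rng = np.random.default_rng(3)
noise_only = rng.normal(scale=1.0, size=1024)   # scenario with no primary user present
print("band occupied:", energy_detect(noise_only, noise_var=1.0))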
Procedia PDF Downloads 433