Search results for: computer generated holograms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5561

4931 Convolutional Neural Network and LSTM Applied to Abnormal Behaviour Detection from Highway Footage

Authors: Rafael Marinho de Andrade, Elcio Hideti Shiguemori, Rafael Duarte Coelho dos Santos

Abstract:

Computer vision makes it possible to make the world safer and to optimize resource management, especially of time and attention, now that cameras are abundant, from inside our pockets to above our heads as we cross the street. Automated solutions based on computer vision techniques to detect, react to, or even prevent relevant events such as robbery, car crashes, and traffic jams can therefore be implemented for both logistical and surveillance improvements. In this paper, we present an approach for detecting vehicles’ abnormal behaviors in highway footage, in which vectorial data on the vehicles’ displacement are extracted directly from surveillance camera footage through object detection and tracking with a deep convolutional neural network and fed into a long short-term memory (LSTM) neural network for behavior classification. The results show that the behavior classifications are consistent, and the same principles may be applied to other trackable objects and scenarios as well.
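As a rough illustration of the feature-extraction stage this abstract describes, the sketch below turns per-frame bounding boxes from a tracker into the displacement-vector sequence an LSTM classifier would consume. The track format, box coordinates, and function names are illustrative assumptions; the detection and LSTM stages themselves (which would require a deep learning framework) are not shown.

```python
# Sketch: turn per-frame bounding-box centroids from an object tracker
# into per-frame displacement vectors, the sequence a behaviour-classifying
# LSTM would consume. Track format and values are illustrative assumptions.

def centroid(box):
    """Centre (x, y) of an (x1, y1, x2, y2) bounding box."""
    x1, y1, x2, y2 = box
    return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)

def displacement_sequence(track):
    """Per-frame displacement vectors (dx, dy) for one tracked vehicle."""
    centres = [centroid(b) for b in track]
    return [(x2 - x1, y2 - y1)
            for (x1, y1), (x2, y2) in zip(centres, centres[1:])]

# A vehicle moving steadily right, then stopping (e.g. sudden braking).
track = [(0, 0, 10, 10), (5, 0, 15, 10), (10, 0, 20, 10), (10, 0, 20, 10)]
print(displacement_sequence(track))  # [(5.0, 0.0), (5.0, 0.0), (0.0, 0.0)]
```

A sequence like this, batched over many vehicles, would then be fed to the LSTM for behavior classification.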

Keywords: artificial intelligence, behavior detection, computer vision, convolutional neural networks, LSTM, highway footage

Procedia PDF Downloads 159
4930 Investigation of the Physical Computing in Computational Thinking Practices, Computer Programming Concepts and Self-Efficacy for Crosscutting Ideas in STEM Content Environments

Authors: Sarantos Psycharis

Abstract:

Physical Computing, as an instructional model, is applied in the framework of Engineering Pedagogy to teach “transversal/cross-cutting ideas” in a STEM content approach. LabVIEW and Arduino were used to connect the physical world with real data in the framework of the so-called Computational Experiment. Tertiary prospective engineering educators were engaged during their course, and Computational Thinking (CT) concepts were measured before and after the intervention across didactic activities, using validated questionnaires on the relationship between self-efficacy, computer programming, and CT concepts when STEM content epistemology is implemented in alignment with the Computational Pedagogy model. Results show a significant change in students’ responses for self-efficacy for CT before and after the instruction. Results also indicate a significant relation between the responses on the different CT concepts/practices. According to the findings, STEM content epistemology combined with Physical Computing is a good candidate for a learning and teaching approach in university settings that enhances students’ engagement in CT concepts/practices.

Keywords: Arduino, computational thinking, computer programming, LabVIEW, self-efficacy, STEM

Procedia PDF Downloads 110
4929 The Project Evaluation to Develop the Competencies, Capabilities, and Skills in Repairing Computers of People in Jompluak Local Municipality, Bang Khonthi District, Samut Songkram Province

Authors: Wilailuk Meepracha

Abstract:

The results of the project evaluation to develop the competencies, capabilities, and skills in repairing computers of people in Jompluak Local Municipality, Bang Khonthi District, Samut Songkram Province showed that the overall result was good (4.33). When each aspect was considered, the highest was process evaluation (4.60), followed by product evaluation (4.50), with the lowest on the feeding factor (3.97). In detail: 1) The context aspect was high (4.23), with the highest item on the arrangement of the training situation (4.67), followed by the appropriateness of the target (4.30), and the lowest on project cooperation (3.73). 2) The evaluation of the overall primary (feeding) factor showed a high average value (4.23); the highest aspect was the capability of the trainers (4.47), followed by the suitability of the venue (4.33), while the lowest was the insufficient budget (3.47). 3) The average result of the process evaluation was very high (4.60); the highest aspect was follow-up supervision (4.70), followed by the responsibility of each project staff member (4.50), while the lowest was on the present situation and problems of the community (4.40). 4) The overall result of the product evaluation was very high (4.50); the highest aspect was the diversity of the activities and community integration (4.67), followed by achievement of the project target (4.63), while the lowest was the continuation and regularity of the activities (4.33). The trainees’ satisfaction with the project management was at a very high level for 43.33%, a high level for 40%, and a moderate level for 16.67%. Suggestions for the project were to provide additional computer sets (37.78%) and a longer training period, especially on computer skills (43.48%).

Keywords: project evaluation, competency development, capability in computer repairing and computer skills

Procedia PDF Downloads 299
4928 Development of Integrated Solid Waste Management Plan for Industrial Estates of Pakistan

Authors: Mehak Masood

Abstract:

This paper aims to design an integrated solid waste management plan for industrial estates, taking Sundar Industrial Estate as a case model. The issue of solid waste management is on the rise in Pakistan, especially in the industrial sector, and the concept of developing and establishing industrial estates is gaining popularity. Without a proper solid waste management plan, it is very difficult to manage the day-to-day affairs of an industrial estate. An industrial estate contains clusters of different types of industrial units, so it is necessary to identify the different solid waste streams from each industrial cluster within the estate. Primary and secondary data collection, waste assessment, waste segregation and weighing, and field surveys were essential elements of the study. Wastes from each industrial process were identified and quantified. Currently, 130 industries are in production, but after full colonization this number will reach 385. Elaborated process flow diagrams were made to characterize the recyclable and non-recyclable waste. From the study, it was calculated that about 12354.1 kg/capita/day of solid waste is being generated in Sundar Industrial Estate; after full colonization of the industrial estate, the estimated quantity will be 4756328.5 kg/capita/day. Furthermore, the solid waste generated by each industrial sector was estimated. Suggestions for collection and transportation are given, and environment-friendly solid waste management practices are suggested. If an effective integrated waste management system is developed and implemented, it will conserve natural resources, create jobs, reduce poverty, protect the environment, save collection, transportation, and disposal costs, and extend the life of disposal sites. A major outcome of this study is an integrated solid waste management plan for the Sundar Industrial Estate, which requires immediate implementation.

Keywords: integrated solid waste management plan, industrial estates, Sundar Industrial Estate, Pakistan

Procedia PDF Downloads 486
4927 Computer Based Model for Collaborative Research as a Panacea for National Development in Third World Countries

Authors: M. A. Rahman, A. O. Enikuomehin

Abstract:

Sharing commitment to reach a common goal in research by harnessing available resources from two or more parties can simply be referred to as collaborative research. Aside from avoiding duplication of research, the benefits often accrued from such research alliances include savings in time and expense in completing studies. Likewise, it provides an avenue to produce a wider horizon of scientific knowledge through the pooling of skills, knowledge, and resources. In institutions of higher learning and research institutes, it often gives scholars an opportunity to strengthen the teaching and research capacity of their institutions. Between industries and institutions, collaborative research breeds promising relationships that can be geared towards addressing different research problems, such as producing and enhancing industrial products and services, including technology transfer. For Nigeria to take advantage of such collaboration, different issues that could arise during a collaborative research programme, such as licensing of technology, intellectual property rights, confidentiality, and funding, are identified in this paper. An important tool for achieving this in a developing economy is the use of an appropriate computer model. The paper highlights the costs of collaboration, stresses the need for evaluating the effectiveness and efficiency of collaborative research activities, and proposes an appropriate computer model to assist in this regard.

Keywords: collaborative research, developing country, computerization, model

Procedia PDF Downloads 329
4926 Approaches for Minimizing Radioactive Tritium and ¹⁴C in Advanced High Temperature Gas-Cooled Reactors

Authors: Longkui Zhu, Zhengcao Li

Abstract:

High temperature gas-cooled reactors (HTGRs) are considered one of the next-generation advanced nuclear reactors, in which porous nuclear graphite is used for neutron moderators, reflectors, and structural materials, cooled by inert helium. Radioactive tritium and ¹⁴C are generated through reactions of thermal neutrons with ⁶Li, ¹⁴N, and ¹⁰B impurities within the nuclear graphite and the coolant during HTGR operation. In this work, hydrogen and nitrogen diffusion behavior, together with nuclear graphite microstructure evolution, was investigated to minimize the radioactive waste release, using thermogravimetric analysis, X-ray computed tomography, and the BET and mercury standard porosimetry methods. The peak in graphite weight loss emerged at 573-673 K, owing to nitrogen diffusing out of the graphite pores when the system was subjected to vacuum. Macropore volume became larger, while the porosity for mesopores became smaller, over the range from ambient temperature to 1073 K, primarily induced by coalescence of the subscale pores. It is therefore suggested that porous nuclear graphite be subjected to vacuum at 573-673 K to minimize the nitrogen and the radioactive ¹⁴C before operation in HTGRs. Results on hydrogen diffusion show that diffusible hydrogen and tritium can permeate into the coolant with diffusion coefficients of > 0.5 × 10⁻⁴ cm²·s⁻¹ at 50 bar. As a consequence, freshly generated diffusible tritium can release quickly once formed, and an effective approach to minimizing the amount of radioactive tritium is to keep impurity contents extremely low in the nuclear graphite and the coolant. Besides, both two- and three-dimensional observations indicate that macro- and mesopore volume, along with total porosity, decreased with temperature at 50 bar, on account of the synergistic effects of applied compression strain, sharpened pore morphology, and non-uniform temperature distribution.

Keywords: advanced high temperature gas-cooled reactor, hydrogen and nitrogen diffusion, microstructure evolution, nuclear graphite, radioactive waste management

Procedia PDF Downloads 309
4925 ChatGPT Performs at the Level of a Third-Year Orthopaedic Surgery Resident on the Orthopaedic In-training Examination

Authors: Diane Ghanem, Oscar Covarrubias, Michael Raad, Dawn LaPorte, Babar Shafiq

Abstract:

Introduction: Standardized exams have long been considered a cornerstone in measuring cognitive competency and academic achievement. Their fixed nature and predetermined scoring methods offer a consistent yardstick for gauging intellectual acumen across diverse demographics. Consequently, the performance of artificial intelligence (AI) in this context presents a rich, yet unexplored terrain for quantifying AI's understanding of complex cognitive tasks and simulating human-like problem-solving skills. Publicly available AI language models such as ChatGPT have demonstrated utility in text generation and even problem-solving when provided with clear instructions. Amidst this transformative shift, the aim of this study is to assess ChatGPT’s performance on the orthopaedic surgery in-training examination (OITE). Methods: All 213 OITE 2021 web-based questions were retrieved from the AAOS-ResStudy website. Two independent reviewers copied and pasted the questions and response options into ChatGPT Plus (version 4.0) and recorded the generated answers. All media-containing questions were flagged and carefully examined. Twelve OITE media-containing questions that relied purely on images (clinical pictures, radiographs, MRIs, CT scans) and could not be rationalized from the clinical presentation were excluded. Cohen’s Kappa coefficient was used to examine the agreement of ChatGPT-generated responses between reviewers. Descriptive statistics were used to summarize the performance (% correct) of ChatGPT Plus. The 2021 norm table was used to compare ChatGPT Plus’ performance on the OITE to national orthopaedic surgery residents in that same year. Results: A total of 201 questions were evaluated by ChatGPT Plus. Excellent agreement was observed between raters for the 201 ChatGPT-generated responses, with a Cohen’s Kappa coefficient of 0.947. Of these questions, 45.8% (92/201) contained media. ChatGPT had an overall score of 61.2% (123/201) and a score of 64.2% (70/109) on non-media questions. When compared to the performance of all national orthopaedic surgery residents in 2021, ChatGPT Plus performed at the level of an average PGY3. Discussion: ChatGPT Plus is able to pass the OITE with a satisfactory overall score of 61.2%, ranking at the level of third-year orthopaedic surgery residents. More importantly, it provided logical reasoning and justifications that may help residents grasp evidence-based information and improve their understanding of OITE cases and general orthopaedic principles. With further improvements, AI language models such as ChatGPT may become valuable interactive learning tools in resident education, although further studies are still needed to examine their efficacy and impact on long-term learning and OITE/ABOS performance.
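The inter-rater agreement statistic reported above, Cohen's kappa, is straightforward to compute from two reviewers' recorded answer lists. The sketch below uses invented toy responses, not the study's data.

```python
# Cohen's kappa: agreement between two raters, corrected for chance.
# The toy answer lists below are illustrative, not the study's responses.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """kappa = (p_observed - p_expected) / (1 - p_expected)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal category frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Two reviewers agree on 9 of 10 recorded answers.
a = ["A", "B", "C", "A", "B", "C", "A", "B", "C", "A"]
b = ["A", "B", "C", "A", "B", "C", "A", "B", "D", "A"]
print(round(cohens_kappa(a, b), 3))  # 0.855
```

A value near 0.95, as reported, indicates near-perfect agreement between the reviewers.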

Keywords: artificial intelligence, ChatGPT, orthopaedic in-training examination, OITE, orthopedic surgery, standardized testing

Procedia PDF Downloads 78
4924 Helping Older Users Stay Connected

Authors: Q. Raza

Abstract:

Getting old is inevitable; tasks which were once simple become a daily struggle. This paper is a study of how older users interact with web applications, based upon a series of experiments. The experiments involved 12 participants and were split into two parts: the first set gives the users a feel for current social networks, while the second incorporates considerations raised by the participants, and the results of the two are compared. The paper goes into detail on psychological aspects such as social exclusion, metacognition, and therapeutic memories, and how these relate to users becoming isolated from society; social networking can be the roof on a foundation of successful computer interaction. The purpose of this paper is to carry out this study and to propose new ideas to help older users use social networking sites easily and efficiently.

Keywords: cognitive psychology, special memory, social networking and human computer interaction

Procedia PDF Downloads 438
4923 UAV Based Visual Object Tracking

Authors: Vaibhav Dalmia, Manoj Phirke, Renith G

Abstract:

With the wide adoption of UAVs (unmanned aerial vehicles) in various industries, by governments as well as private corporations, for solving computer vision tasks, it is necessary that their potential be analyzed completely. Recent advances in deep learning have also left us with a plethora of algorithms for different computer vision tasks. This study provides a comprehensive survey of the visual object tracking problem and explains the tradeoffs involved in building a real-time yet reasonably accurate object tracking system for UAVs, by examining existing methods and evaluating them on aerial datasets. Finally, the trackers best suited to UAV-based applications are identified.
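Tracker evaluations of the kind this survey performs typically rest on an overlap metric; the sketch below computes intersection-over-union (IoU) per frame and the resulting success rate. The boxes, the 0.5 threshold, and the function names are illustrative assumptions.

```python
# Sketch of overlap-based tracker evaluation: per-frame IoU against
# ground-truth boxes, then the fraction of frames above a threshold.

def iou(box_a, box_b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / float(area_a + area_b - inter)

def success_rate(predictions, ground_truth, threshold=0.5):
    """Fraction of frames where the tracker overlaps ground truth enough."""
    overlaps = [iou(p, g) for p, g in zip(predictions, ground_truth)]
    return sum(o >= threshold for o in overlaps) / len(overlaps)

# Three frames: perfect, good, and lost track (illustrative values).
pred = [(0, 0, 10, 10), (6, 0, 16, 10), (20, 20, 30, 30)]
gt   = [(0, 0, 10, 10), (5, 0, 15, 10), (0, 0, 10, 10)]
print(success_rate(pred, gt))
```

Sweeping the threshold from 0 to 1 yields the success plot commonly reported on aerial tracking benchmarks.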

Keywords: deep learning, drones, single object tracking, visual object tracking, UAVs

Procedia PDF Downloads 149
4922 The Structure and Function Investigation and Analysis of the Automatic Spin Regulator (ASR) in the Powertrain System of Construction and Mining Machines with the Focus on Dump Trucks

Authors: Amir Mirzaei

Abstract:

The powertrain system is one of the most basic and essential components of a machine; motion is practically impossible without it. Power generated by the engine is transmitted by the powertrain system to the wheels, the last parts of the system. The powertrain has different components according to the type of use and design. When the force generated by the engine reaches the wheels, the frictional force between the tire and the ground determines the amount of traction and the amount of slip. On surfaces such as icy, muddy, and snow-covered ground, the friction coefficient between the tire and the ground decreases dramatically, which in turn increases the force lost and drastically reduces the vehicle's traction. This slipping, in addition to wasting the energy produced, causes premature wear of the driving tires. It also causes the transmission oil temperature to rise excessively, degrading and contaminating the oil and shortening the useful life of the clutch disks and plates inside the transmission. This issue is much more important in road construction and mining machinery than in passenger vehicles and is always one of the most significant challenges to overcome in the design process. One method of doing so is the automatic spin regulator system, abbreviated ASR. This research examines the importance of this method and its structure and function, which have addressed one of the biggest challenges of the powertrain system in the field of construction and mining machinery.
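The slipping described above is conventionally quantified by the longitudinal slip ratio, and an ASR-style intervention can be sketched as a threshold test on it. The 0.2 threshold and the speeds below are illustrative assumptions, not the control logic of any particular machine.

```python
# Sketch of the quantity an ASR monitors: longitudinal slip ratio
# during driving, and a simple threshold trigger. Values illustrative.

def slip_ratio(wheel_speed, vehicle_speed):
    """Slip ratio = (wheel speed - vehicle speed) / wheel speed,
    where wheel speed is angular speed times rolling radius (same units)."""
    if wheel_speed <= 0:
        return 0.0
    return (wheel_speed - vehicle_speed) / wheel_speed

def asr_should_intervene(wheel_speed, vehicle_speed, threshold=0.2):
    """Illustrative trigger: reduce drive torque when slip is excessive."""
    return slip_ratio(wheel_speed, vehicle_speed) > threshold

# On ice, a driven wheel spins at 12 m/s while the truck moves at 6 m/s.
print(slip_ratio(12.0, 6.0))            # 0.5
print(asr_should_intervene(12.0, 6.0))  # True
```

By holding slip near a small target value, the system preserves traction while limiting tire wear and transmission heating.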

Keywords: automatic spin regulator, ASR, methods of reducing slipping, methods of preventing the reduction of the useful life of clutch disks and plates, methods of preventing premature contamination of transmission oil, methods of preventing the reduction of the useful life of tires

Procedia PDF Downloads 72
4921 Multichannel Object Detection with Event Camera

Authors: Rafael Iliasov, Alessandro Golkar

Abstract:

Object detection based on event vision has been a dynamically growing field in computer vision for the last 16 years. In this work, we create multiple channels from a single event camera and propose an event fusion method (EFM) to enhance object detection in event-based vision systems. Each channel uses a different accumulation buffer to collect events from the event camera. We implement YOLOv7 for object detection, followed by a fusion algorithm. Our multichannel approach outperforms single-channel object detection by 0.7% in mean Average Precision (mAP) for detections overlapping ground truth at IoU = 0.5.
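The per-channel accumulation buffers can be sketched as event-count frames built over different time windows from one event stream. The (x, y, t, polarity) tuple format, the window lengths, and the frame size below are illustrative assumptions about the representation, not the paper's exact pipeline.

```python
# Sketch: build one count-frame per accumulation window from a stream of
# (x, y, t_microseconds, polarity) events. Two different window lengths
# give two channels from the same stream. Format is an assumption.

def accumulate(events, window_us, width, height):
    """One frame per window; each frame counts events per pixel.
    Events must be sorted by timestamp."""
    if not events:
        return []
    t0 = events[0][2]
    n_frames = (events[-1][2] - t0) // window_us + 1
    frames = [[[0] * width for _ in range(height)] for _ in range(n_frames)]
    for x, y, t, _pol in events:
        frames[(t - t0) // window_us][y][x] += 1
    return frames

events = [(0, 0, 0, 1), (1, 0, 4000, -1), (1, 1, 11000, 1)]
fast = accumulate(events, window_us=5_000, width=2, height=2)   # 3 frames
slow = accumulate(events, window_us=20_000, width=2, height=2)  # 1 frame
print(len(fast), len(slow))  # 3 1
```

Stacking such channels gives the detector complementary views of fast and slow motion before fusion.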

Keywords: event camera, object detection with multimodal inputs, multichannel fusion, computer vision

Procedia PDF Downloads 10
4920 The Role of Virtual Geographic Environments (VGEs)

Authors: Min Chen, Hui Lin

Abstract:

VGEs are a kind of typical web- and computer-based geographic environment that merges geographic knowledge, computer technology, virtual reality technology, network technology, and geographic information technology to provide a digital mirror of physical geographic environments, allowing users to ‘feel it in person’ through means that augment the senses and to ‘know it beyond reality’ through geographic phenomena simulation and collaborative geographic experiments. Many achievements have appeared in this field, but further evolution should be explored. Through an exploration of the conception of VGEs, and some examples, this article illustrates the role of VGEs and their contribution to current GIScience. Based on this analysis, questions are proposed to frame discussion of the future direction of VGEs.

Keywords: virtual geographic environments (VGEs), GIScience, virtual reality, geographic information systems

Procedia PDF Downloads 571
4919 Life Stories: High Quality of Life until the End with the Narrative Medicine and the Storytelling

Authors: Danila Zuffetti, Lorenzo Chiesa

Abstract:

Background: A hospice narrative interview aims at putting the sick at the core of disease and treatment, allowing them to explore their most intimate facets. The aim of this work is to favor authentic narration, leading towards awareness and acceptance of terminality, and to help patients face death with serenity. Narration in palliative care aims at helping to reduce the chaos generated by the disease and to elaborate interpretations of the course of reality; besides, the narration delivered to the doctor is fundamental, as it communicates the meaning given to symptoms. Methods: The narrative interview has been a regular activity in the Castellini Foundation since 2017. Patients take part every week, over multiple days, in one-hour sessions in a welcoming and empathic setting, and the interaction with the operator leads to a gradual awareness of their terminality. Patients are asked open-ended questions with the purpose of facilitating and stimulating self-narration. Narration has not always been linear, but patients are left free to shift in time to revisit their disease process, making use of different tools such as digital storytelling. Results: The answers provided by the patients show to what extent the narrative interview is an instrument that allows the analysis of the stories and makes it possible to better understand and deepen the different implications of the patient's and caregiver's background. Conclusion: The narration work in the hospice demonstrates that narrative medicine is an added value. This instrument has proven useful not only in supporting patients but also in helping the palliative doctor identify wishes, accompanying patients to the end with dignity and serenity. The narrative interview favors the construction of an authentic therapeutic relationship. The sick are taken wholly into care, and they are guaranteed a high quality of life until their very last instant.

Keywords: construction of an authentic therapy relationship, gradual awareness of their terminality, narrative interview, reduce the chaos generated by the disease

Procedia PDF Downloads 166
4918 Interactive IoT-Blockchain System for Big Data Processing

Authors: Abdallah Al-ZoubI, Mamoun Dmour

Abstract:

The spectrum of IoT devices is becoming widely diversified, entering almost all possible fields and finding applications in industry, health, finance, logistics, and education, to name a few. The number of active IoT endpoint sensors and devices exceeded the 12 billion mark in 2021 and is expected to reach 27 billion in 2025, with over $34 billion in total market value. This sheer rise in the numbers and use of IoT devices brings with it considerable concerns regarding data storage, analysis, manipulation, and protection. IoT blockchain-based systems have recently been proposed as a decentralized solution for large-scale data storage and protection. COVID-19 has accelerated the desire to utilize IoT devices, as it impacted both demand and supply and significantly affected several regions for logistical reasons such as supply chain interruptions, shortage of shipping containers, and port congestion. An IoT-blockchain system is proposed to handle big data generated by a distributed network of sensors and controllers in an interactive manner. The system is designed on the Ethereum platform, using smart contracts programmed in Solidity to execute and manage data generated by IoT sensors and devices such as the Raspberry Pi 4, running Raspbian with add-on hardware security modules. The proposed system runs a number of applications hosted by a local machine used to validate transactions. It then sends data to the rest of the network through the InterPlanetary File System (IPFS) and Ethereum Swarm, forming a closed IoT ecosystem run by blockchain, in which a number of distributed IoT devices can communicate and interact in a controlled environment. A prototype has been deployed with three IoT handling units distributed over a wide geographical area in order to examine its feasibility, performance, and costs. Initial results indicate that big IoT data retrieval and storage is feasible and interactivity is possible, provided that certain conditions of cost, speed, and throughput are met.
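The tamper-evidence that a blockchain lends to sensor data can be illustrated without the Ethereum/IPFS stack the paper uses: the minimal pure-Python sketch below hash-chains batches of readings so that any later modification is detectable. It is a toy illustration of the chaining idea only, not the proposed system or its smart contracts.

```python
# Toy illustration of blockchain-style tamper evidence for IoT readings:
# each block's hash covers its readings plus the previous block's hash.
import hashlib
import json

def make_block(readings, prev_hash):
    """Bundle sensor readings with the previous hash, then hash the lot."""
    body = json.dumps({"readings": readings, "prev": prev_hash},
                      sort_keys=True)
    return {"body": body,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def chain_is_valid(chain):
    """Recompute every hash; any tampering with stored data breaks a link."""
    prev = "0" * 64
    for block in chain:
        if hashlib.sha256(block["body"].encode()).hexdigest() != block["hash"]:
            return False
        if json.loads(block["body"])["prev"] != prev:
            return False
        prev = block["hash"]
    return True

chain = []
prev = "0" * 64
for batch in ([{"temp": 21.5}], [{"temp": 22.1}]):  # illustrative readings
    block = make_block(batch, prev)
    chain.append(block)
    prev = block["hash"]
print(chain_is_valid(chain))  # True
```

Changing any stored reading afterwards invalidates every subsequent link, which is the property the decentralized system relies on.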

Keywords: IoT devices, blockchain, Ethereum, big data

Procedia PDF Downloads 144
4917 Using Teachers' Perceptions of Science Outreach Activities to Design an 'Optimum' Model of Science Outreach

Authors: Victoria Brennan, Andrea Mallaburn, Linda Seton

Abstract:

Science outreach programmes connect school pupils with external agencies to provide activities and experiences that enhance their exposure to science. These programmes not only aim to support teachers with curriculum engagement and promote scientific literacy but also provide pivotal opportunities to spark scientific interest in students; a further objective is to increase awareness of career opportunities in the field. Although outreach work is often described as a fun and satisfying venture, many researchers express caution about how successful these programmes are at increasing engagement in science post-16. When researching the impact of outreach programmes, it is often student feedback on the activities, or enrolment numbers in particular post-16 science courses, that is generated and analysed. Although this is informative, the longevity of a programme’s impact could be better gauged through teachers’ perceptions, on which the evidence in the literature is far more limited. In addition, there are strong suggestions that teachers can have an indirect impact on a student’s own self-concept. These themes shape the focus and importance of this ongoing research project, whose rationale is that teachers are under-used resources when it comes to the design of science outreach programmes. The end result of the research will therefore be a presentation of an ‘optimum’ model of outreach, which should interest wider stakeholders such as universities and private or government organisations that design science outreach programmes in the hope of recruiting future scientists. During phase one, questionnaires (n=52) and interviews (n=8) generated both quantitative and qualitative data. These were analysed using the Wilcoxon non-parametric test, to compare teachers’ perceptions of science outreach interventions, and thematic analysis for the open-ended questions. Both research activities provided an opportunity to obtain a cross-section of teacher opinions of science outreach across all educational levels, and an early draft of the ‘optimum’ model of science outreach delivery was generated from both the literature and the primary data. The final (ongoing) phase aims to refine this model using teacher focus groups to provide constructive feedback on the proposed model. The analysis uses principles of modified Grounded Theory to ensure that focus group data further strengthen the model. The research takes a pragmatist approach, drawing on the strengths of the different paradigms encountered so that the data collected provide the most suitable information to create an improved model of sustainable outreach. The results discussed will focus on this ‘optimum’ model and teachers’ perceptions of the benefits and drawbacks of engaging with science outreach work. Although the model is still a work in progress, it provides insight both into how teachers feel outreach delivery can be a sustainable intervention tool within the classroom and into what providers of such programmes should consider when designing science outreach activities.

Keywords: educational partnerships, science education, science outreach, teachers

Procedia PDF Downloads 117
4916 Hand Symbol Recognition Using Canny Edge Algorithm and Convolutional Neural Network

Authors: Harshit Mittal, Neeraj Garg

Abstract:

Hand symbol recognition is a pivotal component in the domain of computer vision, with far-reaching applications spanning sign language interpretation, human-computer interaction, and accessibility. This research paper discusses an approach integrating the Canny edge algorithm and a convolutional neural network. The significance of this study lies in its potential to enhance communication and accessibility for individuals with hearing impairments or those engaged in gesture-based interactions with technology. In the experiment, the data were manually collected by the authors from a webcam using Python code; to enlarge the dataset, augmentation was applied to the original images, which makes the model more robust. The dataset of about 6000 colour images, distributed equally among 5 classes (i.e., 1, 2, 3, 4, 5), is first pre-processed to grayscale and then passed through the Canny edge algorithm with thresholds 1 and 2 both set to 150. After the data were built, they were used to train the convolutional neural network model, giving accuracy: 0.97834, precision: 0.97841, recall: 0.9783, and F1 score: 0.97832. For end users, a Python program opens a window for live hand symbol recognition. This research, at its core, seeks to advance the field of computer vision by providing an advanced perspective on hand sign recognition. By leveraging the capabilities of the Canny edge algorithm and convolutional neural networks, this study contributes to ongoing efforts to create more accurate, efficient, and accessible solutions for individuals with diverse communication needs.
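In an OpenCV pipeline, the preprocessing described here would typically be a call such as cv2.Canny(gray, 150, 150) on the grayscale frame. As a dependency-free sketch of the underlying idea, the code below computes a Sobel gradient magnitude and applies a double threshold; it deliberately omits the Gaussian smoothing, non-maximum suppression, and hysteresis linking of the full Canny pipeline, and the tiny image is illustrative.

```python
# Simplified stand-in for the Canny stages: Sobel gradient magnitude
# followed by a double threshold (both thresholds 150, as in the paper).
# Pure Python on a list-of-lists image; a real pipeline would use cv2.Canny.

def sobel_magnitude(gray):
    """Approximate gradient magnitude of a 2D grayscale image."""
    h, w = len(gray), len(gray[0])
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal Sobel kernel
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical Sobel kernel
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(kx[j][i] * gray[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(ky[j][i] * gray[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

def double_threshold(mag, low=150, high=150):
    """Label pixels: 2 = strong edge, 1 = weak, 0 = non-edge.
    With low == high (as here) the weak class never fires."""
    return [[2 if m >= high else (1 if m >= low else 0) for m in row]
            for row in mag]

# 5x5 image with a vertical step from dark (0) to bright (255).
img = [[0, 0, 255, 255, 255] for _ in range(5)]
edges = double_threshold(sobel_magnitude(img))
print(edges[2])  # [0, 2, 2, 0, 0]: strong edges flag the intensity step
```

The resulting edge maps, rather than raw pixels, are what the CNN is trained on.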

Keywords: hand symbol recognition, computer vision, Canny edge algorithm, convolutional neural network

Procedia PDF Downloads 57
4915 Treatment of Onshore Petroleum Drill Cuttings via Soil Washing Process: Characterization and Optimal Conditions

Authors: T. Poyai, P. Painmanakul, N. Chawaloesphonsiya, P. Dhanasin, C. Getwech, P. Wattana

Abstract:

Drilling is a key activity in oil and gas exploration and production. Drilling always requires the use of drilling mud for lubricating the drill bit and controlling the subsurface pressure. As drilling proceeds, a considerable amount of cuttings or rock fragments is generated. In general, water or Water Based Mud (WBM) serves as drilling fluid for the top hole section. The cuttings generated from this section is non-hazardous and normally applied as fill materials. On the other hand, drilling the bottom hole to reservoir section uses Synthetic Based Mud (SBM) of which synthetic oils are composed. The bottom-hole cuttings, SBM cuttings, is regarded as a hazardous waste, in accordance with the government regulations, due to the presence of hydrocarbons. Currently, the SBM cuttings are disposed of as an alternative fuel and raw material in cement kiln. Instead of burning, this work aims to propose an alternative for drill cuttings management under two ultimate goals: (1) reduction of hazardous waste volume; and (2) making use of the cleaned cuttings. Soil washing was selected as the major treatment process. The physiochemical properties of drill cuttings were analyzed, such as size fraction, pH, moisture content, and hydrocarbons. The particle size of cuttings was analyzed via light scattering method. Oil present in cuttings was quantified in terms of total petroleum hydrocarbon (TPH) through gas chromatography equipped with flame ionization detector (GC-FID). Other components were measured by the standard methods for soil analysis. Effects of different washing agents, liquid-to-solid (L/S) ratio, washing time, mixing speed, rinse-to-solid (R/S) ratio, and rinsing time were also evaluated. It was found that drill cuttings held the electrical conductivity of 3.84 dS/m, pH of 9.1, and moisture content of 7.5%. The TPH in cuttings existed in the diesel range with the concentration ranging from 20,000 to 30,000 mg/kg dry cuttings. 
Most cuttings particles had a mean diameter of about 50 µm, corresponding to the silt fraction. The results also suggested that a green solvent was the most promising washing agent for cuttings treatment with regard to occupational health, safety, and environmental benefits. The optimal washing conditions were an L/S of 5, washing time of 15 min, mixing speed of 60 rpm, R/S of 10, and rinsing time of 1 min. After the washing process, three fractions, including clean cuttings, spent solvent, and wastewater, were considered, and management recommendations were provided for each. Residual TPH of less than 5,000 mg/kg was detected in the clean cuttings, which can then be used for various purposes. The spent solvent had a calorific value of more than 3,000 cal/g and can be used as an alternative fuel; alternatively, the used solvent can be recovered using distillation or chromatography techniques. Finally, the generated wastewater can be combined with the produced water and simultaneously managed by re-injection into the reservoir.
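As a rough illustration of the treatment outcome (not part of the study's methodology), the reported TPH figures imply a removal efficiency that can be sketched with a simple mass-balance calculation:

```python
# Back-of-the-envelope sketch: TPH removal efficiency of a soil-washing step.
# The concentrations come from the abstract (mid-range initial value and the
# reported residual ceiling); the function itself is illustrative.
def removal_efficiency(c_initial, c_final):
    """Fraction of TPH removed, for concentrations in mg/kg dry cuttings."""
    return (c_initial - c_final) / c_initial

# ~25,000 mg/kg before washing, <5,000 mg/kg after
eff = removal_efficiency(25000, 5000)
print(f"TPH removal: {eff:.0%}")  # TPH removal: 80%
```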

Keywords: drill cuttings, green solvent, soil washing, total petroleum hydrocarbon (TPH)

Procedia PDF Downloads 150
4914 Optimal Operation of a Photovoltaic Induction Motor Drive Water Pumping System

Authors: Nelson K. Lujara

Abstract:

The performance characteristics of a photovoltaic induction motor drive water pumping system, with and without a maximum power tracker, are analyzed and presented. The analysis is done through the determination and assessment of critical loss components in the system using computer aided design (CAD) tools for optimal operation of the system. The results can be used to formulate a well-calibrated computer aided design package for photovoltaic water pumping systems based on the induction motor drive. The results allow the design engineer to pre-determine the flow rate and efficiency of the system to suit a particular application.
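The basic relationship between array power, pumping head, and flow rate that such a design package would exercise can be sketched as follows; the symbols and the overall efficiency figure below are illustrative assumptions, not the paper's loss model:

```python
# Hedged sketch (not the paper's CAD analysis): steady-state flow rate of a
# PV pumping system from a hydraulic power balance, Q = eta * P / (rho*g*H).
RHO = 1000.0   # water density, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2

def flow_rate(p_pv_watts, total_head_m, system_efficiency):
    """Volumetric flow rate in m^3/s delivered against the given head."""
    return system_efficiency * p_pv_watts / (RHO * G * total_head_m)

# e.g. a 1 kW array, 20 m head, 35% overall efficiency (assumed lumped
# PV + motor + pump losses)
q = flow_rate(1000.0, 20.0, 0.35)
print(f"{q * 1000:.2f} L/s")
```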

Keywords: photovoltaic, water pumping, losses, induction motor

Procedia PDF Downloads 296
4913 Effects of Computer-Mediated Dictionaries on Reading Comprehension and Vocabulary Acquisition

Authors: Mohamed Amin Mekheimer

Abstract:

This study aimed to investigate the effects of paper-based monolingual, pop-up, and type-in electronic dictionaries on improving reading comprehension and incidental vocabulary acquisition and retention in an EFL context. It tapped into how computer-mediated dictionaries may have facilitated or impeded reading comprehension and vocabulary acquisition. Findings showed differential effects produced by the three treatments compared with the control group. Specifically, the pop-up dictionary condition had the shortest average vocabulary search time and the shortest vocabulary and text reading times, with fewer dictionary look-ups than the type-in group but more than the paper dictionary group (p < .0001). In addition, ANOVA analyses showed that text reading time differed significantly across all four treatments, as did reading comprehension. Vocabulary acquisition was enhanced in the three treatment groups relative to the control group, though differences among the three treatments were not significant, with effects tending to favour the pop-up condition. Participants also preferred the pop-up e-dictionary over the type-in and paper-based dictionaries. Explanations of the findings vis-à-vis cognitive load theory are presented. Pedagogical implications and suggestions for further research are offered at the end.
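The one-way ANOVA used to compare reading times across groups can be sketched in a few lines; the reading times below are invented placeholders, not the study's data:

```python
# Minimal one-way ANOVA sketch (pure Python): the F statistic is the ratio of
# between-group to within-group mean squares.
def one_way_anova_f(groups):
    all_vals = [x for g in groups for x in g]
    grand = sum(all_vals) / len(all_vals)
    # Between-group sum of squares (group sizes times squared mean deviations)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares (deviations from each group's own mean)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical text-reading times (minutes) for pop-up, type-in, paper groups
f = one_way_anova_f([[12, 13, 11], [15, 16, 14], [18, 19, 17]])
print(f)  # 27.0
```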

Keywords: computer-mediated dictionaries, type-in dictionaries, pop-up dictionaries, reading comprehension, vocabulary acquisition

Procedia PDF Downloads 430
4912 Computer Aided Analysis of Breast Based Diagnostic Problems from Mammograms Using Image Processing and Deep Learning Methods

Authors: Ali Berkan Ural

Abstract:

This paper presents the analysis, evaluation, and pre-diagnosis of early stage breast problems (breast cancer, nodules, or lumps) by a Computer Aided Diagnosis (CAD) system applied to mammogram radiological images. According to the statistics, time is a crucial factor: the disease should be discovered in the patient (especially in women) as early and as quickly as possible. In this study, a new algorithm is developed using advanced image processing and deep learning methods to detect and classify the problem at an early stage with greater accuracy. The system first applies image processing methods (image acquisition, noise removal, region growing segmentation, morphological operations, breast border extraction, advanced segmentation, obtaining regions of interest (ROIs), etc.) to segment the area of interest of the breast, and then analyzes these segmented areas for cancer or lump detection in order to diagnose the disease. After segmentation, using spectrogram images, five different deep learning methods (the Convolutional Neural Network (CNN) based AlexNet, ResNet50, VGG16, DenseNet, and Xception) are applied to classify the breast problems.
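The region growing segmentation step named in the pipeline can be sketched as a seeded flood fill with an intensity tolerance; this is a generic illustration, not the paper's exact algorithm, and the tiny image below stands in for a mammogram ROI:

```python
# Illustrative region-growing sketch: grow a region from a seed pixel,
# accepting 4-connected neighbours whose intensity lies within a tolerance
# of the seed value.
from collections import deque

def region_grow(image, seed, tol):
    rows, cols = len(image), len(image[0])
    seed_val = image[seed[0]][seed[1]]
    region, queue = {seed}, deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in region
                    and abs(image[nr][nc] - seed_val) <= tol):
                region.add((nr, nc))
                queue.append((nr, nc))
    return region

img = [[10, 12, 90],
       [11, 13, 95],
       [80, 85, 99]]
print(sorted(region_grow(img, (0, 0), 5)))  # the low-intensity corner pixels
```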

Keywords: computer aided diagnosis, breast cancer, region growing, segmentation, deep learning

Procedia PDF Downloads 85
4911 Digital Media Market, Multimedia, and Computer Graphic Analysis Amidst Fluctuating Global and Local Scale Economy

Authors: Essang Anwana Onuntuei, Chinyere Blessing Azunwoke

Abstract:

The study centred on investigating the influence of multimedia systems and computer graphic design on global and local scale economies. Firstly, the study pinpointed the significant participants and the top five global digital media distributions in the digital media market. The study then investigated whether an association or variance existed between digital media vendors and their market shares. The paper also probed whether the global and local desktop, mobile, and tablet markets differ, while assessing the association between the top five digital media and global market shares. Finally, the study explored the extent of growth, economic gains, major setbacks, and opportunities within the industry amidst global and local scale economic flux. A multiple regression analysis was employed to analyse the influence of the top five global digital media on the total market share, and Analysis of Variance (ANOVA) was used to analyse the global digital media vendor market share data. The findings were intriguing and significant.
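The regression idea can be illustrated with ordinary least squares in its simplest one-predictor form; the share figures below are invented, not the study's data:

```python
# Pure-Python OLS sketch: fit total market share against a single vendor's
# share. The study used multiple predictors; one suffices to show the method.
def simple_ols(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    intercept = my - slope * mx
    return slope, intercept

# Hypothetical vendor share (%) vs. total digital media market share (%)
x = [10, 20, 30, 40]
y = [15, 25, 35, 45]
slope, intercept = simple_ols(x, y)
print(slope, intercept)  # 1.0 5.0
```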

Keywords: computer graphics, digital media market, global market share, market size, media vendor, multimedia, social media, systems design

Procedia PDF Downloads 16
4910 User Experience Measurement of User Interfaces

Authors: Mohammad Hashemi, John Herbert

Abstract:

Quantifying and measuring Quality of Experience (QoE) are important and difficult concerns in Human Computer Interaction (HCI). Quality of Service (QoS) and the actual User Interface (UI) of the application are both important contributors to a user's QoE. This paper describes a framework that accurately measures how a user uses the UI in order to model user behaviours and profiles. It monitors the use of the mouse and of UI elements with accurate time measurement. It does this in real time, unobtrusively and efficiently, allowing the user to work as normal with the application. This real-time, accurate measurement of the user's interaction provides valuable data and insight into the use of the UI and is also the basis for analysis of the user's QoE.
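The core of such event timing can be sketched in a few lines; the class and event names below are illustrative assumptions, since a real implementation would hook the windowing toolkit's event loop rather than being called manually:

```python
# Minimal sketch of unobtrusive UI-event timing: record named events with a
# monotonic high-resolution clock and derive the intervals between them.
import time

class InteractionLogger:
    def __init__(self):
        self.events = []          # (event name, monotonic timestamp)

    def record(self, name):
        self.events.append((name, time.perf_counter()))

    def durations(self):
        """(event name, seconds since the previous event) for each transition."""
        return [(b[0], b[1] - a[1])
                for a, b in zip(self.events, self.events[1:])]

log = InteractionLogger()
log.record("mouse_enter:save_button")   # hypothetical UI event
log.record("click:save_button")
for name, dt in log.durations():
    print(f"{name}: {dt:.6f} s")
```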

Keywords: user modelling, user interface experience, quality of experience, user experience, human and computer interaction

Procedia PDF Downloads 496
4909 Facial Expression Recognition Using Sparse Gaussian Conditional Random Field

Authors: Mohammadamin Abbasnejad

Abstract:

The analysis of facial expressions and the detection of facial Action Units (AUs) are very important tasks in the fields of computer vision and Human Computer Interaction (HCI) due to their wide range of applications in human life. Much work has been done during the past few years, each approach with its own advantages and disadvantages. In this work, we present a new model based on the Gaussian Conditional Random Field. We solve our objective problem using ADMM and show how well the proposed model works. We train and test our work on two facial expression datasets, CK+ and RU-FACS. Experimental evaluation shows that our proposed approach outperforms state-of-the-art expression recognition approaches.
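The ADMM split/update pattern used to solve the sparse objective can be illustrated on a much simpler problem than the paper's CRF: the 1-D lasso min over x of 0.5(x - a)^2 + lam|x|, whose closed-form answer is the soft-threshold of a. The code is a generic sketch of ADMM, not the authors' solver:

```python
# Toy ADMM sketch: split x = z, then alternate a quadratic x-update, a
# proximal (shrinkage) z-update, and a dual ascent on the constraint.
def soft(v, t):
    """Soft-threshold operator: shrink v toward zero by t."""
    return max(abs(v) - t, 0.0) * (1 if v > 0 else -1)

def admm_l1(a, lam, rho=1.0, iters=200):
    x = z = u = 0.0
    for _ in range(iters):
        x = (a + rho * (z - u)) / (1.0 + rho)   # quadratic x-update
        z = soft(x + u, lam / rho)              # proximal z-update
        u += x - z                              # dual update on x = z
    return z

print(admm_l1(3.0, 1.0))  # converges to soft-threshold(3, 1) = 2.0
```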

Keywords: Gaussian Conditional Random Field, ADMM, convergence, gradient descent

Procedia PDF Downloads 346
4908 Reducing System Delay to Definitive Care for STEMI Patients: A Simulation of Two Different Strategies in the Brugge Area, Belgium

Authors: E. Steen, B. Dewulf, N. Müller, C. Vandycke, Y. Vandekerckhove

Abstract:

Introduction: Care for an ST-elevation myocardial infarction (STEMI) patient is time-critical. Reperfusion therapy within 90 minutes of initial medical contact is mandatory to improve the outcome. Primary percutaneous coronary intervention (PCI), without previous fibrinolytic treatment, is the preferred reperfusion strategy in patients with STEMI, provided it can be performed within guideline-mandated times. Aim of the study: During a one-year period (January 2013 to December 2013), the files of all consecutive STEMI patients urgently referred from non-PCI facilities for primary PCI were reviewed. Special attention was given to a subgroup of patients with prior out-of-hospital medical contact generated by the 112 system. In an effort to reduce out-of-hospital system delay to definitive care, a change in pre-hospital 112 dispatch strategies is proposed for these time-critical patients. Actual time recordings were compared with travel time simulations for two suggested scenarios. The first scenario (SC1) involves the decision by the on-scene ground EMS (GEMS) team to transport the out-of-hospital diagnosed STEMI patient directly to a PCI centre, bypassing the nearest non-PCI hospital. The other strategy (SC2) explored the potential role of helicopter EMS (HEMS), where the on-scene GEMS team requests a PCI-centre-based HEMS team for immediate medical transfer to the PCI centre. Methods and Results: 49 (29.1% of all) STEMI patients were referred to our hospital for emergency PCI by a non-PCI facility. One file was excluded because of insufficient data collection. Within this analysed group of 48 secondary referrals, 21 patients had an out-of-hospital medical contact generated by the 112 system. The other 27 patients presented at the referring emergency department without prior contact with the 112 system.
The table below shows the actual time data from first medical contact to definitive care, as well as the simulated possible gain of time for both suggested strategies. The PCI team was always alerted upon departure from the referring centre, excluding further in-hospital delay. Time simulation tools were similar to those used by the 112 dispatch centre. Conclusion: Our data analysis confirms prolonged reperfusion times in case of secondary emergency referrals for STEMI patients, even with the use of HEMS. In our setting, there was no statistically significant difference in gain of time between the two suggested strategies, both reducing the secondary-referral-generated delay by about one hour and thereby offering all patients PCI within the guideline-mandated time. However, immediate HEMS activation by the on-scene ground EMS team for transport purposes is preferred, as it ensures faster availability of the local GEMS team for its community. In case these options are not available and the guideline-mandated times for primary PCI are expected to be exceeded, primary fibrinolysis should be considered in a non-PCI centre.
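The structure of such a scenario comparison can be sketched as a simple sum of delay segments. Every segment time below is an invented placeholder, not the study's recorded data; the sketch only shows how the "about one hour" gain arises from removing the intermediate hospital stop:

```python
# Back-of-the-envelope delay comparison for the two dispatch scenarios.
def total_delay(segments):
    """Total first-medical-contact-to-PCI delay in minutes."""
    return sum(segments.values())

baseline = total_delay({           # current practice: via the non-PCI hospital
    "on_scene": 20, "to_non_pci": 15,
    "er_workup_and_referral": 60, "transfer_to_pci": 25})
sc1 = total_delay({                # SC1: ground EMS bypasses the non-PCI hospital
    "on_scene": 20, "direct_to_pci": 35})
sc2 = total_delay({                # SC2: PCI-centre HEMS meets the patient
    "on_scene": 20, "hems_dispatch_and_flight": 15, "flight_to_pci": 18})

print(baseline - sc1, baseline - sc2)  # minutes saved by each scenario
```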

Keywords: STEMI, system delay, HEMS, emergency medicine

Procedia PDF Downloads 317
4907 Microfabrication and Non-Invasive Imaging of Porous Osteogenic Structures Using Laser-Assisted Technologies

Authors: Irina Alexandra Paun, Mona Mihailescu, Marian Zamfirescu, Catalin Romeo Luculescu, Adriana Maria Acasandrei, Cosmin Catalin Mustaciosu, Roxana Cristina Popescu, Maria Dinescu

Abstract:

A major concern in bone tissue engineering is to develop complex 3D architectures that mimic the natural cell environment, facilitate cell growth in a defined manner, and allow the flow transport of nutrients and metabolic waste. In particular, porous structures of controlled pore size and positioning are indispensable for growing human-like bone structures. Another concern is to monitor both the structures and the seeded cells with high spatial resolution and without interfering with the cells' natural environment. The present approach relies on laser-based technologies employed for fabricating porous biomimetic structures that support the growth of osteoblast-like cells and for their non-invasive 3D imaging. Specifically, the porous structures were built by two-photon polymerization direct writing (2PP_DW) of the commercially available photoresist IP-L780, using the Photonic Professional 3D lithography system. The structures consist of vertical tubes with micrometer-sized heights and diameters in a honeycomb-like spatial arrangement. These were fabricated by irradiating the IP-L780 photoresist with focused laser pulses with a wavelength centered at 780 nm, 120 fs pulse duration, and 80 MHz repetition rate. The samples were precisely scanned in 3D by piezo stages; coarse positioning was done by XY motorized stages. The scanning path was programmed through a writing language (GWL) script developed by Nanoscribe. Following laser irradiation, the unexposed regions of the photoresist were washed out by immersing the samples in Propylene Glycol Monomethyl Ether Acetate (PGMEA). The porous structures were seeded with osteoblast-like MG-63 cells, and their osteogenic potential was tested in vitro. The cell-seeded structures were analyzed in 3D using digital holographic microscopy (DHM). DHM is a marker-free, high spatial resolution imaging tool, in which the hologram acquisition is performed non-invasively, i.e.
without interfering with the cells' natural environment. Following hologram recording, a digital algorithm provided a 3D image of the sample, as well as information about its refractive index, which is correlated with the intracellular content. The axial resolution of the images went down to the nanoscale, while the temporal scales ranged from milliseconds up to hours. Hologram recording did not involve sample scanning; the whole image was available in one frame covering a 200 µm field of view. Processing the digital holograms provided 3D quantitative information on the porous structures and allowed a quantitative analysis of the cellular response with respect to the porous architectures. The cellular shape and dimensions were found to be influenced by the underlying micro-relief. Furthermore, the intracellular content gave evidence of the beneficial role of the porous structures in promoting osteoblast differentiation. Overall, the proposed laser-based protocol emerges as a promising tool for the fabrication and non-invasive imaging of porous constructs for bone tissue engineering. Acknowledgments: This work was supported by a grant of the Romanian Authority for Scientific Research and Innovation, CNCS-UEFISCDI, project PN-II-RU-TE-2014-4-2534 (contract 97 from 01/10/2015) and by UEFISCDI PN-II-PT-PCCA no. 6/2012. A part of this work was performed in the CETAL laser facility, supported by the National Program PN 16 47 - LAPLAS IV.
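The link between a DHM phase image and sample geometry rests on a standard relation that can be sketched numerically; the wavelength and refractive indices below are illustrative assumptions, not this group's calibration:

```python
# Hedged sketch of the standard DHM relation: a measured phase shift maps to
# optical path difference and, via the refractive-index contrast between cell
# and medium, to local cell thickness.
import math

def thickness_from_phase(phi_rad, wavelength_nm, n_cell, n_medium):
    """Cell thickness in nm from the phase value at one pixel."""
    opd = phi_rad * wavelength_nm / (2.0 * math.pi)   # optical path difference
    return opd / (n_cell - n_medium)

# e.g. a pi-radian phase shift at 633 nm, n_cell = 1.38, n_medium = 1.33
t = thickness_from_phase(math.pi, 633.0, 1.38, 1.33)
print(f"{t:.0f} nm")
```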

Keywords: biomimetic, holography, laser, osteoblast, two photon polymerization

Procedia PDF Downloads 267
4906 Numerical Studies on Bypass Thrust Augmentation Using Convective Heat Transfer in Turbofan Engine

Authors: R. Adwaith, J. Gopinath, Vasantha Kohila B., R. Chandru, Arul Prakash R.

Abstract:

The turbofan engine is a type of air-breathing engine widely used in aircraft propulsion; it produces thrust mainly from the mass flow of air bypassing the engine core. The present research numerically develops an effective method for increasing the thrust generated from the bypass air. This thrust increase is brought about by heating the walls of the bypass valve from the combustion chamber using a convective heat transfer method. It is achieved computationally by using external heat to enhance the velocity of the bypass air of turbofan engines. The bypass valves are either heated externally using a multi-cell tube resistor, which converts electricity generated by dynamos into heat, or by heat transferred from the combustion chamber. This increases the temperature of the flow in the valves and thereby increases the velocity of the flow that enters the nozzle of the engine. As a result, the mass flow of air passing through the core engine to produce thrust can be significantly reduced, thereby saving a considerable amount of jet fuel. Numerical analysis has been carried out on a scaled-down version of a typical turbofan bypass valve, in which the valve wall temperature was increased to 700 K. It is observed from the analysis that the exit velocity contributing to thrust increased significantly, by 10%, due to the heating of the bypass valve. The optimum degree of temperature increase, and the corresponding increase in jet velocity, is calculated to determine the operating temperature range for an efficient increase in velocity. The technique used in this research increases thrust by using heated bypass air without extracting much work from the fuel and thus improves the efficiency of existing turbofan engines. Dimensional analysis has been carried out to verify the accuracy of the results obtained numerically.
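The order of magnitude of the reported velocity gain can be cross-checked with a rough scaling argument (not the paper's CFD): for a fixed nozzle pressure ratio, jet exit velocity scales with the square root of stagnation temperature, v2/v1 = sqrt(T2/T1). The inlet temperature below is an assumption chosen for illustration:

```python
# Rough scaling sketch of bypass-flow heating: fractional exit-velocity gain
# from raising the flow stagnation temperature, v2/v1 = sqrt(T2/T1).
import math

def velocity_gain(t_cold_k, t_hot_k):
    """Fractional increase in exit velocity from heating the flow."""
    return math.sqrt(t_hot_k / t_cold_k) - 1.0

# Assuming bypass air near 580 K heated toward the 700 K wall temperature
# used in the study:
print(f"{velocity_gain(580.0, 700.0):.1%}")  # on the order of the ~10% reported
```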

Keywords: turbofan engine, bypass valve, multi-cell tube, convective heat transfer, thrust

Procedia PDF Downloads 353
4905 Improving Chest X-Ray Disease Detection with Enhanced Data Augmentation Using Novel Approach of Diverse Conditional Wasserstein Generative Adversarial Networks

Authors: Malik Muhammad Arslan, Muneeb Ullah, Dai Shihan, Daniyal Haider, Xiaodong Yang

Abstract:

Chest X-rays are instrumental in the detection and monitoring of a wide array of diseases, including viral infections such as COVID-19, tuberculosis, pneumonia, lung cancer, and various cardiac and pulmonary conditions. To enhance the accuracy of diagnosis, artificial intelligence (AI) algorithms, particularly deep learning models like Convolutional Neural Networks (CNNs), are employed. However, these deep learning models demand a substantial and varied dataset to attain optimal precision. Generative Adversarial Networks (GANs) can be employed to create new data, thereby supplementing the existing dataset and enhancing the accuracy of deep learning models. Nevertheless, GANs have their limitations, such as issues related to stability, convergence, and the ability to distinguish between authentic and fabricated data. In order to overcome these challenges and advance the detection and classification of normal and abnormal CXR images, this study introduces a technique known as DCWGAN (Diverse Conditional Wasserstein GAN) for generating synthetic chest X-ray (CXR) images. The study evaluates the effectiveness of the DCWGAN technique using the ResNet50 model and compares its results with those obtained using the traditional GAN approach. The findings reveal that the ResNet50 model trained on the DCWGAN-generated dataset outperformed the model trained on the classic GAN-generated dataset. Specifically, the ResNet50 model utilizing DCWGAN synthetic images achieved an accuracy of 0.961, precision of 0.955, recall of 0.970, and F1-measure of 0.963. These results indicate promising potential for the early detection of diseases in CXR images using this approach.
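The reported metrics are internally consistent, since F1 is the harmonic mean of precision and recall; the precision/recall pair reproduces the stated F1 up to the rounding already present in the inputs:

```python
# Quick cross-check of the reported classification metrics.
def f1_score(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# The paper's rounded precision (0.955) and recall (0.970):
print(round(f1_score(0.955, 0.970), 3))  # 0.962, consistent with the
# reported 0.963 once rounding of the inputs is taken into account
```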

Keywords: CNN, classification, deep learning, GAN, Resnet50

Procedia PDF Downloads 75
4904 Re-Evaluation of Field X Located in Northern Lake Albert Basin to Refine the Structural Interpretation

Authors: Calorine Twebaze, Jesca Balinga

Abstract:

Field X is located on the eastern shore of Lake Albert, Uganda, on the rift flank, where the gross sedimentary fill is typically less than 2,000 m. The field was discovered in 2006 and encountered about 20.4 m of net pay across three (3) stratigraphic intervals within the discovery well. The field covers an area of 3 km2, with the structural configuration comprising a 3-way dip-closed hanging wall anticline that seals against the basement to the southeast along the bounding fault. Field X had been mapped on reprocessed 3D seismic data, originally acquired in 2007 and reprocessed in 2013. The seismic data quality is good across the field, and the reprocessing work reduced the uncertainty in the location of the bounding fault and enhanced the lateral continuity of reservoir reflectors. The current study re-evaluated Field X to refine the fault interpretation and understand the structural uncertainties associated with the field. The seismic data and three (3) well datasets were used during the study. The evaluation followed standard workflows using Petrel software and structural attribute analysis, spanning seismic-to-well ties, structural interpretation, and structural uncertainty analysis. Analysis of the well ties generated for the three (3) wells provided a geophysical interpretation consistent with the geological picks. The generated time-depth curves showed a general increase in velocity with burial depth; the separation in curve trends observed below 1,100 m was mainly attributed to minimal lateral variation in velocity between the wells. In addition to attribute analysis, three velocity modeling approaches were evaluated: the time-depth curve, V0 + kZ, and average velocity methods. The generated models were calibrated at well locations using well tops to obtain the best velocity model for Field X.
The time-depth method produced the most reliable depth surfaces, with good structural coherence between the TWT and depth maps and a minimal error of 2 to 5 m at well locations. Both the NNE-SSW rift border fault and the minor faults in the existing interpretation were re-evaluated. The new interpretation, however, delineated an E-W trending fault in the northern part of the field that had not been interpreted before. The fault was interpreted at all stratigraphic levels; it thus propagates from the basement to the surface and is an active fault today. It was also noted that the field is only lightly faulted, with more faults in its deeper part. The major structural uncertainties defined included: (1) the time horizons, owing to reduced data quality, especially in the deeper parts of the structure, for which an error equal to one-third of the reflection time thickness was assumed; (2) the check-shot analysis, which showed varying velocities within the wells and thus varying depth values for each well; and (3) the very few average velocity points available from the limited wells, which produced a pessimistic average velocity model.
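The V0 + kZ approach evaluated in the study has a well-known closed form: integrating dz/dt = V0 + kz from the surface gives z(t) = (V0/k)(exp(k·t_owt) - 1), with t_owt the one-way travel time. The sketch below uses illustrative V0 and k values, not the Field X calibration:

```python
# Sketch of linear-velocity (V0 + kZ) time-to-depth conversion.
import math

def depth_from_twt(twt_s, v0_m_s, k_per_s):
    """Depth in metres for a given two-way time, V(z) = V0 + k*z."""
    t_owt = twt_s / 2.0
    return (v0_m_s / k_per_s) * (math.exp(k_per_s * t_owt) - 1.0)

# e.g. 1.2 s TWT with assumed V0 = 1800 m/s and k = 0.5 1/s
print(f"{depth_from_twt(1.2, 1800.0, 0.5):.0f} m")  # 1259 m
```

The result is consistent with the shallow (<2,000 m) sedimentary fill described for the rift flank.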

Keywords: 3D seismic data interpretation, structural uncertainties, attribute analysis, velocity modelling approaches

Procedia PDF Downloads 50
4903 Influence of Counter-Face Roughness on the Friction of Bionic Microstructures

Authors: Haytam Kasem

Abstract:

The problem of quick and easily reversible attachment has become of great importance in different fields of technology. For this reason, during the last decade, a new field of adhesion science has emerged, essentially inspired by animals and insects that, during their natural evolution, have developed remarkable biological attachment systems allowing them to adhere to and run on uneven walls and ceilings. Potential applications of engineering bio-inspired solutions include climbing robots, handling systems for wafers in nanofabrication facilities, and mobile sensor platforms, to name a few. However, despite the efforts to apply bio-inspired patterned adhesive surfaces to the biomedical field, they are still in the early stages compared with their conventional uses in the other industries mentioned above. In fact, some critical issues still need to be addressed for the wide usage of bio-inspired patterned surfaces as advanced biomedical platforms. For example, the surface durability and long-term stability of surfaces with high adhesive capacity should be improved, as should the friction and adhesion capacities of these bio-inspired microstructures when they contact rough surfaces. One of the well-known prototypes for bio-inspired attachment systems is the biomimetic wall-shaped hierarchical microstructure for gecko-like attachment. Although the physical background of these attachment systems is widely understood, the influence of counter-face roughness and its relationship with the friction force generated when sliding against a wall-shaped hierarchical microstructure have yet to be fully analyzed and understood. To elucidate the effect of counter-face roughness on the friction of the biomimetic wall-shaped hierarchical microstructure, we replicated the isotropic topography of 12 different surfaces using replicas made of the same epoxy material.
The different counter-faces were fully characterized under a 3D optical profilometer to measure roughness parameters. The friction forces generated by the spatula-shaped microstructure in contact with the tested counter-faces were measured on a home-made tribometer and compared with the friction forces generated by the spatulae in contact with a smooth reference. It was found that classical roughness parameters, such as the average roughness Ra, could not explain the topography-related variation in friction force. This led us to develop an integrated roughness parameter obtained by combining the mean asperity radius of curvature (R), the asperity density (η), the standard deviation of asperity heights (σ), and the mean asperity angle (SDQ). This new integrated parameter is capable of explaining the variation in the friction measurements. Based on the experimental results, we developed and validated an analytical model to predict the variation of the friction force as a function of the counter-face roughness parameters and the applied normal load.
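Some of the individual roughness ingredients named above can be computed from a sampled profile as follows. This is only an illustration on a synthetic 1-D profile: the authors' exact combination into one integrated parameter is not given in the abstract and is not reproduced here, and SDQ is computed as an RMS slope (its common profilometry definition):

```python
# Illustrative roughness statistics from a sampled 1-D height profile.
import math

def roughness_stats(heights, dx):
    n = len(heights)
    mean = sum(heights) / n
    ra = sum(abs(h - mean) for h in heights) / n                  # average roughness
    sigma = math.sqrt(sum((h - mean) ** 2 for h in heights) / n)  # height deviation
    slopes = [(heights[i + 1] - heights[i]) / dx for i in range(n - 1)]
    sdq = math.sqrt(sum(s * s for s in slopes) / len(slopes))     # RMS slope
    return ra, sigma, sdq

profile = [0.0, 1.0, 0.0, -1.0, 0.0, 1.0, 0.0, -1.0]  # synthetic heights, um
ra, sigma, sdq = roughness_stats(profile, dx=1.0)
print(ra, round(sigma, 4), sdq)  # 0.5 0.7071 1.0
```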

Keywords: friction, bio-mimetic micro-structure, counter-face roughness, analytical model

Procedia PDF Downloads 234
4902 The Digitalization of Occupational Health and Safety Training: A Fourth Industrial Revolution Perspective

Authors: Deonie Botha

Abstract:

Digital transformation and the digitalization of occupational health and safety training have grown exponentially due to a variety of contributing factors. The literature suggests that digitalization has numerous benefits but also associated challenges. The aim of the paper is to develop an understanding of both the perceived benefits and the challenges of digitalization in an occupational health and safety context, in an effort to design and develop e-learning interventions that optimize the benefits of digitalization and address the associated challenges. The paper proposes, deliberates on, and tests the design principles of an e-learning intervention to ensure alignment with the requirements of a digitally transformed environment. The results of the research are based on a literature review regarding the requirements and effects of the Fourth Industrial Revolution on learning, and e-learning in particular. The findings of the literature review are enhanced with empirical research in the form of a case study conducted in an organization that designs and develops e-learning content in the occupational health and safety industry. The primary findings of the research indicated that: (i) the requirements of learners and organizations in respect of e-learning differ from those of a pre-Fourth Industrial Revolution work setting; (ii) the design principles of an e-learning intervention need to be aligned with the entire value chain of the organization; (iii) digital twins support and enhance the design and development of e-learning; (iv) learning should incorporate a multitude of sensory experiences and should not be based on visual stimulation only; and (v) data generated as a result of e-learning interventions should be incorporated into big data streams to be analyzed and become actionable.
It is therefore concluded that there is general consensus on the requirements that e-learning interventions need to adhere to in a digitally transformed occupational health and safety work environment. The challenge remains for organizations to incorporate data generated as a result of e-learning interventions into the digital ecosystem of the organization.

Keywords: digitalization, training, fourth industrial revolution, big data

Procedia PDF Downloads 149