Search results for: robust scheduling
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1368

588 Residual Lifetime Estimation for Weibull Distribution by Fusing Expert Judgements and Censored Data

Authors: Xiang Jia, Zhijun Cheng

Abstract:

The residual lifetime of a product is the operating time between the current time and the time point at which failure occurs. Residual lifetime estimation is rather important in reliability analysis. To predict the residual lifetime, it is necessary to assume or verify a particular distribution that the lifetime of the product follows, and the two-parameter Weibull distribution is frequently adopted to describe lifetime in reliability engineering. Due to time constraints and cost considerations, a life testing experiment is usually terminated before all the units have failed, so censored data are usually collected. Other information can also be obtained for reliability analysis; expert judgements are considered here because experts can often provide useful information concerning reliability. Therefore, the residual lifetime is estimated for the Weibull distribution by fusing censored data and expert judgements in this paper. First, closed-form expressions for the point estimate and confidence interval of the residual lifetime under the Weibull distribution are presented. Next, the expert judgements are regarded as prior information, and a procedure for determining the prior distribution of the Weibull parameters is developed. For completeness, the cases of a single expert judgement and of multiple expert judgements are both considered. Further, the posterior distribution of the Weibull parameters is derived. Since it is difficult to derive the posterior distribution of the residual lifetime directly, a sample-based method is proposed to generate posterior samples of the Weibull parameters based on the Markov Chain Monte Carlo (MCMC) method, and these samples are used to obtain the Bayes estimate and credible interval for the residual lifetime. Finally, an illustrative example is discussed to show the application. It demonstrates that the proposed method is simple, satisfactory, and robust.
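
As a rough illustration of the sample-based procedure described above, the sketch below draws posterior samples of the Weibull parameters from censored data with a simple Metropolis sampler and converts them into a credible interval for the median residual lifetime at a given current age. The data, priors, and current age t0 are hypothetical placeholders, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical censored lifetimes: (time, event) with event=1 for failure, 0 for right-censored.
times  = np.array([120., 250., 310., 400., 520., 610., 700., 800., 800., 800.])
events = np.array([1, 1, 1, 1, 1, 1, 1, 0, 0, 0])

def log_post(beta, eta):
    """Weibull log-likelihood for censored data plus weakly informative log-normal priors
    (the priors merely stand in for the expert-judgement-based prior of the paper)."""
    if beta <= 0 or eta <= 0:
        return -np.inf
    z = times / eta
    ll = np.sum(events * (np.log(beta / eta) + (beta - 1) * np.log(z)) - z**beta)
    lp = -0.5 * (np.log(beta) ** 2 + np.log(eta / 500.0) ** 2)
    return ll + lp

# Random-walk Metropolis on (log beta, log eta)
n_iter, samples = 20000, []
theta = np.log([1.5, 500.0])
lp_cur = log_post(*np.exp(theta))
for _ in range(n_iter):
    prop = theta + rng.normal(scale=0.05, size=2)
    lp_prop = log_post(*np.exp(prop))
    if np.log(rng.random()) < lp_prop - lp_cur:
        theta, lp_cur = prop, lp_prop
    samples.append(np.exp(theta))
samples = np.array(samples[5000:])          # discard burn-in

# Median residual lifetime at current age t0 for each posterior draw:
# solve S(t0 + x) / S(t0) = 0.5  =>  x = eta * ((t0/eta)**beta + ln 2)**(1/beta) - t0
t0 = 300.0
beta_s, eta_s = samples[:, 0], samples[:, 1]
res_life = eta_s * ((t0 / eta_s) ** beta_s + np.log(2.0)) ** (1.0 / beta_s) - t0

print("Bayes estimate of median residual life:", res_life.mean())
print("95% credible interval:", np.percentile(res_life, [2.5, 97.5]))
```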

Keywords: expert judgements, information fusion, residual lifetime, Weibull distribution

Procedia PDF Downloads 127
587 Modern Well Logs Technology to Improve Geological Model for Libyan Deep Sand Stone Reservoir

Authors: Tarek S. Duzan, Fisal Ben Ammer, Mohamed Sula

Abstract:

In some places within the Sirt Basin, Libya, it has been observed that seismic data below the pre-Upper Cretaceous unconformity (PUK) cannot resolve large-scale structural features and are unable to fully determine reservoir delineation. Seismic artifacts (multiples) are observed in the reservoir zone (Nubian Formation) below the PUK, which complicates the process of seismic interpretation. The nature of the unconformity and the structures below it are still ambiguous and not fully understood, which creates a significant gap in characterizing the geometry of the reservoir; this uncertainty, combined with the lack of reliable seismic data, makes it difficult to build a robust geological model. High-resolution dipmeter data are highly useful in steeply dipping zones. This paper uses FMI and OBMI borehole images (dipmeter) to analyze the structures below the PUK unconformity from two wells drilled recently in the North Gialo field (a mature reservoir). In addition, the borehole images introduce new evidence that the PUK unconformity is angular and that the bedding planes within the Nubian Formation (below the PUK) are significantly tilted. Structural dips extracted from the high-resolution borehole images are used to construct a new geological model using the latest software technology. It is therefore important to use advanced well-log technology such as FMI-HD for any future drilling and to update the existing model in order to minimize the structural uncertainty.

Keywords: FMI (formation micro imager), OBMI (oil base mud imager), UBI (ultrasonic borehole imager), Nubian sandstone reservoir in North Gialo

Procedia PDF Downloads 303
586 Offline High Voltage Diagnostic Test Findings on 15MVA Generator of Basochhu Hydropower Plant

Authors: Suprit Pradhan, Tshering Yangzom

Abstract:

Even with the availability of modern online insulation diagnostic technologies like partial discharge monitoring, measurements such as Dissipation Factor (tanδ), DC high voltage insulation currents, Polarization Index (PI), and insulation resistance are still widely used as diagnostic tools to assess the condition of stator insulation in hydropower plants. To evaluate the condition of the stator winding insulation in one of the generators that has been in operation since 1999, diagnostic tests were performed on the stator bars of the 15 MVA generators of Basochhu Hydropower Plant. This paper presents a diagnostic study of data gathered from measurements performed in 2015 and 2016 as part of regular maintenance, since no proper ageing data had been maintained since commissioning. Measurement results of Dissipation Factor, DC high potential tests, and Polarization Index are discussed with regard to their effectiveness in assessing the ageing condition of the stator insulation. After a brief review of the theoretical background, the strengths of each diagnostic method in detecting symptoms of insulation deterioration are identified. The results observed from Basochhu Hydropower Plant are taken into consideration to conclude that Polarization Index and DC high voltage insulation current measurements are best suited for the detection of humidity and contamination problems, and that Dissipation Factor measurement is a robust indicator of long-term ageing caused by oxidative degradation.
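
For reference, the two index tests mentioned above are simple ratios of insulation resistance readings; the snippet below computes them from hypothetical readings (the values are illustrative, not the Basochhu measurements).

```python
# Hypothetical insulation resistance readings in megaohms (not actual Basochhu data)
ir_30s, ir_60s, ir_10min = 850.0, 1100.0, 2600.0

dar = ir_60s / ir_30s       # dielectric absorption ratio (1-minute / 30-second reading)
pi = ir_10min / ir_60s      # polarization index (10-minute / 1-minute reading)

print(f"DAR = {dar:.2f}, PI = {pi:.2f}")
# A PI above roughly 2.0 is commonly read as dry, healthy insulation;
# values near 1.0 often point to moisture or contamination.
```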

Keywords: dissipation factor (tanδ), polarization index (PI), DC high voltage insulation current, insulation resistance (IR), tan delta tip-up, dielectric absorption ratio

Procedia PDF Downloads 293
585 Cybersecurity Challenges in Africa

Authors: Chimmoe Fomo Michelle Larissa

Abstract:

The challenges of cybersecurity in Africa are increasingly significant as the continent undergoes rapid digital transformation. With the rise of internet connectivity, mobile phone usage, and digital financial services, Africa faces unique cybersecurity threats. The significance of this study lies in understanding these threats and the multifaceted challenges that hinder effective cybersecurity measures across the continent. The methodologies employed in this study include a comprehensive analysis of existing cybersecurity frameworks in various African countries, surveys of key stakeholders in the digital ecosystem, and case studies of cybersecurity incidents. These methodologies aim to provide a detailed understanding of the current cybersecurity landscape, identify gaps in existing policies, and evaluate the effectiveness of implemented security measures. Major findings of the study indicate that Africa faces numerous cybersecurity challenges, including inadequate regulatory frameworks, insufficient cybersecurity awareness, and a shortage of skilled professionals. Additionally, the prevalence of cybercrime, such as financial fraud, data breaches, and ransomware attacks, exacerbates the situation. The study also highlights the role of international cooperation and regional collaboration in addressing these challenges and improving overall cybersecurity resilience. In conclusion, addressing cybersecurity challenges in Africa requires a multifaceted approach that involves strengthening regulatory frameworks, enhancing public awareness, and investing in cybersecurity education and training. The study underscores the importance of regional and international collaboration in building a robust cybersecurity infrastructure capable of mitigating the risks associated with the continent's digital growth.

Keywords: Africa, cybersecurity, challenges, digital infrastructure, cybercrime

Procedia PDF Downloads 14
584 The Effect of Fibre Orientation on the Mechanical Behaviour of Skeletal Muscle: A Finite Element Study

Authors: Christobel Gondwe, Yongtao Lu, Claudia Mazzà, Xinshan Li

Abstract:

Skeletal muscle plays an important role in the human body by generating voluntary forces and facilitating body motion. However, the mechanical properties and behaviour of skeletal muscle are still not comprehensively understood. Various robust engineering techniques have therefore been applied to better elucidate the mechanical behaviour of skeletal muscle. Muscle mechanics are considered to be highly governed by the architecture of the fibre orientations. The aim of this study was therefore to investigate the effect of different fibre orientations on the mechanical behaviour of skeletal muscle. In this study, a continuum mechanics approach, finite element (FE) analysis, was applied to the left biceps femoris long head to determine the contractile mechanism of the muscle using Hill's three-element model. The geometry of the muscle was segmented from magnetic resonance images. The muscle was modelled as a quasi-incompressible hyperelastic (Mooney-Rivlin) material. Two types of fibre orientation were implemented: one with an idealised fibre arrangement, i.e. parallel single-direction fibres running from the muscle origin to the insertion sites, and the other with a curved fibre arrangement aligned with the muscle shape. The second fibre arrangement was implemented through the finite element method with non-uniform rational B-splines (FEM-NURBS) by means of user material (UMAT) subroutines. The stress-strain behaviour of the muscle was investigated under idealised exercise conditions and will be further analysed under physiological conditions. The results of the two FE models have been extracted and qualitatively compared.
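
For context, the quasi-incompressible two-parameter Mooney-Rivlin strain-energy function referenced above is commonly written in the following generic form (C10, C01, and D1 are material constants; the specific values used in the study are not reproduced here):

```latex
W = C_{10}\,(\bar{I}_1 - 3) + C_{01}\,(\bar{I}_2 - 3) + \frac{1}{D_1}\,(J - 1)^2
```

where Ī1 and Ī2 are the invariants of the isochoric right Cauchy-Green deformation tensor, J is the volume ratio, and the 1/D1 term enforces quasi-incompressibility.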

Keywords: FEM-NURBS, finite element analysis, Mooney-Rivlin hyperelastic, muscle architecture

Procedia PDF Downloads 463
583 DLtrace: Toward Understanding and Testing Deep Learning Information Flow in Deep Learning-Based Android Apps

Authors: Jie Zhang, Qianyu Guo, Tieyi Zhang, Zhiyong Feng, Xiaohong Li

Abstract:

With the widespread popularity of mobile devices and the development of artificial intelligence (AI), deep learning (DL) has been extensively applied in Android apps. Compared with traditional Android apps (traditional apps), deep learning-based Android apps (DL-based apps) need to use more third-party application programming interfaces (APIs) to complete complex DL inference tasks. However, existing methods (e.g., FlowDroid) for detecting sensitive information leakage in Android apps cannot be directly used for DL-based apps, as they have difficulty detecting third-party APIs. To solve this problem, we design DLtrace, a new static information flow analysis tool that can effectively recognize third-party APIs. With our proposed trace and detection algorithms, DLtrace can also efficiently detect privacy leaks caused by sensitive APIs in DL-based apps. Moreover, using DLtrace, we summarize the non-sequential characteristics of DL inference tasks in DL-based apps and the specific functionalities provided by DL models for such apps. We propose two formal definitions to deal with the common polymorphism and anonymous inner-class problems in the Android static analyzer. We conducted an empirical assessment with DLtrace on 208 popular DL-based apps in the wild and found that 26.0% of the apps suffered from sensitive information leakage. Furthermore, DLtrace has more robust performance than FlowDroid in detecting and identifying third-party APIs. The experimental results demonstrate that DLtrace extends FlowDroid in understanding DL-based apps and detecting security issues therein.

Keywords: mobile computing, deep learning apps, sensitive information, static analysis

Procedia PDF Downloads 144
582 A New Multi-Target, Multi-Agent Search and Rescue Path Planning Approach

Authors: Jean Berger, Nassirou Lo, Martin Noel

Abstract:

Perfectly suited to natural or man-made emergency and disaster management situations such as floods, earthquakes, tornadoes, or tsunamis, multi-target search path planning for a team of rescue agents is known to be computationally hard, and most techniques developed so far fall short of successfully estimating the optimality gap. A novel mixed-integer linear programming (MIP) formulation is proposed to optimally solve the multi-target, multi-agent discrete search and rescue (SAR) path planning problem. Aimed at maximizing the cumulative probability of successful target detection, it captures anticipated feedback information associated with possible observation outcomes resulting from projected path execution, while modeling agent discrete actions over all possible moving directions. Problem modeling further takes advantage of a network representation to encompass decision variables, expedite compact constraint specification, and lead to substantial problem-solving speed-up. The proposed MIP approach uses the CPLEX optimization machinery, efficiently computing near-optimal solutions for practical-size problems, while giving a robust upper bound obtained from Lagrangean integrality constraint relaxation. Should a target be positively detected during plan execution, a new problem instance would simply be reformulated from the current state and then solved over the next decision cycle. A computational experiment shows the feasibility and the value of the proposed approach.
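
To make the flavour of such a MIP formulation concrete, here is a deliberately stripped-down sketch: a single agent moving on a small grid over a fixed horizon, maximizing the total detection probability of the cells it visits, solved with the open-source PuLP/CBC stack rather than CPLEX. It omits the multi-agent, anticipated-feedback, and Lagrangean-bound aspects of the paper's formulation, and the grid and probabilities are made up.

```python
import pulp

T, N = 6, 4                                           # time steps, N x N grid
cells = list(range(N * N))                            # cell ids, row-major
p = [0.02 * (c % 7 + 1) for c in cells]               # toy detection probabilities

def neighbors(c):
    r, col = divmod(c, N)
    steps = [(0, 0), (-1, 0), (1, 0), (0, -1), (0, 1)]  # stay put or move in the 4-neighbourhood
    return [i * N + j for dr, dc in steps
            for i, j in [(r + dr, col + dc)] if 0 <= i < N and 0 <= j < N]

prob = pulp.LpProblem("toy_search_path", pulp.LpMaximize)
x = pulp.LpVariable.dicts("x", (range(T), cells), cat="Binary")   # x[t][c]: agent in cell c at time t
y = pulp.LpVariable.dicts("y", cells, cat="Binary")               # y[c]: cell c searched at least once

prob += pulp.lpSum(p[c] * y[c] for c in cells)        # objective: total detection probability (toy surrogate)

prob += x[0][0] == 1                                  # fixed start cell
for t in range(T):
    prob += pulp.lpSum(x[t][c] for c in cells) == 1   # exactly one cell occupied per step
for t in range(T - 1):
    for c in cells:                                   # reachable only from the current neighbourhood
        prob += x[t + 1][c] <= pulp.lpSum(x[t][nb] for nb in neighbors(c))
for c in cells:
    prob += y[c] <= pulp.lpSum(x[t][c] for t in range(T))   # a cell counts only if visited

prob.solve(pulp.PULP_CBC_CMD(msg=False))
path = [c for t in range(T) for c in cells if x[t][c].value() == 1]
print("visited cells:", path, "objective:", pulp.value(prob.objective))
```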

Keywords: search path planning, search and rescue, multi-agent, mixed-integer linear programming, optimization

Procedia PDF Downloads 354
581 Toehold Mediated Shape Transition of Nucleic Acid Nanoparticles

Authors: Emil F. Khisamutdinov

Abstract:

Development of functional materials undergoing structural transformations in response to an external stimulus such as environmental changes (pH, temperature, etc.), the presence of particular proteins, or short oligonucleotides are of great interest for a variety of applications ranging from medicine to electronics. The dynamic operations of most nucleic acid (NA) devices, including circuits, nano-machines, and biosensors, rely on networks of NA strand displacement processes in which an external or stimulus strand displaces a target strand from a DNA or RNA duplex. The rate of strand displacement can be greatly increased by the use of “toeholds,” single-stranded regions of the target complex to which the invading strand can bind to initiate the reaction, forming additional base pairs that provide a thermodynamic driving force for transformation. Herein, we developed a highly robust nanoparticle shape transition, sequentially transforming DNA polygons from one shape to another using the toehold-mediated DNA strand displacement technique. The shape transformation was confirmed by agarose gel electrophoresis and atomic force microscopy. Furthermore, we demonstrate that our approach is applicable for RNA shape transformation from triangle to square, which can be detected by fluorescence emission from malachite green binding RNA aptamer. Using gel-shift and fluorescence assays, we demonstrated efficient transformation occurs at isothermal conditions (37°C) that can be implemented within living cells as reporter molecules. This work is intended to provide a simple, cost-effective, and straightforward model for the development of biosensors and regulatory devices in nucleic acid nanotechnology.

Keywords: RNA nanotechnology, bionanotechnology, toehold mediated DNA switch, RNA split fluorogenic aptamers

Procedia PDF Downloads 59
580 Anti-Corruption, an Important Challenge for the Construction Industry!

Authors: Ahmed Stifi, Sascha Gentes, Fritz Gehbauer

Abstract:

The construction industry is perhaps one of the oldest industries in the world. Ancient monuments like the Egyptian pyramids, Greek and Roman temples such as the Parthenon and the Pantheon, robust bridges, old Roman theatres, and citadels are the best testament to that. The industry also has a symbiotic relationship with other industries: heavy engineering industries provide construction machinery, the chemical industry develops innovative construction materials, the finance sector provides funding solutions for complex construction projects, and so on. The construction industry is not only mammoth but also very complex in nature. Because of this complexity, it is prone to various problems that may hamper its growth. A comparative study of this industry with others shows that it is associated with tardiness and delay, especially with regard to managerial aspects and the triple constraint (time, cost, and scope). While some institutes cite the complexity involved as a major reason, others, such as the lean construction community, refer to the waste produced across the construction process as the prime reason. This paper introduces corruption as one of the prime factors for such delays. To support this, many international reports and studies are available showing that the construction industry is one of the most corrupt sectors worldwide, and that corruption can take place throughout the project cycle, comprising project selection, planning, design, funding, pre-qualification, tendering, execution, operation and maintenance, and even the reconstruction phase. It also occurs in many forms, such as bribery, fraud, extortion, collusion, embezzlement, and conflict of interest. As a solution for coping with corruption in the construction industry, the paper introduces integrity as a key factor and builds a new integrity framework to develop and implement an integrity management system for construction companies and construction projects.

Keywords: corruption, construction industry, integrity, lean construction

Procedia PDF Downloads 363
579 Barriers to Public Innovation in Colombia: Case Study in Central Administrative Region

Authors: Yessenia Parrado, Ana Barbosa, Daniela Mahe, Sebastian Toro, Jhon Garcia

Abstract:

Public innovation has gained strength in recent years in response to the need to find new strategies or mechanisms for interaction between government entities and citizens. The Colombian government has accordingly been promoting policies aimed at strengthening innovation as a fundamental aspect of the work of public entities. However, in order to develop the capacities of public servants, and therefore of the institutions and organizations to which they belong, it is necessary to understand the context in which they operate in their daily work. This article compiles the work developed by the laboratory of innovation, creativity, and new technologies LAB101 of the National University of Colombia for the National Department of Planning. A case study was developed in the central region of Colombia, made up of five departments, through the construction of instruments based on quantitative techniques combined with qualitative analysis through semi-structured interviews, in order to understand the perception of possible barriers to innovation and the obstacles that have prevented the acceleration of transformation within public organizations. From the information collected, different analyses are carried out that allow a more robust explanation of the results obtained, and a set of categories is established to group the characteristics associated with the difficulties that officials perceive in innovating, which are later conceived as barriers. Finally, an indicator was proposed to measure the degree of innovation within public entities so that a metric can be tracked in the future. The main findings of this study show three key components to be strengthened in public entities and organizations: governance, knowledge management, and the promotion of collaborative workspaces.

Keywords: barriers, enablers, management, public innovation

Procedia PDF Downloads 97
578 Salvage Reconstruction of Intraoral Dehiscence following Free Fibular Flap with a Superficial Temporal Artery Islandized Flap (STAIF)

Authors: Allyne Topaz

Abstract:

Intraoral dehiscence compromises free fibula flaps following mandibular reconstruction. Salivary contamination risks thrombosis of the microvascular anastomosis and hardware infection. The superficial temporal artery islandized flap (STAIF) offers an efficient, non-microsurgical reconstructive option for regaining intraoral competency for a time-sensitive complication. Methods: The STAIF flap is based on the superficial temporal artery coursing along the anterior hairline. The flap is mapped with the assistance of a Doppler probe. The width of the skin paddle is chosen based on the ability to close the donor site. The flap is taken down to the level of the zygomatic arch and tunneled into the mouth. Results: We present a case of a patient who underwent mandibular reconstruction with a free fibula flap after a traumatic shotgun wound. The patient developed repeated intraoral dehiscence following failed local buccal and floor-of-mouth flaps, leading to salivary contamination of the flap and hardware. The intraoral dehiscence was successfully salvaged on the third attempt with a STAIF flap. Conclusions: Intraoral dehiscence creates a complication requiring urgent attention to prevent loss of the free fibula flap after mandibular reconstruction. The STAIF is a non-microsurgical option for restoring intraoral competency. This robust, axially vascularized skin paddle may be split for intra- and extra-oral coverage as needed and can be an important tool in the reconstructive armamentarium.

Keywords: free fibula flap, intraoral dehiscence, mandibular reconstruction, superficial temporal artery islandized flap

Procedia PDF Downloads 118
577 The Effect of Technology-Facilitated Lesson Study toward Teachers' Computer-Assisted Language Learning Competencies

Authors: Yi-Ning Chang

Abstract:

With the rapid advancement of technology, it has become crucial for educators to integrate technology adeptly into their teaching and to develop robust Computer-Assisted Language Learning (CALL) competency. Addressing this need, the present study adopted a technology-based Lesson Study approach to assess its impact on the CALL competency and professional capabilities of EFL teachers. Additionally, the study delved into teachers' perceptions of the benefits derived from participating in the creation of technologically integrated lesson plans. The iterative process of technology-based Lesson Study facilitated ample peer discussion, enabling teachers to flexibly design and implement lesson plans that incorporate various technological tools. This 15-week study included 10 in-service teachers from a university of science and technology in central Taiwan. The collected data included pre- and post-lesson planning scores, pre- and post-TPACK survey scores, classroom observation forms, designed lesson plans, and reflective essays. The pre- and post-lesson planning and TPACK survey scores were analyzed using paired-sample t-tests; the teachers' reflective essays were analyzed using content analysis. The findings revealed that the teachers' lesson planning ability and CALL competencies improved. Teachers perceived a better understanding of integrating technology with teaching subjects, more effective teaching skills, and a deeper understanding of technology. Pedagogical implications and future studies are also discussed.

Keywords: CALL, language learning, lesson study, lesson plan

Procedia PDF Downloads 13
576 Critical Success Factors Quality Requirement Change Management

Authors: Jamshed Ahmad, Abdul Wahid Khan, Javed Ali Khan

Abstract:

Managing software quality requirement change is a difficult task in the field of software engineering. Avoiding incoming changes results in user dissatisfaction, while accommodating too many requirement changes may delay product delivery. Poor requirements management is considered a primary cause of software failure, and the problem becomes more challenging in global software outsourcing. Addressing success factors in quality requirement change management is needed today due to the frequent change requests from end-users. In this research study, success factors are identified and scrutinized with the help of a systematic literature review (SLR). In total, 16 success factors were identified that significantly impact software quality requirement change management. The findings show that Proper Requirement Change Management, Rapid Delivery, Quality Software Product, Access to Market, Project Management, Skills and Methodologies, Low Cost/Effort Estimation, Clear Plan and Road Map, Agile Processes, Low Labor Cost, User Satisfaction, Communication/Close Coordination, Proper Scheduling and Time Constraints, Frequent Technological Changes, Robust Model, and Geographical Distribution/Cultural Differences are the key factors that influence software quality requirement change. The recognized success factors were validated with the help of various research methods, i.e., case studies, interviews, surveys, and experiments. These factors were then examined by continent, database, company size, and time period. Based on these findings, requirement changes can be implemented in a better way.

Keywords: global software development, requirement engineering, systematic literature review, success factors

Procedia PDF Downloads 185
575 Hybrid Approach for Face Recognition Combining Gabor Wavelet and Linear Discriminant Analysis

Authors: A. Annis Fathima, V. Vaidehi, S. Ajitha

Abstract:

Face recognition systems find many applications in surveillance and human-computer interaction. As these applications are of great importance and demand more accuracy, greater robustness is expected from the face recognition system with less computation time. In this paper, a hybrid approach for face recognition combining Gabor Wavelets and Linear Discriminant Analysis (HGWLDA) is proposed. The normalized input grayscale image is approximated and reduced in dimension to lower the processing overhead for the Gabor filters. This image is convolved with a bank of Gabor filters with varying scales and orientations. LDA, a subspace analysis technique, is used to reduce the intra-class variance and maximize the inter-class variance. The techniques used are 2-dimensional Linear Discriminant Analysis (2D-LDA), 2-dimensional bidirectional LDA ((2D)2LDA), and Weighted 2-dimensional bidirectional Linear Discriminant Analysis (Wt (2D)2 LDA). LDA reduces the feature dimension by extracting the features with greater variance. A k-Nearest Neighbour (k-NN) classifier is used to classify and recognize the test image by comparing its features with each of the training set features. The HGWLDA approach is robust against illumination conditions, as the Gabor features are illumination invariant. This approach also aims at a better recognition rate using a smaller number of features for varying expressions. The performance of the proposed HGWLDA approach is evaluated using the AT&T database, the MIT-India face database, and the faces94 database. It is found that the proposed HGWLDA approach provides better results than the existing Gabor approach.
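
A minimal sketch of the same pipeline idea follows, using OpenCV's Gabor kernels and scikit-learn's standard (vector) LDA and k-NN in place of the 2D-LDA variants described above; the face images, labels, and filter-bank settings are assumed inputs chosen for illustration only.

```python
import cv2
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier

def gabor_features(img, size=32):
    """Downscale a grayscale face, convolve it with a small Gabor filter bank,
    and return the concatenated responses as one feature vector."""
    img = cv2.resize(img, (size, size)).astype(np.float32) / 255.0
    feats = []
    for theta in np.arange(0, np.pi, np.pi / 4):         # 4 orientations
        for lambd in (4.0, 8.0):                         # 2 scales
            kern = cv2.getGaborKernel((9, 9), sigma=2.0, theta=theta,
                                      lambd=lambd, gamma=0.5, psi=0)
            feats.append(cv2.filter2D(img, cv2.CV_32F, kern).ravel())
    return np.concatenate(feats)

def train_and_predict(train_imgs, train_labels, test_imgs):
    X = np.array([gabor_features(im) for im in train_imgs])
    lda = LinearDiscriminantAnalysis()                   # stands in for the 2D-LDA variants
    Xr = lda.fit_transform(X, train_labels)
    knn = KNeighborsClassifier(n_neighbors=1).fit(Xr, train_labels)
    Xt = lda.transform(np.array([gabor_features(im) for im in test_imgs]))
    return knn.predict(Xt)
```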

Keywords: face recognition, Gabor wavelet, LDA, k-NN classifier

Procedia PDF Downloads 451
574 Efficient Implementation of Finite Volume Multi-Resolution Weno Scheme on Adaptive Cartesian Grids

Authors: Yuchen Yang, Zhenming Wang, Jun Zhu, Ning Zhao

Abstract:

An easy-to-implement and robust finite volume multi-resolution Weighted Essentially Non-Oscillatory (WENO) scheme is proposed on adaptive Cartesian grids in this paper. The multi-resolution WENO scheme is combined with the ghost-cell immersed boundary method (IBM) and a wall-function technique to solve the Navier-Stokes equations. Unlike k-exact finite volume WENO schemes, which involve large amounts of extra storage, repeatedly solving the matrix generated by a least-squares method, or calculating optimal linear weights on adaptive Cartesian grids, the present methodology adds very little overhead and can be easily implemented in existing edge-based computational fluid dynamics (CFD) codes with minor modifications. Also, the linear weights of this adaptive finite volume multi-resolution WENO scheme can be any positive numbers provided that their sum is one. This bypasses the calculation of the optimal linear weights, and such a multi-resolution WENO scheme avoids dealing with negative linear weights on adaptive Cartesian grids. Some benchmark viscous problems are numerically solved to show the efficiency and good performance of this adaptive multi-resolution WENO scheme. Compared with a second-order edge-based method, the presented method can be implemented on an adaptive Cartesian grid with slight modification for high-Reynolds-number problems.
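
For readers unfamiliar with WENO reconstruction, the sketch below shows the classical fifth-order WENO-JS reconstruction of a cell-interface value in one dimension; it only illustrates the nonlinear weighting idea and is not the multi-resolution scheme or the adaptive Cartesian-grid machinery of the paper.

```python
import numpy as np

def weno5_left(v):
    """Classical WENO-JS reconstruction of the left-biased value at interface i+1/2.
    v = [v_{i-2}, v_{i-1}, v_i, v_{i+1}, v_{i+2}] are cell averages."""
    eps = 1e-6
    # candidate third-order reconstructions on the three sub-stencils
    p0 = (2 * v[0] - 7 * v[1] + 11 * v[2]) / 6.0
    p1 = (-v[1] + 5 * v[2] + 2 * v[3]) / 6.0
    p2 = (2 * v[2] + 5 * v[3] - v[4]) / 6.0
    # Jiang-Shu smoothness indicators
    b0 = 13 / 12 * (v[0] - 2 * v[1] + v[2]) ** 2 + 0.25 * (v[0] - 4 * v[1] + 3 * v[2]) ** 2
    b1 = 13 / 12 * (v[1] - 2 * v[2] + v[3]) ** 2 + 0.25 * (v[1] - v[3]) ** 2
    b2 = 13 / 12 * (v[2] - 2 * v[3] + v[4]) ** 2 + 0.25 * (3 * v[2] - 4 * v[3] + v[4]) ** 2
    # optimal linear weights and nonlinear weights
    g = np.array([0.1, 0.6, 0.3])
    a = g / (eps + np.array([b0, b1, b2])) ** 2
    w = a / a.sum()
    return w[0] * p0 + w[1] * p1 + w[2] * p2

# for smooth data, the reconstruction stays close to the underlying smooth profile
print(weno5_left(np.sin(np.linspace(0.0, 0.4, 5))))
```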

Keywords: adaptive mesh refinement method, finite volume multi-resolution WENO scheme, immersed boundary method, wall-function technique

Procedia PDF Downloads 136
573 Exploration of a Blockchain Assisted Framework for Through Baggage Interlining: Blocklining

Authors: Mary Rose Everan, Michael McCann, Gary Cullen

Abstract:

International travel journeys, by their nature, incorporate elements provided by multiple service providers such as airlines, rail carriers, airports, and ground handlers. Data needs to be stored by and exchanged between these parties in the process of managing the journey. The fragmented nature of this shared management of mutual clients is a limiting factor in the development of a seamless, hassle-free, end-to-end travel experience. Traditional interlining agreements attempt to facilitate many separate aspects of co-operation between service providers, typically between airlines and, to some extent, intermodal travel operators, including schedules, fares, ticketing, through check-in, and baggage handling. These arrangements rely on pre-agreement. The development of Virtual Interlining - that is, interlining facilitated by a third party (often but not always an airport) without formal pre-agreement by the airlines or rail carriers - demonstrates an underlying demand for a better quality end-to-end travel experience. Blockchain solutions are being explored in a number of industries and offer, at first sight, an immutable, single source of truth for this data, avoiding data conflicts and misinterpretation. Combined with Smart Contracts, they seemingly offer a more robust and dynamic platform for multi-stakeholder ventures, and even perhaps the ability to join and leave consortia dynamically. Applying blockchain to the intermodal interlining space – termed Blocklining in this paper - is complex and multi-faceted because of the many aspects of cooperation outlined above. To explore its potential, this paper concentrates on one particular dimension, that of through baggage interlining.

Keywords: aviation, baggage, blocklining, intermodal, interlining

Procedia PDF Downloads 132
572 Architecture - Performance Relationship in GPU Computing - Composite Process Flow Modeling and Simulations

Authors: Ram Mohan, Richard Haney, Ajit Kelkar

Abstract:

Current developments in computing have shown the advantage of using one or more Graphics Processing Units (GPUs) to boost the performance of many computationally intensive applications, but there are still limits to these GPU-enhanced systems. The major factors that contribute to the limitations of GPUs for High Performance Computing (HPC) can be categorized as hardware- and software-oriented in nature. Understanding how these factors affect performance is essential to developing efficient and robust application codes that employ one or more GPU devices as powerful co-processors for HPC computational modeling. This research and technical presentation focuses on the analysis and understanding of the intrinsic interrelationship of both hardware and software categories on computational performance for single and multiple GPU-enhanced systems, using a computationally intensive application that is representative of a large portion of the challenges confronting modern HPC. The representative application uses unstructured finite element computations for transient composite resin infusion process flow modeling as the computational core, whose characteristics and results reflect many other HPC applications via the sparse matrix system used for the solution of linear systems of equations. This work describes these various software and hardware factors and how they interact to affect the performance of computationally intensive applications, enabling more efficient development and porting of High Performance Computing applications, including current, legacy, and future large-scale computational modeling applications in various engineering and scientific disciplines.

Keywords: graphical processing unit, software development and engineering, performance analysis, system architecture and software performance

Procedia PDF Downloads 351
571 Particle Swarm Optimization Algorithm vs. Genetic Algorithm for Image Watermarking Based Discrete Wavelet Transform

Authors: Omaima N. Ahmad AL-Allaf

Abstract:

Over communication networks, images can easily be copied and distributed illegally, so copyright protection for authors and owners is necessary. Digital watermarking techniques therefore play an important role as a valid solution to authority problems. Digital image watermarking techniques are used to hide watermarks in images to achieve copyright protection and prevent illegal copying. Watermarks need to be robust to attacks and maintain data quality. We therefore discuss in this paper two approaches to image watermarking: the first is based on Particle Swarm Optimization (PSO) and the second on a Genetic Algorithm (GA). The discrete wavelet transform (DWT) is used with the two approaches separately in the embedding process to transform the cover image. Each of PSO and GA uses the correlation coefficient to detect the high-energy coefficients of the original image in which to hide the watermark bits. Many experiments were conducted for the two approaches with different values of the PSO and GA parameters. In the experiments, the PSO approach achieved better results, with a PSNR of 53 and an MSE of 0.0039, whereas the GA approach achieved a PSNR of 50.5 and an MSE of 0.0048 when using a population size of 100, 150 iterations, and 3×3 blocks. According to the results, a small block size can affect the quality of PSO/GA-based image watermarking because a small block size increases the search area of the watermarking image. Better PSO results were obtained when using a swarm size of 100.
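
A bare-bones illustration of DWT-domain embedding and the PSNR/MSE quality metrics mentioned above follows, using PyWavelets; it performs simple additive embedding in the approximation sub-band and leaves out the PSO/GA coefficient-selection step entirely. All parameter values and test data are arbitrary placeholders.

```python
import numpy as np
import pywt

def embed_watermark(cover, wm_bits, alpha=8.0):
    """Additively embed +/- alpha into the LL sub-band of a single-level Haar DWT."""
    cover = cover.astype(np.float64)
    LL, (LH, HL, HH) = pywt.dwt2(cover, "haar")
    flat = LL.ravel().copy()
    flat[: wm_bits.size] += alpha * (2.0 * wm_bits - 1.0)   # bit 1 -> +alpha, bit 0 -> -alpha
    LL_marked = flat.reshape(LL.shape)
    return pywt.idwt2((LL_marked, (LH, HL, HH)), "haar")

def mse_psnr(original, marked):
    err = np.mean((original.astype(np.float64) - marked) ** 2)
    psnr = 10.0 * np.log10(255.0 ** 2 / err) if err > 0 else float("inf")
    return err, psnr

rng = np.random.default_rng(1)
cover = rng.integers(0, 256, size=(128, 128))        # placeholder grayscale cover image
bits = rng.integers(0, 2, size=64)                   # placeholder 8x8 watermark
marked = embed_watermark(cover, bits)
print("MSE = %.4f, PSNR = %.2f dB" % mse_psnr(cover, marked))
```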

Keywords: image watermarking, genetic algorithm, particle swarm optimization, discrete wavelet transform

Procedia PDF Downloads 210
570 Development of a Consult Liaison Psychology Service: A Systematic Review

Authors: Ben J. Lippe

Abstract:

Consult Liaison Psychology services are growing rapidly, given the robust empirical support for the utility of this service in hospital settings. These psychological services, including clinical assessment, applied psychotherapy, and consultation with other healthcare providers, have been shown to improve health outcomes for patients and to bolster important areas of administrative interest such as decreased length of patient admission. However, there is little descriptive literature outlining the process and mechanisms of building or developing a Consult Liaison Psychology service. The main findings of this conceptual work are intended to elucidate clearly the essential methods involved in developing consult liaison psychology programs, drawing on thorough reviews of the relevant behavioral health literature and the inclusion of experiential outcomes. The diverse range of hospital settings and healthcare systems makes a “blueprint” method of program development challenging to define, yet the structural frameworks presented here, based on the relevant literature and applied practice, can help lay critical groundwork for program development in this growing area of psychological service. This conceptual approach addresses the prominent processes, as well as common programmatic and clinical pitfalls, involved in the development of a Consult Liaison Psychology service. This paper, including a systematic review of the relevant literature, is intended to serve as a key reference for the development of Consult Liaison Psychology services and other related behavioral health programs, and to help inform further research efforts.

Keywords: behavioral health, consult liaison, health psychology, psychology program development

Procedia PDF Downloads 135
569 Comprehensive Validation of High-Performance Liquid Chromatography-Diode Array Detection (HPLC-DAD) for Quantitative Assessment of Caffeic Acid in Phenolic Extracts from Olive Mill Wastewater

Authors: Layla El Gaini, Majdouline Belaqziz, Meriem Outaki, Mariam Minhaj

Abstract:

In this study, we introduce and validate a high-performance liquid chromatography method with diode-array detection (HPLC-DAD) specifically designed for the accurate quantification of caffeic acid in phenolic extracts obtained from olive mill wastewater. The separation of caffeic acid was effectively achieved using an Acclaim Polar Advantage column (5 µm, 250 x 4.6 mm). A multi-step gradient mobile phase was employed, comprising water acidified with phosphoric acid (pH 2.3) and acetonitrile, to ensure optimal separation. Diode-array detection was conducted within the UV-VIS spectrum, spanning a range of 200-800 nm, which facilitated precise analytical results. The method underwent comprehensive validation, addressing several essential analytical parameters, including specificity, repeatability, linearity, the limits of detection and quantification, and measurement uncertainty. The generated linear standard curves displayed high correlation coefficients, underscoring the method's efficacy and consistency. This validated approach is not only robust but also demonstrates exceptional reliability for the focused analysis of caffeic acid within the intricate matrices of wastewater, thus offering significant potential for applications in environmental and analytical chemistry.
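
As a quick illustration of the linearity and limit calculations that underpin such a validation, the snippet below fits a calibration line to hypothetical caffeic acid standards and applies the common ICH-style estimates LOD ≈ 3.3·σ/S and LOQ ≈ 10·σ/S. The concentrations and peak areas are invented, not the study's data.

```python
import numpy as np
from scipy import stats

# hypothetical calibration standards: concentration (mg/L) vs. peak area
conc = np.array([1.0, 2.5, 5.0, 10.0, 25.0, 50.0])
area = np.array([12.1, 30.4, 61.2, 121.8, 304.0, 609.5])

fit = stats.linregress(conc, area)
residuals = area - (fit.intercept + fit.slope * conc)
sigma = residuals.std(ddof=2)              # residual standard deviation of the regression

lod = 3.3 * sigma / fit.slope
loq = 10.0 * sigma / fit.slope
print(f"slope={fit.slope:.3f}, r^2={fit.rvalue**2:.5f}, LOD={lod:.3f} mg/L, LOQ={loq:.3f} mg/L")
```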

Keywords: high-performance liquid chromatography (HPLC-DAD), caffeic acid analysis, olive mill wastewater phenolics, analytical method validation

Procedia PDF Downloads 50
568 Orthogonal Metal Cutting Simulation of Steel AISI 1045 via Smoothed Particle Hydrodynamic Method

Authors: Seyed Hamed Hashemi Sohi, Gerald Jo Denoga

Abstract:

Machining, or metal cutting, is one of the most widely used production processes in industry. The quality of the process and the resulting machined product depends on parameters like tool geometry, material, and cutting conditions. However, the relationships of these parameters to the cutting process are often based mostly on empirical knowledge. In this study, computer modeling and simulation using LS-DYNA software and a Smoothed Particle Hydrodynamics (SPH) methodology were performed on the orthogonal metal cutting process to analyze three-dimensional deformation of AISI 1045 medium carbon steel during machining. The simulation was performed using the following constitutive models: the Power Law model, the Johnson-Cook model, and the Zerilli-Armstrong (Z-A) model. The outcomes were compared against the simulated results obtained by Cenk Kiliçaslan using the Finite Element Method (FEM) and the empirical results of Jaspers and Filice. The analysis shows that the SPH method combined with the Zerilli-Armstrong constitutive model is a viable alternative for simulating the metal cutting process. The tangential force was overestimated by 7%, and the normal force was underestimated by 16% when compared with empirical values. The simulated values of flow stress versus strain at various temperatures were also validated against empirical values. The SPH method using the Z-A model has also proven to be robust against issues of time-scaling. Experimental work was also done to investigate the effects of friction, rake angle, and tool tip radius on the simulation.
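
To show what the flow-stress comparison involves, the snippet below evaluates the Johnson-Cook constitutive relation σ = (A + Bεⁿ)(1 + C ln(ε̇/ε̇₀))(1 − T*ᵐ) over a strain range. The AISI 1045 constants used here are commonly cited literature values and should be checked against the original sources rather than taken as the study's inputs.

```python
import numpy as np

# Commonly cited Johnson-Cook constants for AISI 1045 (illustrative; verify against the literature)
A, B, n, C, m = 553.1, 600.8, 0.234, 0.0134, 1.0       # A, B in MPa; n, C, m dimensionless
eps_dot_ref, T_room, T_melt = 1.0, 20.0, 1460.0        # reference strain rate (1/s), temperatures (degC)

def jc_flow_stress(strain, strain_rate, temperature):
    """Johnson-Cook flow stress in MPa for given plastic strain, strain rate, and temperature."""
    t_star = (temperature - T_room) / (T_melt - T_room)
    return (A + B * strain**n) * (1.0 + C * np.log(strain_rate / eps_dot_ref)) * (1.0 - t_star**m)

strain = np.linspace(0.05, 1.0, 5)
for T in (20.0, 300.0, 600.0):
    print(T, np.round(jc_flow_stress(strain, 1000.0, T), 1))
```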

Keywords: metal cutting, smoothed particle hydrodynamics, constitutive models, experimental, cutting forces analyses

Procedia PDF Downloads 243
567 Artificial Intelligence in Bioscience: The Next Frontier

Authors: Parthiban Srinivasan

Abstract:

With recent advances in computational power and access to sufficient data in the biosciences, artificial intelligence methods are increasingly being used in drug discovery research. These methods are essentially a series of advanced statistics-based exercises that review the past to indicate the likely future. Our goal is to develop a model that accurately predicts biological activity and toxicity parameters for novel compounds. We have compiled a robust library of over 150,000 chemical compounds with different pharmacological properties from the literature and public domain databases. The compounds are stored in the simplified molecular-input line-entry system (SMILES), a commonly used text encoding for organic molecules. We utilize an automated process to generate an array of numerical descriptors (features) for each molecule. Redundant and irrelevant descriptors are eliminated iteratively. Our prediction engine is based on a portfolio of machine learning algorithms, and we found the Random Forest algorithm to be a better choice for this analysis. We captured non-linear relationships in the data and formed a prediction model with reasonable accuracy by averaging across a large number of randomized decision trees. Our next step is to apply a deep neural network (DNN) algorithm to predict the biological activity and toxicity properties; we expect the DNN algorithm to give better results and improve the accuracy of the prediction. This presentation reviews these prominent machine learning and deep learning methods and our implementation protocols, and discusses these techniques for their usefulness in biomedical and health informatics.
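
A compressed sketch of the descriptor-to-prediction step described above: molecular descriptors (here a random placeholder matrix standing in for descriptors generated from SMILES, e.g. with RDKit) are fed to a scikit-learn Random Forest, with feature importances available for the iterative descriptor pruning the abstract mentions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 40))                              # placeholder descriptor matrix (one row per molecule)
y = (X[:, 0] + 0.5 * X[:, 3] * X[:, 7] > 0).astype(int)     # placeholder activity labels

model = RandomForestClassifier(n_estimators=300, random_state=0)
print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean())

model.fit(X, y)
ranked = np.argsort(model.feature_importances_)[::-1]
print("top descriptors by importance:", ranked[:5])         # candidates to keep when pruning redundant features
```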

Keywords: deep learning, drug discovery, health informatics, machine learning, toxicity prediction

Procedia PDF Downloads 339
566 Modeling Breathable Particulate Matter Concentrations over Mexico City Retrieved from Landsat 8 Satellite Imagery

Authors: Rodrigo T. Sepulveda-Hirose, Ana B. Carrera-Aguilar, Magnolia G. Martinez-Rivera, Pablo de J. Angeles-Salto, Carlos Herrera-Ventosa

Abstract:

In order to diminish health risks, it is of major importance to monitor air quality. However, this process entails high costs in physical and human resources. In this context, this research was carried out with the main objective of developing a predictive model for concentrations of inhalable particles (PM10-2.5) using remote sensing. To develop the model, satellite images, mainly from Landsat 8, of the Mexico City Metropolitan Area were used. Using historical PM10 and PM2.5 measurements from RAMA (the Automatic Environmental Monitoring Network of Mexico City) and by processing the available satellite images, a preliminary model was generated in which it was possible to identify critical opportunity areas that will allow the generation of a robust model. Through the preliminary model applied to the scenes of Mexico City, three areas of particular interest were identified due to the presumed high concentration of PM: zones with high plant density, bodies of water, and soil without constructions or vegetation. Work continues on this line to improve the proposed preliminary model. In addition, a brief analysis was made of six models presented in articles developed in different parts of the world, in order to identify the optimal bands for the generation of a suitable model for Mexico City. It was found that infrared bands have helped modeling in other cities, but the effectiveness these bands could provide for the geographic and climatic conditions of Mexico City is still being evaluated.

Keywords: air quality, modeling pollution, particulate matter, remote sensing

Procedia PDF Downloads 139
565 Entrepreneurship Education and Student Entrepreneurial Intention: A Comprehensive Review, Synthesis of Empirical Findings, and Strategic Insights for Future Research Advancements

Authors: Abdul Waris Jalili, Yanqing Wang, Som Suor

Abstract:

This research paper explores the relationship between entrepreneurship education and students' entrepreneurial intentions. It aims to determine whether entrepreneurship education reliably predicts students' intention to become entrepreneurs, how and when this relationship occurs, which factors influence it, and whether any mediating or moderating factors are involved. A thorough and systematic search and review of empirical articles published between 2013 and 2023 was conducted. Three databases, Google Scholar, Science Direct, and PubMed, were explored to gather relevant studies. Criteria such as reporting empirical results, publication in English, and addressing the research questions were used to select 35 papers for analysis. The collective findings of the reviewed studies suggest a generally positive relationship between entrepreneurship education and student entrepreneurial intentions. However, recent findings indicate that this relationship may be more complex than previously thought. Mediators and moderators have been identified, highlighting instances where entrepreneurship education indirectly influences student entrepreneurial intentions. The review also emphasizes the need for more robust research designs to establish causality in this field. This research adds to the existing literature by providing a comprehensive review of the relationship between entrepreneurship education and student entrepreneurial intentions. It highlights the complexity of this relationship and the importance of considering mediators and moderators. The study also calls for future research to explore different facets of entrepreneurship education independently and to examine complex relationships more comprehensively.

Keywords: entrepreneurship, entrepreneurship education, entrepreneurial intention, entrepreneurial self-efficacy

Procedia PDF Downloads 42
564 Inversion of the Spectral Analysis of Surface Waves Dispersion Curves through the Particle Swarm Optimization Algorithm

Authors: A. Cerrato Casado, C. Guigou, P. Jean

Abstract:

In this investigation, the particle swarm optimization (PSO) algorithm is used to perform the inversion of the dispersion curves in the spectral analysis of surface waves (SASW) method. This inverse problem usually presents complicated solution spaces with many local minima that make convergence to the correct solution difficult. PSO is a metaheuristic method that was originally designed to simulate social behavior but has demonstrated powerful capabilities for solving inverse problems with complex solution spaces and a high number of variables. The dispersion curve of the synthetic soils is constructed by the vertical flexibility coefficient method, which is especially convenient for soils where the stiffness does not increase gradually with depth. The reason is that these types of soil profiles are not normally dispersive, since the dominant mode of Rayleigh waves is usually not coincident with the fundamental mode. Multiple synthetic soil profiles have been tested to show the characteristics of the convergence process and assess the accuracy of the final soil profile. In addition, the inversion procedure is applied to multiple real soils and the final profile compared with the available information. The combination of the vertical flexibility coefficient method to obtain the dispersion curve and the PSO algorithm to carry out the inversion process proves to be a robust procedure that is able to provide good solutions for complex soil profiles, even with scarce prior information.
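
The core PSO update used in such an inversion is short enough to show in full; below is a generic implementation minimizing a placeholder misfit function, which in the actual SASW workflow would be the distance between observed and forward-modeled dispersion curves. The bounds and "target" layer velocities are invented for the demonstration.

```python
import numpy as np

def pso_minimize(misfit, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Generic particle swarm optimizer over a box-constrained parameter space."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    dim = lo.size
    pos = rng.uniform(lo, hi, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([misfit(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([misfit(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

# placeholder misfit: in practice, compare observed and modeled Rayleigh-wave dispersion curves
target = np.array([180.0, 260.0, 420.0])                  # e.g. shear-wave velocities of three layers
misfit = lambda p: np.sum((p - target) ** 2)
best, val = pso_minimize(misfit, bounds=[(100, 600)] * 3)
print(best, val)
```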

Keywords: dispersion, inverse problem, particle swarm optimization, SASW, soil profile

Procedia PDF Downloads 165
563 Development of a Web-Based Application for Intelligent Fertilizer Management in Rice Cultivation

Authors: Hao-Wei Fu, Chung-Feng Kao

Abstract:

In the era of rapid technological advancement, information technology (IT) has become integral to modern life, exerting significant influence across diverse sectors and serving as a catalyst for development in various industries. Within agriculture, the integration of IT offers substantial benefits, notably enhancing operational efficiency. Real-time monitoring systems, for instance, have been widely embraced in agriculture, effectively improving crop management practices. This study specifically addresses the management of rice panicle fertilizer, presenting the development of a web application tailored to handle data associated with rice panicle fertilizer management. Leveraging the normalized difference red edge index, this application optimizes the quantity of rice panicle fertilizer used, providing recommendations to agricultural stakeholders and service providers in the agricultural information sector. The overarching objective is to minimize costs while maximizing yields. Furthermore, a robust database system has been established to store and manage relevant data for future reference in rice cultivation management. Additionally, the study utilizes the Representational State Transfer (REST) software architectural style to construct an application programming interface (API), facilitating data creation, retrieval, updating, and deletion for users via HyperText Transfer Protocol (HTTP) methods. Future plans involve integrating this API with third-party services to incorporate it into larger frameworks, thus catering to the diverse requirements of various third-party services.
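
A minimal sketch of the kind of REST endpoint described above, using Flask: it accepts near-infrared and red-edge reflectances, computes the normalized difference red edge index NDRE = (NIR − RE)/(NIR + RE), and returns a fertilizer suggestion. The route name, threshold values, and nitrogen rates are hypothetical, not the application's actual API or agronomic rules.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/panicle-fertilizer/recommendation", methods=["POST"])   # hypothetical endpoint
def recommend():
    data = request.get_json(force=True)
    nir, red_edge = float(data["nir"]), float(data["red_edge"])
    ndre = (nir - red_edge) / (nir + red_edge)        # normalized difference red edge index
    # Illustrative decision rule only; real thresholds must come from agronomic calibration.
    if ndre < 0.20:
        nitrogen_kg_per_ha = 45
    elif ndre < 0.35:
        nitrogen_kg_per_ha = 30
    else:
        nitrogen_kg_per_ha = 15
    return jsonify({"ndre": round(ndre, 3), "nitrogen_kg_per_ha": nitrogen_kg_per_ha})

if __name__ == "__main__":
    app.run()
```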

Keywords: application programming interface, HyperText Transfer Protocol, nitrogen fertilizer intelligent management, web-based application

Procedia PDF Downloads 41
562 Genome Editing in Sorghum: Advancements and Future Possibilities: A Review

Authors: Micheale Yifter Weldemichael, Hailay Mehari Gebremedhn, Teklehaimanot Hailesslasie

Abstract:

The advancement of target-specific genome editing tools, including clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated protein 9 (Cas9), mega-nucleases, base editing (BE), prime editing (PE), transcription activator-like effector nucleases (TALENs), and zinc-finger nucleases (ZFNs), has paved the way for a modern era of gene editing. CRISPR/Cas9, as a versatile, simple, cost-effective, and robust system for genome editing, has dominated the genome manipulation field over the last few years. The application of CRISPR/Cas9 in sorghum improvement is particularly vital in the context of ecological, environmental, and agricultural challenges, as well as global climate change. In this context, gene editing using CRISPR/Cas9 can improve nutritional value, yield, resistance to pests and disease, and tolerance to different abiotic stresses. Moreover, CRISPR/Cas9 can potentially perform complex editing to reshape already available elite varieties and create new genetic variation. However, existing research is targeted at further improving the effectiveness of CRISPR/Cas9 genome editing techniques so that endogenous sorghum genes can be edited fruitfully. These findings suggest that genome editing is a feasible and successful venture in sorghum. Newer improvements and developments of CRISPR/Cas9 techniques have further enabled researchers to modify additional genes in sorghum with improved efficiency. The fruitful application and development of CRISPR techniques for genome editing in sorghum will help not only in gene discovery, the creation of new and improved traits, the regulation of gene expression, and sorghum functional genomics, but also in making site-specific integration events possible.

Keywords: CRISPR/Cas9, genome editing, quality, sorghum, stress, yield

Procedia PDF Downloads 44
561 Role of Kerala’s Diaspora Philanthropy Engagement During Economic Crises

Authors: Shibinu S, Mohamed Haseeb N

Abstract:

In times of crisis, the role of the diaspora and the help it offers are seen to be vital in determining how many countries recover, particularly low- and middle-income nations that rely significantly on remittances. About 2.12 million Keralites have emigrated abroad, with 81.2 percent of these outflows going to the Gulf Cooperation Council (GCC) countries. Most of them are semi-skilled or low-skilled laborers employed in GCC nations. Additionally, a sizeable portion of migrants are employed in industrialized nations like the UK and the US, where a highly robust Indian diaspora has developed. India's development depends to a large extent on the generosity of its diaspora, and the nation has benefited greatly from the substantial contributions made by several emigrant generations. This strength was noticeable during the COVID-19 pandemic and the Kerala floods. The 2018 Kerala floods displaced millions of people, damaged many properties, and caused many deaths. The Malayalee diaspora played a crucial role in the reconstruction of Kerala by supporting the rescue efforts underway on the ground through its extensive worldwide network. A similar outreach was also noted during COVID-19, when the diaspora assisted stranded migrants across the globe. Alongside the work the diaspora has done for the state's development and recovery, there has also been a recent outpouring of assistance during the COVID-19 pandemic. The study focuses on the subtleties of diaspora philanthropy and on how it enabled Kerala to recover from the COVID-19 pandemic and the floods. Semi-structured in-depth interviews were conducted with migrants, migrant organizations, and beneficiaries of the diaspora, selected through snowball sampling, to better understand the role that diaspora philanthropy plays in times of crisis.

Keywords: crises, diaspora, remittances, COVID-19, flood, economic development of Kerala

Procedia PDF Downloads 20
560 Combining Diffusion Maps and Diffusion Models for Enhanced Data Analysis

Authors: Meng Su

Abstract:

High-dimensional data analysis often presents challenges in capturing the complex, nonlinear relationships and manifold structures inherent to the data. This article presents a novel approach that leverages the strengths of two powerful techniques, Diffusion Maps and Diffusion Probabilistic Models (DPMs), to address these challenges. By integrating the dimensionality reduction capability of Diffusion Maps with the data modeling ability of DPMs, the proposed method aims to provide a comprehensive solution for analyzing and generating high-dimensional data. The Diffusion Map technique preserves the nonlinear relationships and manifold structure of the data by mapping it to a lower-dimensional space using the eigenvectors of the graph Laplacian matrix. Meanwhile, DPMs capture the dependencies within the data, enabling effective modeling and generation of new data points in the low-dimensional space. The generated data points can then be mapped back to the original high-dimensional space, ensuring consistency with the underlying manifold structure. Through a detailed example implementation, the article demonstrates the potential of the proposed hybrid approach to achieve more accurate and effective modeling and generation of complex, high-dimensional data. Furthermore, it discusses possible applications in various domains, such as image synthesis, time-series forecasting, and anomaly detection, and outlines future research directions for enhancing the scalability, performance, and integration with other machine learning techniques. By combining the strengths of Diffusion Maps and DPMs, this work paves the way for more advanced and robust data analysis methods.
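
A condensed sketch of the Diffusion Map half of the pipeline described above: build a Gaussian affinity graph, normalize it into a Markov transition matrix, and use the leading non-trivial eigenvectors (scaled by eigenvalue powers) as the low-dimensional embedding. Hooking the embedding up to a diffusion probabilistic model is beyond this snippet, and the bandwidth and test data are arbitrary.

```python
import numpy as np
from scipy.spatial.distance import cdist

def diffusion_map(X, n_components=2, epsilon=1.0, t=1):
    """Map rows of X to diffusion coordinates using eigenvectors of the Markov matrix."""
    K = np.exp(-cdist(X, X, "sqeuclidean") / epsilon)       # Gaussian affinity
    d = K.sum(axis=1)
    P = K / d[:, None]                                       # row-stochastic transition matrix
    # symmetrize for a stable eigendecomposition: S = D^1/2 P D^-1/2
    S = np.sqrt(d)[:, None] * P / np.sqrt(d)[None, :]
    vals, vecs = np.linalg.eigh(S)
    order = np.argsort(vals)[::-1]
    vals, vecs = vals[order], vecs[:, order]
    psi = vecs / np.sqrt(d)[:, None]                         # right eigenvectors of P
    # skip the trivial constant eigenvector (eigenvalue 1)
    return psi[:, 1:n_components + 1] * vals[1:n_components + 1] ** t

rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 300)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.normal(size=(300, 2))   # noisy circle
Y = diffusion_map(X, n_components=2, epsilon=0.5)
print(Y.shape)
```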

Keywords: diffusion maps, diffusion probabilistic models (DPMs), manifold learning, high-dimensional data analysis

Procedia PDF Downloads 84
559 Low Temperature Biological Treatment of Chemical Oxygen Demand for Agricultural Water Reuse Application Using Robust Biocatalysts

Authors: Vedansh Gupta, Allyson Lutz, Ameen Razavi, Fatemeh Shirazi

Abstract:

The agriculture industry is especially vulnerable to forecasted water shortages. In the fresh and fresh-cut produce sector, conventional flume-based washing with recirculation exhibits high water demand. This leads to a large water footprint and possible cross-contamination of pathogens. These can be alleviated through advanced water reuse processes, such as membrane technologies including reverse osmosis (RO). Water reuse technologies effectively remove dissolved constituents but can easily foul without pre-treatment. Biological treatment is effective for the removal of organic compounds responsible for fouling, but not at the low temperatures encountered at most produce processing facilities. This study showed that the Microvi MicroNiche Engineering (MNE) technology effectively removes organic compounds (> 80%) at low temperatures (6-8 °C) from wash water. The MNE technology uses synthetic microorganism-material composites with negligible solids production, making it advantageously situated as an effective bio-pretreatment for RO. A preliminary technoeconomic analysis showed 60-80% savings in operation and maintenance costs (OPEX) when using the Microvi MNE technology for organics removal. This study and the accompanying economic analysis indicated that the proposed technology process will substantially reduce the cost barrier for adopting water reuse practices, thereby contributing to increased food safety and furthering sustainable water reuse processes across the agricultural industry.

Keywords: biological pre-treatment, innovative technology, vegetable processing, water reuse, agriculture, reverse osmosis, MNE biocatalysts

Procedia PDF Downloads 114