Search results for: robust
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1353

573 A New Multi-Target, Multi-Agent Search and Rescue Path Planning Approach

Authors: Jean Berger, Nassirou Lo, Martin Noel

Abstract:

Perfectly suited for natural or man-made emergency and disaster management situations such as floods, earthquakes, tornadoes, or tsunamis, multi-target search path planning for a team of rescue agents is known to be computationally hard, and most techniques developed so far fall short of successfully estimating the optimality gap. A novel mixed-integer linear programming (MIP) formulation is proposed to optimally solve the multi-target, multi-agent discrete search and rescue (SAR) path planning problem. Aimed at maximizing the cumulative probability of successful target detection, it captures anticipated feedback information associated with possible observation outcomes resulting from projected path execution, while modeling agent discrete actions over all possible moving directions. Problem modeling further takes advantage of a network representation to encompass decision variables, expedite compact constraint specification, and lead to substantial problem-solving speed-up. The proposed MIP approach uses the CPLEX optimization machinery, efficiently computing near-optimal solutions for practical-size problems while giving a robust upper bound obtained from Lagrangean relaxation of the integrality constraints. Should a target be positively detected during plan execution, a new problem instance would simply be reformulated from the current state and solved over the next decision cycle. A computational experiment shows the feasibility and the value of the proposed approach.
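
As a rough illustration of the flavor of such a formulation (not the authors' model, which handles multiple agents and anticipated observation feedback), the sketch below builds a single-agent search MIP on a small grid with made-up detection priors, using the open-source PuLP modeler and CBC solver in place of CPLEX.

```python
# A minimal single-agent search MIP sketch; grid size, horizon and
# cell probabilities are invented for illustration.
import pulp

W, H, T = 4, 4, 7                      # grid width/height, planning horizon
cells = [(i, j) for i in range(W) for j in range(H)]
p = {c: 0.05 for c in cells}; p[(3, 3)] = 0.4; p[(0, 3)] = 0.3  # target priors

def neighbors(c):
    i, j = c
    opts = [(i, j), (i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)]
    return [n for n in opts if n in p]

m = pulp.LpProblem("search_path", pulp.LpMaximize)
x = pulp.LpVariable.dicts("x", (range(T), cells), cat="Binary")  # agent at c at t
z = pulp.LpVariable.dicts("z", cells, cat="Binary")              # cell ever searched

m += pulp.lpSum(p[c] * z[c] for c in cells)          # expected detection reward
for t in range(T):
    m += pulp.lpSum(x[t][c] for c in cells) == 1     # one position per step
m += x[0][(0, 0)] == 1                               # start cell
for t in range(T - 1):
    for c in cells:                                  # moves restricted to neighbors
        m += x[t + 1][c] <= pulp.lpSum(x[t][n] for n in neighbors(c))
for c in cells:                                      # reward a cell only if visited
    m += z[c] <= pulp.lpSum(x[t][c] for t in range(T))

m.solve(pulp.PULP_CBC_CMD(msg=False))
path = [c for t in range(T) for c in cells if x[t][c].value() == 1]
print("path:", path, "expected reward:", pulp.value(m.objective))
```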

Keywords: search path planning, search and rescue, multi-agent, mixed-integer linear programming, optimization

Procedia PDF Downloads 345
572 Toehold Mediated Shape Transition of Nucleic Acid Nanoparticles

Authors: Emil F. Khisamutdinov

Abstract:

Development of functional materials undergoing structural transformations in response to an external stimulus, such as environmental changes (pH, temperature, etc.), the presence of particular proteins, or short oligonucleotides, is of great interest for a variety of applications ranging from medicine to electronics. The dynamic operations of most nucleic acid (NA) devices, including circuits, nano-machines, and biosensors, rely on networks of NA strand displacement processes in which an external stimulus strand displaces a target strand from a DNA or RNA duplex. The rate of strand displacement can be greatly increased by the use of “toeholds,” single-stranded regions of the target complex to which the invading strand can bind to initiate the reaction, forming additional base pairs that provide a thermodynamic driving force for transformation. Herein, we developed a highly robust nanoparticle shape transition, sequentially transforming DNA polygons from one shape to another using the toehold-mediated DNA strand displacement technique. The shape transformation was confirmed by agarose gel electrophoresis and atomic force microscopy. Furthermore, we demonstrate that our approach is applicable to RNA shape transformation from triangle to square, which can be detected by fluorescence emission from the malachite-green-binding RNA aptamer. Using gel-shift and fluorescence assays, we demonstrated that efficient transformation occurs under isothermal conditions (37°C) and can therefore be implemented within living cells as reporter molecules. This work is intended to provide a simple, cost-effective, and straightforward model for the development of biosensors and regulatory devices in nucleic acid nanotechnology.

Keywords: RNA nanotechnology, bionanotechnology, toehold mediated DNA switch, RNA split fluorogenic aptamers

Procedia PDF Downloads 48
571 Anti-Corruption, an Important Challenge for the Construction Industry!

Authors: Ahmed Stifi, Sascha Gentes, Fritz Gehbauer

Abstract:

The construction industry is perhaps one of the oldest industries in the world. Ancient monuments like the Egyptian pyramids, the Greek and Roman temples such as the Parthenon and the Pantheon, the robust bridges, old Roman theatres, the citadels, and many more are the best testament to that. The industry also has a symbiotic relationship with other industries: the heavy engineering industry provides construction machinery, the chemical industry develops innovative construction materials, the finance sector provides funding solutions for complex construction projects, and so on. The construction industry is not only mammoth but also very complex in nature. Because of this complexity, it is prone to various tribulations that may hamper its growth. Comparative study of this industry with others shows that it is associated with a state of tardiness and delay, especially when we focus on the managerial aspects and the triple constraint (time, cost, and scope). While some institutes cite the complexity associated with the industry as a major reason, others, such as the lean construction community, point to the waste produced across the construction process as the prime reason. This paper introduces corruption as one of the prime factors behind such delays. To support this, many international reports and studies are available depicting that the construction industry is one of the most corrupt sectors worldwide, and that corruption can take place throughout the project cycle, comprising project selection, planning, design, funding, pre-qualification, tendering, execution, operation and maintenance, and even the reconstruction phase. It also takes many forms, such as bribery, fraud, extortion, collusion, embezzlement, and conflict of interest. As a solution to cope with corruption in the construction industry, the paper introduces integrity as a key factor and builds a new integrity framework to develop and implement an integrity management system for construction companies and construction projects.

Keywords: corruption, construction industry, integrity, lean construction

Procedia PDF Downloads 345
570 Barriers to Public Innovation in Colombia: Case Study in Central Administrative Region

Authors: Yessenia Parrado, Ana Barbosa, Daniela Mahe, Sebastian Toro, Jhon Garcia

Abstract:

Public innovation has gained strength in recent years in response to the need to find new strategies or mechanisms for interaction between government entities and citizens. Accordingly, the Colombian government has been promoting policies aimed at strengthening innovation as a fundamental aspect of the work of public entities. However, in order to develop the capacities of public servants, and therefore of the institutions and organizations to which they belong, it is necessary to understand the context in which they operate in their daily work. This article compiles the work developed by LAB101, the laboratory of innovation, creativity, and new technologies of the National University of Colombia, for the National Department of Planning. A case study was developed in the central region of Colombia, made up of five departments, through the construction of instruments based on quantitative techniques, combined with qualitative analysis through semi-structured interviews, to understand the perception of possible barriers to innovation and the obstacles that have prevented the acceleration of transformation within public organizations. From the information collected, different analyses are carried out that allow a more robust explanation of the results obtained, and a set of categories is established to group the characteristics associated with the difficulties that officials perceive when innovating, which are later conceived as barriers. Finally, an indicator is proposed to measure the degree of innovation within public entities, in order to provide a metric for future studies. The main findings of this study show three key components to be strengthened in public entities and organizations: governance, knowledge management, and the promotion of collaborative workspaces.

Keywords: barriers, enablers, management, public innovation

Procedia PDF Downloads 85
569 Salvage Reconstruction of Intraoral Dehiscence following Free Fibular Flap with a Superficial Temporal Artery Islandized Flap (STAIF)

Authors: Allyne Topaz

Abstract:

Intraoral dehiscence compromises free fibula flaps following mandibular reconstruction. Salivary contamination risks thrombosis of the microvascular anastomosis and hardware infection. The superficial temporal artery islandized flap (STAIF) offers an efficient, non-microsurgical reconstructive option for regaining intraoral competency in a time-sensitive complication. Methods: The STAIF flap is based on the superficial temporal artery coursing along the anterior hairline. The flap is mapped with the assistance of a Doppler probe. The width of the skin paddle is chosen based on the ability to close the donor site. The flap is taken down to the level of the zygomatic arch and tunneled into the mouth. Results: We present the case of a patient who underwent mandibular reconstruction with a free fibula flap after a traumatic shotgun wound. The patient developed repeated intraoral dehiscence following failed local buccal and floor-of-mouth flaps, leading to salivary contamination of the flap and hardware. The intraoral dehiscence was successfully salvaged on the third attempt with a STAIF flap. Conclusions: Intraoral dehiscence is a complication requiring urgent attention to prevent loss of the free fibula flap after mandibular reconstruction. The STAIF is a non-microsurgical option for restoring intraoral competency. This robust, axially vascularized skin paddle may be split for intra- and extra-oral coverage as needed, and can be an important tool in the reconstructive armamentarium.

Keywords: free fibula flap, intraoral dehiscence, mandibular reconstruction, superficial temporal artery islandized flap

Procedia PDF Downloads 109
568 Critical Success Factors in Quality Requirement Change Management

Authors: Jamshed Ahmad, Abdul Wahid Khan, Javed Ali Khan

Abstract:

Managing changes to software quality requirements is a difficult task in the field of software engineering. Rejecting incoming changes results in user dissatisfaction, while accommodating too many requirement changes may delay product delivery. Poor requirements management is widely considered a primary cause of software failure, and it becomes even more challenging in global software outsourcing. Addressing the success factors in quality requirement change management is desirable today due to frequent change requests from end-users. In this research study, success factors are recognized and scrutinized with the help of a systematic literature review (SLR). In total, 16 success factors were identified that significantly impact software quality requirement change management. The findings show that Proper Requirement Change Management, Rapid Delivery, Quality Software Product, Access to Market, Project Management, Skills and Methodologies, Low Cost/Effort Estimation, Clear Plan and Road Map, Agile Processes, Low Labor Cost, User Satisfaction, Communication/Close Coordination, Proper Scheduling and Time Constraints, Frequent Technological Changes, Robust Model, and Geographical Distribution/Cultural Differences are the key factors that influence software quality requirement change. The recognized success factors were validated with the help of various research methods, i.e., case studies, interviews, surveys, and experiments, and were then scrutinized by continent, database, company size, and time period. Based on these findings, requirement changes can be implemented in a better way.

Keywords: global software development, requirement engineering, systematic literature review, success factors

Procedia PDF Downloads 176
567 Hybrid Approach for Face Recognition Combining Gabor Wavelet and Linear Discriminant Analysis

Authors: A. Annis Fathima, V. Vaidehi, S. Ajitha

Abstract:

Face recognition systems find many applications in surveillance and human-computer interaction. As these applications are of much importance and demand high accuracy, more robustness in the face recognition system is expected with less computation time. In this paper, a hybrid approach for face recognition combining Gabor Wavelets and Linear Discriminant Analysis (HGWLDA) is proposed. The normalized input grayscale image is approximated and reduced in dimension to lower the processing overhead of the Gabor filters. This image is convolved with a bank of Gabor filters with varying scales and orientations. LDA, a subspace analysis technique, is then used to reduce the intra-class space and maximize the inter-class space. The variants used are 2-dimensional Linear Discriminant Analysis (2D-LDA), 2-dimensional bidirectional LDA ((2D)2LDA), and weighted 2-dimensional bidirectional Linear Discriminant Analysis (Wt (2D)2 LDA). LDA reduces the feature dimension by extracting the features with greater variance. A k-Nearest Neighbour (k-NN) classifier is used to classify and recognize the test image by comparing its features with each of the training set features. The HGWLDA approach is robust against illumination conditions, as the Gabor features are illumination invariant. This approach also aims at a better recognition rate using fewer features for varying expressions. The performance of the proposed HGWLDA approach is evaluated using the AT&T database, the MIT-India face database, and the faces94 database. It is found that the proposed HGWLDA approach provides better results than the existing Gabor approach.
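
The following sketch traces the main stages of this pipeline (resize, Gabor filter bank, LDA projection, k-NN matching), assuming face images are already loaded as equal-sized grayscale arrays; the filter parameters are illustrative, scikit-learn's classical LDA stands in for the 2D-LDA variants, and X_imgs, y, and test_img are assumed inputs.

```python
# A minimal sketch of a Gabor + LDA + k-NN recognition pipeline,
# not the paper's exact HGWLDA configuration.
import cv2
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier

def gabor_features(img, scales=(9, 15), orientations=4):
    """Convolve a grayscale image with a small Gabor filter bank."""
    small = cv2.resize(img, (32, 32))          # reduce dimension first
    feats = []
    for ksize in scales:
        for k in range(orientations):
            theta = k * np.pi / orientations
            kern = cv2.getGaborKernel((ksize, ksize), sigma=4.0,
                                      theta=theta, lambd=10.0,
                                      gamma=0.5, psi=0)
            feats.append(cv2.filter2D(small, cv2.CV_32F, kern).ravel())
    return np.concatenate(feats)

# X_imgs: list of grayscale face arrays, y: identity labels (assumed given)
X = np.array([gabor_features(im) for im in X_imgs])
lda = LinearDiscriminantAnalysis()             # classical LDA stands in for
X_lda = lda.fit_transform(X, y)                # the 2D-LDA variants here
knn = KNeighborsClassifier(n_neighbors=1).fit(X_lda, y)
print(knn.predict(lda.transform(np.array([gabor_features(test_img)]))))
```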

Keywords: face recognition, Gabor wavelet, LDA, k-NN classifier

Procedia PDF Downloads 448
566 Efficient Implementation of a Finite Volume Multi-Resolution WENO Scheme on Adaptive Cartesian Grids

Authors: Yuchen Yang, Zhenming Wang, Jun Zhu, Ning Zhao

Abstract:

An easy-to-implement and robust finite volume multi-resolution Weighted Essentially Non-Oscillatory (WENO) scheme is proposed on adaptive Cartesian grids in this paper. The multi-resolution WENO scheme is combined with the ghost-cell immersed boundary method (IBM) and a wall-function technique to solve the Navier-Stokes equations. Unlike k-exact finite volume WENO schemes, which involve large amounts of extra storage, repeatedly solving the matrix generated by a least-squares method, or calculating optimal linear weights on adaptive Cartesian grids, the present methodology adds only a very small overhead and can be easily implemented in existing edge-based computational fluid dynamics (CFD) codes with minor modifications. Moreover, the linear weights of this adaptive finite volume multi-resolution WENO scheme can be any positive numbers on the condition that their sum is one. This bypasses the calculation of the optimal linear weights and avoids dealing with negative linear weights on adaptive Cartesian grids. Some benchmark viscous problems are numerically solved to show the efficiency and good performance of this adaptive multi-resolution WENO scheme. Compared with a second-order edge-based method, the presented method can be implemented on an adaptive Cartesian grid with slight modification for high-Reynolds-number problems.
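
The key property claimed above, that the linear weights need only be positive and sum to one, can be seen in a toy two-level 1D reconstruction in the spirit of multi-resolution WENO (a simplified illustration, not the paper's scheme): the high-order candidate is re-expressed so that any positive pair (gamma_lo, gamma_hi) summing to one reproduces it in smooth regions, while the smoothness indicators push the blend toward the low-order candidate at discontinuities.

```python
# A toy two-level multi-resolution WENO reconstruction at an interface;
# stencil choices and gamma values are illustrative only.
import numpy as np

def mr_weno_interface(um1, u0, up1, gamma_hi=0.85, gamma_lo=0.15, eps=1e-12):
    """Reconstruct u at the i+1/2 interface from cells i-1, i, i+1."""
    q_lo = 0.5 * (u0 + up1)                     # linear (2nd-order) candidate
    q_hi = (-um1 + 5.0 * u0 + 2.0 * up1) / 6.0  # quadratic (3rd-order) candidate
    b_lo = (up1 - u0) ** 2                      # Jiang-Shu smoothness indicators
    b_hi = 13.0 / 12.0 * (um1 - 2 * u0 + up1) ** 2 + 0.25 * (um1 - up1) ** 2
    # multi-resolution splitting: gamma_lo*q_lo + gamma_hi*q_tilde == q_hi
    q_tilde = (q_hi - gamma_lo * q_lo) / gamma_hi
    a_lo = gamma_lo / (b_lo + eps) ** 2         # unnormalized nonlinear weights
    a_hi = gamma_hi / (b_hi + eps) ** 2
    w_lo, w_hi = a_lo / (a_lo + a_hi), a_hi / (a_lo + a_hi)
    return w_lo * q_lo + w_hi * q_tilde         # == q_hi in smooth regions

u = np.where(np.linspace(0, 1, 9) < 0.5, 1.0, 0.0)   # step profile
print([round(mr_weno_interface(*u[i - 1:i + 2]), 3) for i in range(1, 8)])
```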

Keywords: adaptive mesh refinement method, finite volume multi-resolution WENO scheme, immersed boundary method, wall-function technique

Procedia PDF Downloads 126
565 Exploration of a Blockchain Assisted Framework for Through Baggage Interlining: Blocklining

Authors: Mary Rose Everan, Michael McCann, Gary Cullen

Abstract:

International travel journeys, by their nature, incorporate elements provided by multiple service providers such as airlines, rail carriers, airports, and ground handlers. Data needs to be stored by and exchanged between these parties in the process of managing the journey. The fragmented nature of this shared management of mutual clients is a limiting factor in the development of a seamless, hassle-free, end-to-end travel experience. Traditional interlining agreements attempt to facilitate many separate aspects of co-operation between service providers, typically between airlines and, to some extent, intermodal travel operators, including schedules, fares, ticketing, through check-in, and baggage handling. These arrangements rely on pre-agreement. The development of Virtual Interlining, that is, interlining facilitated by a third party (often but not always an airport) without formal pre-agreement by the airlines or rail carriers, demonstrates an underlying demand for a better quality end-to-end travel experience. Blockchain solutions are being explored in a number of industries and offer, at first sight, an immutable, single source of truth for this data, avoiding data conflicts and misinterpretation. Combined with smart contracts, they seemingly offer a more robust and dynamic platform for multi-stakeholder ventures, and perhaps even the ability to join and leave consortia dynamically. Applying blockchain to the intermodal interlining space, termed Blocklining in this paper, is complex and multi-faceted because of the many aspects of cooperation outlined above. To explore its potential, this paper concentrates on one particular dimension, that of through baggage interlining.
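
A minimal sketch of the append-only, hash-linked record idea underlying such a platform is shown below in plain Python (field names and custody events are hypothetical, and a real deployment would involve a distributed ledger and smart contracts rather than a single process): each baggage custody event embeds the hash of the previous one, so any retroactive edit breaks the chain.

```python
# A toy hash-chain of baggage custody events; all fields are invented.
import hashlib, json, time

def make_block(event: dict, prev_hash: str) -> dict:
    body = {"event": event, "prev": prev_hash, "ts": time.time()}
    body["hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    return body

chain = [make_block({"bag": "TAG123", "custody": "AirlineA check-in"}, "0" * 64)]
chain.append(make_block({"bag": "TAG123", "custody": "AirportB transfer"},
                        chain[-1]["hash"]))

def verify(chain):                       # recompute hashes link by link
    for prev, blk in zip(chain, chain[1:]):
        body = {k: blk[k] for k in ("event", "prev", "ts")}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if blk["prev"] != prev["hash"] or blk["hash"] != digest:
            return False
    return True

print(verify(chain))   # True; mutate any past event and this turns False
```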

Keywords: aviation, baggage, blocklining, intermodal, interlining

Procedia PDF Downloads 126
564 Architecture-Performance Relationship in GPU Computing: Composite Process Flow Modeling and Simulations

Authors: Ram Mohan, Richard Haney, Ajit Kelkar

Abstract:

Current developments in computing have shown the advantage of using one or more Graphics Processing Units (GPUs) to boost the performance of many computationally intensive applications, but there are still limits to these GPU-enhanced systems. The major factors that contribute to the limitations of GPU(s) for High Performance Computing (HPC) can be categorized as hardware- and software-oriented in nature. Understanding how these factors affect performance is essential to developing efficient and robust application codes that employ one or more GPU devices as powerful co-processors for HPC computational modeling. This research and technical presentation focuses on the analysis and understanding of the intrinsic interrelationship of both hardware and software categories on computational performance for single and multiple GPU-enhanced systems, using a computationally intensive application that is representative of a large portion of challenges confronting modern HPC. The representative application uses unstructured finite element computations for transient composite resin infusion process flow modeling as the computational core, the characteristics and results of which reflect many other HPC applications via the sparse matrix system used for the solution of the linear system of equations. This work describes these various software and hardware factors and how they interact to affect the performance of computationally intensive applications, enabling more efficient development and porting of High Performance Computing applications, including current, legacy, and future large-scale computational modeling applications in various engineering and scientific disciplines.
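
To make the computational core concrete, the sketch below sets up a stand-in sparse linear solve with SciPy's conjugate gradient on a 1D Laplacian (the actual application assembles an unstructured finite element system, and a GPU port would swap in equivalent device-side routines such as CuPy's):

```python
# A stand-in sparse solve illustrating the application's computational
# core; the matrix is a toy 1D Laplacian, not the real FE system.
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

n = 10_000
A = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)
x, info = cg(A, b)                       # info == 0 signals convergence
print(info, np.linalg.norm(A @ x - b))
```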

Keywords: graphical processing unit, software development and engineering, performance analysis, system architecture and software performance

Procedia PDF Downloads 334
563 Particle Swarm Optimization Algorithm vs. Genetic Algorithm for Image Watermarking Based Discrete Wavelet Transform

Authors: Omaima N. Ahmad AL-Allaf

Abstract:

Over communication networks, images can easily be copied and distributed illegally, so copyright protection for authors and owners is necessary. Digital watermarking techniques therefore play an important role as a valid solution to authorship problems. Digital image watermarking techniques hide watermarks inside images to achieve copyright protection and prevent illegal copying. Watermarks need to be robust to attacks while maintaining data quality. We therefore discuss in this paper two approaches for image watermarking: the first is based on Particle Swarm Optimization (PSO) and the second on a Genetic Algorithm (GA). The discrete wavelet transform (DWT) is used with both approaches separately in the embedding process to transform the cover image. Both PSO and GA use the correlation coefficient to detect high-energy coefficients in the original image and then hide the watermark bits there. Many experiments were conducted for the two approaches with different values of the PSO and GA parameters. In these experiments, the PSO approach obtained better results, with a PSNR of 53 dB and an MSE of 0.0039, whereas the GA approach obtained a PSNR of 50.5 dB and an MSE of 0.0048 when using a population size of 100, 150 iterations, and 3×3 blocks. From these results, we note that a small block size can affect the quality of PSO/GA-based image watermarking because it increases the search area of the watermarking image. The best PSO results were obtained when using a swarm size of 100.
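
A minimal sketch of the embedding and quality-measurement steps that the two optimizers operate on is given below, using PyWavelets; the additive embedding rule, strength alpha, and coefficient selection are simplified placeholders rather than the paper's exact scheme.

```python
# A toy DWT watermark embed plus PSNR/MSE measurement; images and bits
# are random placeholders.
import numpy as np
import pywt

def embed(cover: np.ndarray, wm_bits: np.ndarray, alpha=8.0):
    LL, (LH, HL, HH) = pywt.dwt2(cover.astype(float), "haar")
    flat = HH.ravel()
    idx = np.argsort(-np.abs(flat))[: wm_bits.size]   # high-energy coefficients
    flat[idx] += alpha * (2 * wm_bits - 1)            # +/- alpha per bit
    return pywt.idwt2((LL, (LH, HL, flat.reshape(HH.shape))), "haar"), idx

def psnr(a, b):
    mse = np.mean((a.astype(float) - b.astype(float)) ** 2)
    return 10 * np.log10(255.0 ** 2 / mse), mse

cover = np.random.default_rng(0).integers(0, 256, (64, 64)).astype(np.uint8)
bits = np.random.default_rng(1).integers(0, 2, 128)
marked, idx = embed(cover, bits)
print("PSNR %.1f dB, MSE %.4f" % psnr(cover, np.clip(marked, 0, 255)))
```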

Keywords: image watermarking, genetic algorithm, particle swarm optimization, discrete wavelet transform

Procedia PDF Downloads 195
562 Development of a Consult Liaison Psychology Service: A Systematic Review

Authors: Ben J. Lippe

Abstract:

Consult Liaison Psychology services are growing rapidly, given the robust empirical support for the utility of this service in hospital settings. These psychological services, including clinical assessment, applied psychotherapy, and consultation with other healthcare providers, have been shown to improve health outcomes for patients and to bolster important areas of administrative interest, such as decreased length of patient admission. However, there is little descriptive literature outlining the process and mechanisms of building or developing a Consult Liaison Psychology service. This conceptual work aims to elucidate the essential methods involved in developing consult liaison psychology programs, drawing on thorough reviews of the relevant behavioral health literature and on experiential outcomes. The diverse range of hospital settings and healthcare systems makes a “blueprint” method of program development challenging to define, yet the structural frameworks presented here, based on the relevant literature and applied practice, can lay critical groundwork for program development in this growing area of psychological service. This conceptual approach addresses the prominent processes, as well as common programmatic and clinical pitfalls, involved in the development of a Consult Liaison Psychology service. This paper, including a systematic review of the relevant literature, is intended to serve as a key reference for the development of Consult Liaison Psychology services and other related behavioral health programs, and to help inform further research efforts.

Keywords: behavioral health, consult liaison, health psychology, psychology program development

Procedia PDF Downloads 122
561 Comprehensive Validation of High-Performance Liquid Chromatography-Diode Array Detection (HPLC-DAD) for Quantitative Assessment of Caffeic Acid in Phenolic Extracts from Olive Mill Wastewater

Authors: Layla El Gaini, Majdouline Belaqziz, Meriem Outaki, Mariam Minhaj

Abstract:

In this study, we introduce and validate a high-performance liquid chromatography method with diode-array detection (HPLC-DAD) specifically designed for the accurate quantification of caffeic acid in phenolic extracts obtained from olive mill wastewater. The separation of caffeic acid was effectively achieved using an Acclaim Polar Advantage column (5 µm, 250 × 4.6 mm). A multi-step gradient mobile phase was employed, comprising water acidified with phosphoric acid (pH 2.3) and acetonitrile, to ensure optimal separation. Diode-array detection was conducted within the UV-VIS spectrum, spanning a range of 200-800 nm, which facilitated precise analytical results. The method underwent comprehensive validation, addressing several essential analytical parameters, including specificity, repeatability, and linearity, as well as the limits of detection and quantification, alongside measurement uncertainty. The generated linear standard curves displayed high correlation coefficients, underscoring the method's efficacy and consistency. This validated approach is not only robust but also demonstrates exceptional reliability for the focused analysis of caffeic acid within the intricate matrices of wastewater, thus offering significant potential for applications in environmental and analytical chemistry.
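
As an illustration of the linearity and detection-limit arithmetic involved in such a validation, the sketch below fits a calibration line and applies the ICH-style formulas LOD = 3.3*s/slope and LOQ = 10*s/slope; the concentration and peak-area values are invented.

```python
# A toy HPLC calibration: linear fit, R^2, and LOD/LOQ from the
# residual standard deviation; data values are illustrative only.
import numpy as np

conc = np.array([1.0, 2.5, 5.0, 10.0, 25.0, 50.0])         # µg/mL standards
area = np.array([12.1, 30.4, 61.2, 121.9, 304.1, 609.8])   # peak areas

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
r = np.corrcoef(conc, area)[0, 1]
s_resid = np.sqrt(np.sum((area - pred) ** 2) / (len(conc) - 2))  # residual SD

lod = 3.3 * s_resid / slope
loq = 10.0 * s_resid / slope
print(f"slope={slope:.3f}, R^2={r**2:.5f}, LOD={lod:.3f}, LOQ={loq:.3f} µg/mL")
```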

Keywords: high-performance liquid chromatography (HPLC-DAD), caffeic acid analysis, olive mill wastewater phenolics, analytical method validation

Procedia PDF Downloads 35
560 Orthogonal Metal Cutting Simulation of Steel AISI 1045 via Smoothed Particle Hydrodynamic Method

Authors: Seyed Hamed Hashemi Sohi, Gerald Jo Denoga

Abstract:

Machining, or metal cutting, is one of the most widely used production processes in industry. The quality of the process and the resulting machined product depends on parameters like tool geometry, material, and cutting conditions. However, the relationships of these parameters to the cutting process are often based mostly on empirical knowledge. In this study, computer modeling and simulation using LS-DYNA software and a Smoothed Particle Hydrodynamics (SPH) methodology were performed on the orthogonal metal cutting process to analyze the three-dimensional deformation of AISI 1045 medium carbon steel during machining. The simulation was performed using the following constitutive models: the Power Law model, the Johnson-Cook model, and the Zerilli-Armstrong (Z-A) model. The outcomes were compared against the simulated results obtained by Cenk Kiliçaslan using the Finite Element Method (FEM) and the empirical results of Jaspers and Filice. The analysis shows that the SPH method combined with the Zerilli-Armstrong constitutive model is a viable alternative for simulating the metal cutting process. The tangential force was overestimated by 7%, and the normal force was underestimated by 16% when compared with empirical values. The simulated flow stress versus strain at various temperatures was also validated against empirical values. The SPH method using the Z-A model has also proven to be robust against issues of time-scaling. Experimental work was also done to investigate the effects of friction, rake angle, and tool tip radius on the simulation.
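
Of the constitutive models named above, the Johnson-Cook flow-stress law is the simplest to state; the sketch below evaluates it with typical literature-style AISI 1045 parameters, which are inserted for illustration only and are not the values used in the study.

```python
# Johnson-Cook flow stress: (A + B*eps^n)(1 + C*ln(rate/rate0))(1 - T*^m);
# the AISI 1045 parameters below are illustrative literature-style values.
import numpy as np

def johnson_cook(strain, strain_rate, T,
                 A=553.1, B=600.8, n=0.234, C=0.0134, m=1.0,
                 eps0=1.0, T_room=293.0, T_melt=1733.0):
    """Flow stress in MPa at a given strain, strain rate (1/s), and T (K)."""
    T_star = (T - T_room) / (T_melt - T_room)    # homologous temperature
    return ((A + B * strain ** n)
            * (1.0 + C * np.log(strain_rate / eps0))
            * (1.0 - T_star ** m))

strains = np.linspace(0.05, 1.0, 5)
print(johnson_cook(strains, strain_rate=1e3, T=600.0))   # MPa at 600 K
```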

Keywords: metal cutting, smoothed particle hydrodynamics, constitutive models, experimental, cutting forces analyses

Procedia PDF Downloads 236
559 Artificial Intelligence in Bioscience: The Next Frontier

Authors: Parthiban Srinivasan

Abstract:

With recent advances in computational power and access to sufficient data in the biosciences, artificial intelligence methods are increasingly being used in drug discovery research. These methods are essentially a series of advanced statistics-based exercises that review the past to indicate the likely future. Our goal is to develop a model that accurately predicts biological activity and toxicity parameters for novel compounds. We have compiled a robust library of over 150,000 chemical compounds with different pharmacological properties from the literature and public domain databases. The compounds are stored in the simplified molecular-input line-entry system (SMILES) format, a commonly used text encoding for organic molecules. We utilize an automated process to generate an array of numerical descriptors (features) for each molecule, and redundant and irrelevant descriptors are eliminated iteratively. Our prediction engine is based on a portfolio of machine learning algorithms, of which we found the Random Forest algorithm to be the better choice for this analysis: we captured the non-linear relationships in the data and formed a prediction model with reasonable accuracy by averaging across a large number of randomized decision trees. Our next step is to apply a deep neural network (DNN) algorithm to predict the biological activity and toxicity properties; we expect the DNN to give better results and improve the accuracy of the prediction. This presentation will review these prominent machine learning and deep learning methods and our implementation protocols, and discuss their usefulness in biomedical and health informatics.
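
A minimal sketch of this SMILES-to-descriptors-to-Random-Forest pipeline is shown below using RDKit and scikit-learn, with a toy four-molecule dataset and invented labels standing in for the 150,000-compound library.

```python
# SMILES -> numerical descriptors -> Random Forest; data are toy values.
import numpy as np
from rdkit import Chem
from rdkit.Chem import Descriptors
from sklearn.ensemble import RandomForestClassifier

def featurize(smiles: str) -> list:
    mol = Chem.MolFromSmiles(smiles)
    return [Descriptors.MolWt(mol), Descriptors.MolLogP(mol),
            Descriptors.TPSA(mol), Descriptors.NumHDonors(mol),
            Descriptors.NumHAcceptors(mol)]

smiles = ["CCO", "c1ccccc1O", "CC(=O)Oc1ccccc1C(=O)O", "CCN(CC)CC"]
labels = [0, 1, 1, 0]                     # toy activity/toxicity labels
X = np.array([featurize(s) for s in smiles])

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, labels)
print(clf.predict(np.array([featurize("CCOC(=O)C")])))
```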

Keywords: deep learning, drug discovery, health informatics, machine learning, toxicity prediction

Procedia PDF Downloads 333
558 Modeling Breathable Particulate Matter Concentrations over Mexico City Retrieved from Landsat 8 Satellite Imagery

Authors: Rodrigo T. Sepulveda-Hirose, Ana B. Carrera-Aguilar, Magnolia G. Martinez-Rivera, Pablo de J. Angeles-Salto, Carlos Herrera-Ventosa

Abstract:

In order to diminish health risks, it is of major importance to monitor air quality; however, this monitoring comes with high costs in physical and human resources. In this context, this research is carried out with the main objective of developing a predictive model for concentrations of inhalable particles (PM10-2.5) using remote sensing. To develop the model, satellite images, mainly from Landsat 8, of Mexico City's Metropolitan Area were used. Using historical PM10 and PM2.5 measurements from RAMA (the Automatic Environmental Monitoring Network of Mexico City) and processing the available satellite images, a preliminary model was generated, from which it was possible to identify critical opportunity areas that will allow the development of a robust model. Through the preliminary model applied to the scenes of Mexico City, three kinds of areas were identified that are of great interest due to the presumed high concentration of PM: zones with high plant density, bodies of water, and soil without constructions or vegetation. To date, work continues on improving the preliminary model that has been proposed. In addition, a brief analysis was made of six models presented in articles developed in different parts of the world, in order to identify the optimal bands for the generation of a suitable model for Mexico City. It was found that infrared bands have helped modeling in other cities, but the effectiveness these bands could provide under the geographic and climatic conditions of Mexico City is still being evaluated.
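
The underlying regression idea, pairing ground PM measurements with satellite band reflectances at station pixels, can be sketched as follows; the arrays are random placeholders for the RAMA and Landsat 8 data, and the band choice is illustrative.

```python
# A stand-in band-reflectance regression for PM prediction; all data
# below are simulated placeholders, not RAMA/Landsat values.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_obs = 200
bands = rng.uniform(0.01, 0.4, size=(n_obs, 4))   # e.g. blue, red, NIR, SWIR
pm10 = 40 + 120 * bands[:, 1] - 50 * bands[:, 2] + rng.normal(0, 5, n_obs)

model = LinearRegression().fit(bands, pm10)
print("CV R^2:", cross_val_score(model, bands, pm10, cv=5).mean().round(3))
```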

Keywords: air quality, modeling pollution, particulate matter, remote sensing

Procedia PDF Downloads 130
557 Entrepreneurship Education and Student Entrepreneurial Intention: A Comprehensive Review, Synthesis of Empirical Findings, and Strategic Insights for Future Research Advancements

Authors: Abdul Waris Jalili, Yanqing Wang, Som Suor

Abstract:

This research paper explores the relationship between entrepreneurship education and students' entrepreneurial intentions. It aims to determine whether entrepreneurship education reliably predicts students' intention to become entrepreneurs, how and when this relationship occurs, and which factors mediate or moderate it. A thorough and systematic search and review of empirical articles published between 2013 and 2023 was conducted. Three databases, Google Scholar, Science Direct, and PubMed, were explored to gather relevant studies. Criteria such as reporting empirical results, publication in English, and addressing the research questions were used to select 35 papers for analysis. The collective findings of the reviewed studies suggest a generally positive relationship between entrepreneurship education and student entrepreneurial intentions. However, recent findings indicate that this relationship may be more complex than previously thought. Mediators and moderators have been identified, highlighting instances where entrepreneurship education indirectly influences student entrepreneurial intentions. The review also emphasizes the need for more robust research designs to establish causality in this field. This research adds to the existing literature by providing a comprehensive review of the relationship between entrepreneurship education and student entrepreneurial intentions. It highlights the complexity of this relationship and the importance of considering mediators and moderators. The study also calls for future research to explore different facets of entrepreneurship education independently and to examine complex relationships more comprehensively.

Keywords: entrepreneurship, entrepreneurship education, entrepreneurial intention, entrepreneurial self-efficacy

Procedia PDF Downloads 28
556 Inversion of the Spectral Analysis of Surface Waves Dispersion Curves through the Particle Swarm Optimization Algorithm

Authors: A. Cerrato Casado, C. Guigou, P. Jean

Abstract:

In this investigation, the particle swarm optimization (PSO) algorithm is used to perform the inversion of the dispersion curves in the spectral analysis of surface waves (SASW) method. This inverse problem usually presents complicated solution spaces with many local minima that make convergence to the correct solution difficult. PSO is a metaheuristic method that was originally designed to simulate social behavior but has demonstrated powerful capabilities for solving inverse problems with complex solution spaces and a high number of variables. The dispersion curves of the synthetic soils are constructed by the vertical flexibility coefficient method, which is especially convenient for soils where the stiffness does not increase gradually with depth. The reason is that these types of soil profiles are not normally dispersive, since the dominant mode of Rayleigh waves is usually not coincident with the fundamental mode. Multiple synthetic soil profiles have been tested to show the characteristics of the convergence process and assess the accuracy of the final soil profile. In addition, the inversion procedure is applied to multiple real soils and the final profile compared with the available information. The combination of the vertical flexibility coefficient method to obtain the dispersion curve and the PSO algorithm to carry out the inversion process proves to be a robust procedure that is able to provide good solutions for complex soil profiles even with scarce prior information.
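
A generic version of the PSO inversion loop is sketched below: particles are candidate layer shear-wave-velocity profiles, and the misfit compares observed and modeled dispersion curves. The forward_dispersion() function is a hypothetical stand-in for the vertical flexibility coefficient forward model.

```python
# Generic PSO minimizing a dispersion-curve misfit; the forward model
# is a placeholder, NOT a real SASW computation.
import numpy as np

rng = np.random.default_rng(0)

def forward_dispersion(vs_profile, freqs):
    # placeholder: phase velocity trends from deep Vs to shallow Vs
    return np.interp(freqs, [5, 50], [vs_profile[-1], vs_profile[0]])

freqs = np.linspace(5, 50, 20)
observed = forward_dispersion(np.array([180.0, 260.0, 420.0]), freqs)

def misfit(p):
    return np.sum((forward_dispersion(p, freqs) - observed) ** 2)

n_part, n_dim, lo, hi = 30, 3, 100.0, 600.0
x = rng.uniform(lo, hi, (n_part, n_dim)); v = np.zeros_like(x)
pbest = x.copy(); pbest_f = np.array([misfit(p) for p in x])
gbest = pbest[pbest_f.argmin()].copy()

for it in range(200):
    r1, r2 = rng.random((2, n_part, n_dim))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)
    f = np.array([misfit(p) for p in x])
    better = f < pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[pbest_f.argmin()].copy()

print("recovered Vs profile (m/s):", gbest.round(1))
```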

Keywords: dispersion, inverse problem, particle swarm optimization, SASW, soil profile

Procedia PDF Downloads 155
555 Development of a Web-Based Application for Intelligent Fertilizer Management in Rice Cultivation

Authors: Hao-Wei Fu, Chung-Feng Kao

Abstract:

In the era of rapid technological advancement, information technology (IT) has become integral to modern life, exerting significant influence across diverse sectors and serving as a catalyst for development in various industries. Within agriculture, the integration of IT offers substantial benefits, notably enhancing operational efficiency; real-time monitoring systems, for instance, have been widely embraced in agriculture and have effectively improved crop management practices. This study addresses the management of rice panicle fertilizer, presenting the development of a web application tailored to handle the associated data. Leveraging the normalized difference red edge (NDRE) index, the application optimizes the quantity of rice panicle fertilizer used, providing recommendations to agricultural stakeholders and service providers in the agricultural information sector, with the overarching objective of minimizing costs while maximizing yields. Furthermore, a robust database system has been established to store and manage relevant data for future reference in rice cultivation management. Additionally, the study utilizes the Representational State Transfer (REST) software architectural style to construct an application programming interface (API), facilitating data creation, retrieval, updating, and deletion for users via HyperText Transfer Protocol (HTTP) methods. Future plans involve integrating this API with third-party services to incorporate it into larger frameworks, thus catering to the diverse requirements of various third-party services.
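
A minimal sketch of such a REST-style CRUD interface is given below using Flask; the route names and record schema are hypothetical, not the study's actual API.

```python
# A toy REST CRUD API for fertilizer records; endpoints and fields
# are invented placeholders.
from flask import Flask, request, jsonify

app = Flask(__name__)
records = {}          # in-memory stand-in for the fertilizer database
next_id = 1

@app.post("/fields")
def create():
    global next_id
    records[next_id] = request.get_json()   # e.g. {"ndre": 0.41, "dose_kg": 30}
    next_id += 1
    return jsonify(id=next_id - 1), 201

@app.get("/fields/<int:rid>")
def read(rid):
    return jsonify(records[rid]) if rid in records else ("", 404)

@app.put("/fields/<int:rid>")
def update(rid):
    records[rid] = request.get_json()
    return jsonify(records[rid])

@app.delete("/fields/<int:rid>")
def delete(rid):
    records.pop(rid, None)
    return "", 204

if __name__ == "__main__":
    app.run(debug=True)
```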

Keywords: application programming interface, HyperText Transfer Protocol, nitrogen fertilizer intelligent management, web-based application

Procedia PDF Downloads 31
554 Genome Editing in Sorghum: Advancements and Future Possibilities: A Review

Authors: Micheale Yifter Weldemichael, Hailay Mehari Gebremedhn, Teklehaimanot Hailesslasie

Abstract:

The advancement of target-specific genome editing tools, including clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated protein 9 (Cas9), mega-nucleases, base editing (BE), prime editing (PE), transcription activator-like effector nucleases (TALENs), and zinc-finger nucleases (ZFNs), has paved the way for a modern era of gene editing. CRISPR/Cas9, as a versatile, simple, cost-effective, and robust system for genome editing, has dominated the genome manipulation field over the last few years. The application of CRISPR/Cas9 to sorghum improvement is particularly vital in the context of ecological, environmental, and agricultural challenges, as well as global climate change. In this context, gene editing using CRISPR/Cas9 can improve nutritional value, yield, resistance to pests and disease, and tolerance to different abiotic stresses. Moreover, CRISPR/Cas9 can potentially perform complex editing to reshape already available elite varieties and create new genetic variation. Existing research is targeted at further improving the effectiveness of CRISPR/Cas9 genome editing techniques so as to fruitfully edit endogenous sorghum genes. These findings suggest that genome editing is a feasible and successful venture in sorghum. Newer improvements and developments of CRISPR/Cas9 techniques have further enabled researchers to modify additional genes in sorghum with improved efficiency. The fruitful application and development of CRISPR techniques for genome editing in sorghum will help not only in gene discovery, in creating new, improved traits, in regulating gene expression, and in sorghum functional genomics, but also in making site-specific integration events possible.
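
As a small illustration of one step in CRISPR/Cas9 target design, the sketch below scans a sequence for 20-nt protospacers followed by an NGG PAM; the sequence is a made-up fragment, not a real sorghum gene.

```python
# Find 20-mer protospacers followed by an NGG PAM on the forward strand;
# the input sequence is invented for illustration.
import re

seq = "ATGGCGTTTACCGGAGATCAAGCTTGGCCAACGGTATCGGATCCGGTACGGATCATGG"

def find_protospacers(dna: str):
    """Return (start, protospacer, PAM) for every 20-mer with an NGG PAM."""
    hits = []
    for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", dna):
        hits.append((m.start(), m.group(1), m.group(2)))
    return hits

for pos, spacer, pam in find_protospacers(seq):
    print(f"{pos:3d}  {spacer}  PAM={pam}")
```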

Keywords: CRISPR/Cas9, genome editing, quality, sorghum, stress, yield

Procedia PDF Downloads 34
553 Role of Kerala’s Diaspora Philanthropy Engagement During Economic Crises

Authors: Shibinu S, Mohamed Haseeb N

Abstract:

In times of crisis, the diaspora's role and the help it offers are vital in determining how many countries recover, particularly low- and middle-income nations that rely significantly on remittances. Some 21.2 lakh (2.12 million) Keralites have emigrated abroad, with 81.2 percent of these outflows going to the Gulf Cooperation Council (GCC) countries. Most of them are semi-skilled or low-skilled laborers employed in GCC nations, while a sizeable portion of migrants are employed in industrialized nations like the UK and the US. These nations have seen the development of a highly robust Indian diaspora. India's development is largely dependent on the generosity of its diaspora, and the nation has benefited greatly from the substantial contributions made by several emigrant generations. This strength was especially noticeable during COVID-19 and the Kerala floods. The 2018 Kerala floods displaced millions of people, damaged millions of properties, and caused many deaths. The Malayalee diaspora played a crucial role in the reconstruction of Kerala by supporting the rescue efforts underway on the ground through its extensive worldwide network. An analogous outreach was also noted during COVID-19, when the diaspora assisted stranded migrants across the globe. Together with the work the diaspora has done for the state's development and recovery, there has also been a recent outpouring of assistance during the COVID-19 pandemic. The study focuses on the subtleties of diaspora philanthropy and how it enabled Kerala to recover from the COVID-19 pandemic and the floods. Semi-structured in-depth interviews were conducted with migrants, migrant organizations, and beneficiaries, recruited through snowball sampling, to better understand the role that diaspora philanthropy plays in times of crisis.

Keywords: crises, diaspora, remittances, COVID-19, flood, economic development of Kerala

Procedia PDF Downloads 8
552 Combining Diffusion Maps and Diffusion Models for Enhanced Data Analysis

Authors: Meng Su

Abstract:

High-dimensional data analysis often presents challenges in capturing the complex, nonlinear relationships and manifold structures inherent to the data. This article presents a novel approach that leverages the strengths of two powerful techniques, Diffusion Maps and Diffusion Probabilistic Models (DPMs), to address these challenges. By integrating the dimensionality reduction capability of Diffusion Maps with the data modeling ability of DPMs, the proposed method aims to provide a comprehensive solution for analyzing and generating high-dimensional data. The Diffusion Map technique preserves the nonlinear relationships and manifold structure of the data by mapping it to a lower-dimensional space using the eigenvectors of the graph Laplacian matrix. Meanwhile, DPMs capture the dependencies within the data, enabling effective modeling and generation of new data points in the low-dimensional space. The generated data points can then be mapped back to the original high-dimensional space, ensuring consistency with the underlying manifold structure. Through a detailed example implementation, the article demonstrates the potential of the proposed hybrid approach to achieve more accurate and effective modeling and generation of complex, high-dimensional data. Furthermore, it discusses possible applications in various domains, such as image synthesis, time-series forecasting, and anomaly detection, and outlines future research directions for enhancing the scalability, performance, and integration with other machine learning techniques. By combining the strengths of Diffusion Maps and DPMs, this work paves the way for more advanced and robust data analysis methods.
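
A minimal sketch of the Diffusion Maps half of the pipeline is shown below: build a Gaussian-kernel affinity graph, row-normalize it into a Markov transition matrix, and embed the data with the leading non-trivial eigenvectors; a toy noisy circle stands in for high-dimensional input, and the kernel bandwidth is illustrative.

```python
# Diffusion map embedding of a toy noisy circle; eps is an assumed
# kernel bandwidth, not a tuned value.
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)
t = rng.uniform(0, 2 * np.pi, 300)
X = np.c_[np.cos(t), np.sin(t)] + rng.normal(0, 0.05, (300, 2))

eps = 0.5
K = np.exp(-cdist(X, X, "sqeuclidean") / eps)   # Gaussian affinity kernel
P = K / K.sum(axis=1, keepdims=True)            # row-normalized Markov matrix

vals, vecs = np.linalg.eig(P)
order = np.argsort(-vals.real)
# skip the trivial eigenvector (constant, eigenvalue 1)
embedding = vecs[:, order[1:3]].real * vals[order[1:3]].real
print(embedding.shape)   # (300, 2) diffusion coordinates
```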

Keywords: diffusion maps, diffusion probabilistic models (DPMs), manifold learning, high-dimensional data analysis

Procedia PDF Downloads 69
551 Low Temperature Biological Treatment of Chemical Oxygen Demand for Agricultural Water Reuse Application Using Robust Biocatalysts

Authors: Vedansh Gupta, Allyson Lutz, Ameen Razavi, Fatemeh Shirazi

Abstract:

The agriculture industry is especially vulnerable to forecasted water shortages. In the fresh and fresh-cut produce sector, conventional flume-based washing with recirculation exhibits high water demand. This leads to a large water footprint and possible cross-contamination of pathogens. These can be alleviated through advanced water reuse processes, such as membrane technologies including reverse osmosis (RO). Water reuse technologies effectively remove dissolved constituents but can easily foul without pre-treatment. Biological treatment is effective for the removal of organic compounds responsible for fouling, but not at the low temperatures encountered at most produce processing facilities. This study showed that the Microvi MicroNiche Engineering (MNE) technology effectively removes organic compounds (> 80%) at low temperatures (6-8 °C) from wash water. The MNE technology uses synthetic microorganism-material composites with negligible solids production, making it advantageously situated as an effective bio-pretreatment for RO. A preliminary technoeconomic analysis showed 60-80% savings in operation and maintenance costs (OPEX) when using the Microvi MNE technology for organics removal. This study and the accompanying economic analysis indicated that the proposed technology process will substantially reduce the cost barrier for adopting water reuse practices, thereby contributing to increased food safety and furthering sustainable water reuse processes across the agricultural industry.

Keywords: biological pre-treatment, innovative technology, vegetable processing, water reuse, agriculture, reverse osmosis, MNE biocatalysts

Procedia PDF Downloads 102
550 Cell-Based and Exosome Treatments for Hair Restoration

Authors: Armin Khaghani Boroujeni, Leila Dehghani, Parham Talebi Boroujeni, Sahar Rostamian, Ali Asilian

Abstract:

Background: Hair loss is a common complaint observed in both genders, and androgenetic alopecia is a well-known pattern of hair loss. To comprehensively evaluate new regenerative strategies (PRP, A-SC-BT, conditioned media, and exosome-based treatments) compared to conventional therapies for hair loss or hair regeneration, we carried out this updated systematic review. Methods: The available online databases, including ISI Web of Science, Scopus, and PubMed, were searched systematically up to February 2022. The quality assessment of the included studies was done using the Cochrane Collaboration's tool. Results: In total, 90 studies involving 2345 participants were included in the present review. The enrolled studies were conducted between 2010 and 2022, and the subjects' mean age ranged from 19 to 55.11 years. Approaches using platelet-rich plasma (PRP) provide a beneficial impact on hair regrowth, whereas other cell-based therapies, including stem cell transplant, stem cell-derived conditioned medium, and stem cell-derived exosomes, revealed conflicting evidence. Conclusion: Cell-based therapies for hair loss are still in their infancy, however, and more robust clinical studies are needed to better evaluate their mechanisms of action, efficacy, safety, benefits, and limitations. In this review, we provide the resources for, and a more detailed description of, the latest clinical studies concerning cell-based therapies in hair loss.

Keywords: cell-based therapy, exosome, hair restoration, systematic review

Procedia PDF Downloads 50
549 Hexane Extract of Thymus serpyllum L.: GC-MS Profile, Antioxidant Potential and Anticancer Impact on HepG2 (Liver Carcinoma) Cell Line

Authors: Salma Baig, Bakrudeen Ali Ahmad, Ainnul Hamidah Syahadah Azizan, Hapipah Mohd Ali, Elham Rouhollahi, Mahmood Ameen Abdulla

Abstract:

Free radical damage induced by reactive oxygen species (ROS) contributes to the etiology of many chronic diseases, cancer being one of them. Recent studies have been successful in ROS-targeted therapies via antioxidants using mouse models in cancer therapeutics. The present study was designed to scrutinize the anticancer and antioxidant activity of five different extracts of Thymus serpyllum in MDA-MB-231, MCF-7, HepG2, HCT-116, PC3, and A549 cell lines. Identification of the phytochemicals present in the most active extract of Thymus serpyllum was conducted using gas chromatography coupled with mass spectrometry, and antioxidant activity was measured using the DPPH radical scavenging and FRAP assays. The anticancer impact of the extract in terms of IC50 was evaluated using the MTT cell viability assay. Results revealed that the hexane extract showed the best anticancer activity in HepG2 (liver carcinoma cell line), with an IC50 value of 23 ± 0.14 µg/ml, followed by 25 µg/ml in HCT-116 (colon cancer cell line), 30 µg/ml in MCF-7 (breast cancer cell line), 35 µg/ml in MDA-MB-231 (breast cancer cell line), 57 µg/ml in PC3 (prostate cancer cell line), and 60 µg/ml in A549 (lung carcinoma cell line). The GC-MS profile of the hexane extract showed the presence of 31 compounds, with carvacrol, thymol, and thymoquinone being the major compounds. Phenolics such as vitamin E, terpinen-4-ol, borneol, and phytol were also identified. Hence, we present here the first report on the cytotoxicity of the hexane extract of Thymus serpyllum in the HepG2 cell line, with a robust anticancer activity with an IC50 of 23 ± 0.14 µg/ml.
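
As an illustration of how an IC50 such as the 23 µg/ml value is typically extracted from MTT viability data, the sketch below fits a four-parameter logistic curve; the dose-response numbers are invented.

```python
# A toy IC50 estimate via a four-parameter logistic fit; the viability
# data are illustrative, not the study's measurements.
import numpy as np
from scipy.optimize import curve_fit

dose = np.array([3.1, 6.25, 12.5, 25, 50, 100])             # µg/ml
viability = np.array([95.0, 88.0, 71.0, 48.0, 22.0, 9.0])   # % of control

def four_pl(x, bottom, top, ic50, hill):
    return bottom + (top - bottom) / (1 + (x / ic50) ** hill)

popt, _ = curve_fit(four_pl, dose, viability, p0=[0, 100, 20, 1])
print(f"estimated IC50 = {popt[2]:.1f} µg/ml")
```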

Keywords: Thymus serpyllum L., hexane extract, GC-MS profile, antioxidant activity, anticancer activity, HepG2 cell line

Procedia PDF Downloads 468
548 A Context Aware Mobile Learning System with a Cognitive Recommendation Engine

Authors: Jalal Maqbool, Gyu Myoung Lee

Abstract:

Using smart devices for context-aware mobile learning is becoming increasingly popular. This has led to mobile learning technology becoming an indispensable part of today's learning environments and platforms. However, some fundamental issues remain: mobile learning still lacks the ability to truly understand human reaction and user behaviour. This is because current mobile learning systems are passive and not aware of learners' changing contextual situations; they rely on static information about mobile learners. In addition, current mobile learning platforms lack the capability to incorporate dynamic contextual situations into learners' preferences. This thesis therefore aims to address these issues by designing a context-aware framework which is able to sense a learner's contextual situation, handle data dynamically, and use contextual information to suggest bespoke learning content according to the learner's preferences. This is to be underpinned by a robust recommendation system capable of performing these functions, thus providing learners with a truly context-aware mobile learning experience, delivering learning content on smart devices and adapting to learning preferences as and when required. The design of the recommendation engine's algorithm has to be based on learner and application needs, personal characteristics, and circumstances, and it must be able to comprehend human cognitive processes so that the technology can interact effectively and deliver mobile learning content that is relevant to the learner's contextual situation. The concept of this proposed project is to provide a new method of smart learning, based on a capable recommendation engine, for an intuitive mobile learning model driven by learner actions.

Keywords: aware, context, learning, mobile

Procedia PDF Downloads 212
547 Software-Defined Architecture and Front-End Optimization for DO-178B Compliant Distance Measuring Equipment

Authors: Farzan Farhangian, Behnam Shakibafar, Bobda Cedric, Rene Jr. Landry

Abstract:

Many air navigation technologies are capable of increasing aviation sustainability as well as improving accuracy in Alternative Positioning, Navigation, and Timing (APNT); examples include avionics Distance Measuring Equipment (DME) and the Very high-frequency Omni-directional Range (VOR). The integration of these air navigation solutions could provide robust and efficient accuracy for air mobility, air traffic management, and autonomous operations. Designing a proper RF front-end, power amplifier, and software-defined transponder could pave the way to an optimized avionics navigation solution. In this article, the possibility of reaching an optimal front-end to be used with a single low-cost Software-Defined Radio (SDR) has been investigated in order to reach a software-defined DME architecture. Our software-defined approach uses firmware capabilities to design a real-time software architecture compatible with a Multiple Input Multiple Output (MIMO) BladeRF, estimating an accurate time delay between the transmission (Tx) and reception (Rx) channels using synchronous scheduled communication. We designed a novel power amplifier for the transmission channel of the DME to meet the minimum transmission power requirement. This article also investigates the design of proper pulse pairs based on the DO-178B avionics standard. Various guidelines have been tested, and the possibility of passing the certification process for each standard term has been analyzed. Finally, the performance of the DME was tested in a laboratory environment using an IFR6000, which showed that the proposed architecture reached an accuracy of better than 0.23 nautical miles (Nmi) with 98% probability.
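
The delay-estimation core of such a software-defined DME can be sketched as follows: cross-correlate the transmitted pulse pair with the received stream, subtract the transponder's fixed 50 µs reply delay, and convert the two-way time to range. All signal parameters below are invented for illustration.

```python
# Toy DME range estimation via cross-correlation; sample rate, pulse
# shape, spacing, and noise level are assumed values.
import numpy as np

fs = 10e6                                  # sample rate (10 MHz, assumed)
c_nmi_per_s = 161_875                      # speed of light in NM/s (approx.)
t = np.arange(0, 40e-6, 1 / fs)
pulse = np.exp(-0.5 * ((t - 6e-6) / 1.5e-6) ** 2)       # Gaussian pulse
tx = pulse + np.roll(pulse, int(12e-6 * fs))            # 12 µs pulse pair

true_range_nmi = 25.0
delay_s = 2 * true_range_nmi / c_nmi_per_s + 50e-6      # two-way + reply delay
rx = np.zeros(int(fs * 500e-6))
k = int(delay_s * fs)
rx[k:k + tx.size] += tx
rx += np.random.default_rng(0).normal(0, 0.05, rx.size)  # receiver noise

corr = np.correlate(rx, tx, mode="valid")
est_delay = corr.argmax() / fs
est_range = (est_delay - 50e-6) * c_nmi_per_s / 2
print(f"estimated range: {est_range:.2f} NM")
```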

Keywords: avionics, DME, software defined radio, navigation

Procedia PDF Downloads 50
546 Application of the Urban Forest Credit Standard as a Tool for Compensating CO2 Emissions in the Metalworking Industry: A Case Study in Brazil

Authors: Marie Madeleine Sarzi Inacio, Ligiane Carolina Leite Dauzacker, Rodrigo Henriques Lopes Da Silva

Abstract:

The climate change resulting from human activity has increased interest in more sustainable production practices to reduce and offset pollutant emissions. Brazil, with its vast areas capable of carbon absorption, holds a significant advantage in this context. However, to realize the country's sustainable potential, it is important to establish a robust carbon market with clear rules for the eligibility and validation of projects aimed at reducing and offsetting Greenhouse Gas (GHG) emissions. In this study, our objective is to conduct a feasibility analysis, through a case study, of the implementation of an urban forest credit standard in Brazil, using the Urban Forest Credits (UFC) model implemented in the United States as a reference. The city of Ribeirão Preto, Brazil, was selected to assess the availability of green areas, and the CO2 emissions of the metalworking industry provided the offset demand analyzed in the case study. QGIS, which can connect to various types of geospatial databases, was used to map potential urban forest areas. Although the chosen municipality has little vegetative coverage, the mapping identified at least eight areas within the delimited urban perimeter that fit the standard's definitions. The outlook is positive: the implementation of projects like Urban Forest Credits (UFC), adapted to the Brazilian reality, has great potential to benefit the country in the carbon market and to contribute to achieving its Greenhouse Gas (GHG) emission reduction goals.

Keywords: carbon neutrality, metalworking industry, carbon credits, urban forestry credits

Procedia PDF Downloads 51
545 Process Development of pVAX1/lacZ Plasmid DNA Purification Using Design of Experiment

Authors: Asavasereerat K., Teacharsripaitoon T., Tungyingyong P., Charupongrat S., Noppiboon S., Hochareon L., Kitsuban P.

Abstract:

The third generation of vaccines is based on gene therapy, in which DNA is introduced into patients. The antigenic or therapeutic proteins encoded by the transgene DNA trigger an immune response to counteract various diseases. Moreover, DNA vaccines offer customizable protection and treatment with high stability, so their production has become of interest. According to the USFDA guidance for industry, the recommended limit for host-cell impurities is below 1%, and the homogeneity of the active conformation, supercoiled DNA, should be more than 80%. Thus, a purification strategy using two-step chromatography has been established and verified for its robustness. Herein, pVAX1/lacZ, a pre-approved USFDA DNA vaccine backbone, was used and transformed into E. coli strain DH5α. Three purification process parameters, namely the sample-loading flow rate and the salt concentrations in the washing and eluting buffers, were studied, and the experiment was designed using the response surface method with a central composite face-centered (CCF) design. The designed range of the selected parameters was a 10% variation from the optimized set point, as a safety factor. The purity, as the percentage of the supercoiled conformation obtained from each chromatography step (AIEX and HIC), was analyzed by HPLC. The response data were used to establish regression models, statistically analyzed, and followed by Monte Carlo simulation using SAS JMP. The purities of the product obtained from AIEX and HIC are between 89.4-92.5% and 88.3-100.0%, respectively. The Monte Carlo simulation showed that the pVAX1/lacZ purification process is robust, with 90% confidence intervals of 90.18-91.00% and 95.88-100.00% for AIEX and HIC, respectively.
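
The response-surface and Monte Carlo logic can be sketched as follows: fit (or, here, posit) a quadratic purity model over the three process parameters and sample the ±10% operating window; the coefficients and intervals below are invented, not the study's CCF results.

```python
# Monte Carlo robustness check over a hypothetical quadratic response
# surface; all coefficients and set points are invented.
import numpy as np

rng = np.random.default_rng(0)

def purity(flow, wash_salt, elute_salt):       # hypothetical fitted surface
    return (90.5 - 1.2 * (flow - 1) ** 2
            - 0.8 * (wash_salt - 0.5) ** 2
            + 0.6 * (elute_salt - 1) - 0.3 * (elute_salt - 1) ** 2)

set_pt = np.array([1.0, 0.5, 1.0])             # optimized set points
n = 100_000                                    # Monte Carlo draws
samples = set_pt * rng.uniform(0.9, 1.1, (n, 3))   # +/-10% uniform variation
results = purity(samples[:, 0], samples[:, 1], samples[:, 2])

lo, hi = np.percentile(results, [5, 95])       # central 90% interval
print(f"90% of simulated runs give purity in [{lo:.2f}, {hi:.2f}] %")
```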

Keywords: AIEX, DNA vaccine, HIC, purification, response surface method, robustness

Procedia PDF Downloads 183
544 Mutual Fund Anchoring Bias with its Parent Firm Performance: Evidence from Mutual Fund Industry of Pakistan

Authors: Muhammad Tahir

Abstract:

Purpose: The purpose of the study is to detect anchoring bias in mutual fund returns relative to parent firm performance in Pakistan. Research methodology: The paper uses the monthly returns of equity funds whose parent firm existed from 2011 to 2021, along with the parent firm's returns. Proximity to the 52-week highest return is calculated by dividing the fund return by the parent firm's 52-week highest return. Control variables are also included, and a panel regression model is used to estimate the results; for robustness, a feasible generalized least squares (FGLS) model is estimated as well. Findings: The results show that anchoring bias exists in mutual fund returns with respect to parent firm performance, and the FGLS results reaffirm the panel regression results. The proximity to the 52-week high (Xc) is significant in both models. Research implication: Since most mutual funds have a parent firm, anchoring bias is found in mutual fund returns relative to parent firm performance. Practical implication: Mutual fund investors in Pakistan invest in equity funds in which this behavioral bias exists, although there might be better opportunities in the market. Originality/value addition: Our research is a pioneering study investigating anchoring bias in mutual fund returns relative to parent firm performance. Research limitations: Our sample is limited to 23 equity funds that have a parent firm and for which data were available from 2011 to 2021.
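
A minimal sketch of the panel estimation described above is given below, using the linearmodels package on simulated fund-month data; the variable names (prox_52wk, size) are stand-ins for the study's actual series, and the FGLS robustness check is not reproduced here.

```python
# Entity-effects panel regression on simulated fund-month data; all
# values and variable names are placeholders for the study's series.
import numpy as np
import pandas as pd
from linearmodels.panel import PanelOLS

rng = np.random.default_rng(0)
funds, months = 23, 120
idx = pd.MultiIndex.from_product(
    [range(funds), pd.date_range("2011-01-01", periods=months, freq="MS")],
    names=["fund", "month"])

df = pd.DataFrame({
    "prox_52wk": rng.uniform(0.5, 1.0, funds * months),  # return / parent 52w high
    "size": rng.normal(0, 1, funds * months),            # control variable
}, index=idx)
df["ret"] = 0.02 * df["prox_52wk"] + 0.01 * df["size"] \
    + rng.normal(0, 0.03, len(df))

model = PanelOLS(df["ret"], df[["prox_52wk", "size"]], entity_effects=True)
print(model.fit(cov_type="clustered", cluster_entity=True).summary)
```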

Keywords: mutual fund, anchoring bias, 52-week high return, proximity to 52-week high, parent firm performance, panel regression, FGLS

Procedia PDF Downloads 94