Search results for: framework conditions
13798 Efficient Mercury Sorbent: Activated Carbon and Metal Organic Framework Hybrid
Authors: Yongseok Hong, Kurt Louis Solis
Abstract:
In the present study, a hybrid sorbent combining the metal organic framework (MOF) UiO-66 and powdered activated carbon (pAC) is synthesized to remove cationic and anionic metals simultaneously. UiO-66 is an octahedron-shaped MOF with a Zr₆O₄(OH)₄ metal node and a 1,4-benzenedicarboxylic acid (BDC) organic linker. Zr-based MOFs are attractive for trace element remediation in wastewaters because Zr is relatively non-toxic compared to other classes of MOF and therefore will not cause secondary pollution. Most remediation studies with UiO-66 target anions such as fluoride, but trace element oxyanions such as arsenic, selenium, and antimony have also been investigated. There have also been studies of mercury removal by UiO-66 derivatives; however, these require post-synthetic modifications or have lower effective surface areas. Activated carbon is a readily available, well-studied, effective adsorbent for metal contaminants. A solvothermal method was employed to prepare the hybrid sorbent from UiO-66 and activated carbon, which could be used to remove mercury and selenium simultaneously. The hybrid sorbent was characterized using FESEM-EDS, FT-IR, XRD, and TGA. The results showed that UiO-66 and activated carbon were successfully composited. From BET studies, the hybrid sorbent has a surface area (S_BET) of 1051 m² g⁻¹. Adsorption studies showed maximum adsorption capacities of 204.63 mg g⁻¹ and 168 mg g⁻¹ for Hg(II) and selenite, respectively, with both species following the Langmuir model. Kinetics studies revealed that Hg uptake by the hybrid is pseudo-second-order with a rate constant of 5.6 × 10⁻⁵ g mg⁻¹ min⁻¹, while selenite uptake follows the simplified Elovich model with α = 2.99 mg g⁻¹ min⁻¹ and β = 0.032 g mg⁻¹.
Keywords: adsorption, flue gas wastewater, mercury, selenite, metal organic framework
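For orientation, the reported fit parameters can be dropped into the standard model equations. Below is a minimal Python sketch (not the authors' code) that evaluates the integrated pseudo-second-order and Elovich uptake curves from the constants quoted above; the Langmuir constant K_L is not reported in the abstract, so the value used in the last line is an arbitrary placeholder.

```python
import numpy as np

# Fit parameters quoted in the abstract:
Q_MAX_HG = 204.63   # mg g^-1, Langmuir maximum capacity for Hg(II)
K2_HG = 5.6e-5      # g mg^-1 min^-1, pseudo-second-order rate constant
ALPHA_SE = 2.99     # mg g^-1 min^-1, Elovich initial rate (selenite)
BETA_SE = 0.032     # g mg^-1, Elovich desorption constant (selenite)

def pseudo_second_order(t, qe, k2):
    """Integrated pseudo-second-order model: q(t) = qe^2*k2*t / (1 + qe*k2*t)."""
    return qe**2 * k2 * t / (1.0 + qe * k2 * t)

def elovich(t, alpha, beta):
    """Integrated Elovich model: q(t) = (1/beta)*ln(1 + alpha*beta*t);
    the 'simplified' variant drops the +1 for large t."""
    return np.log(1.0 + alpha * beta * t) / beta

def langmuir(ce, q_max, k_l):
    """Langmuir isotherm: qe = q_max*K_L*Ce / (1 + K_L*Ce)."""
    return q_max * k_l * ce / (1.0 + k_l * ce)

t = np.linspace(1, 24 * 60, 500)                    # minutes
print(pseudo_second_order(t, Q_MAX_HG, K2_HG)[-1])  # Hg(II) uptake after 24 h
print(elovich(t, ALPHA_SE, BETA_SE)[-1])            # selenite uptake after 24 h
print(langmuir(50.0, Q_MAX_HG, 0.1))                # qe at Ce = 50 mg/L, K_L assumed
```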
Procedia PDF Downloads 174
13797 Identification and Understanding of Colloidal Destabilization Mechanisms in Geothermal Processes
Authors: Ines Raies, Eric Kohler, Marc Fleury, Béatrice Ledésert
Abstract:
In this work, the impact of clay minerals on the formation damage of sandstone reservoirs is studied to provide a better understanding of deep geothermal reservoir permeability reduction due to fine particle dispersion and migration. In some situations, despite the presence of filters in the geothermal loop at the surface, particles smaller than the filter size (<1 µm) may surprisingly generate significant permeability reduction, affecting in the long term the overall performance of the geothermal system. Our study is carried out on cores from a Triassic reservoir in the Paris Basin (Feigneux, 60 km northeast of Paris). To first identify the clays responsible for clogging, a mineralogical characterization of these natural samples was carried out by coupling X-Ray Diffraction (XRD), Scanning Electron Microscopy (SEM), and Energy Dispersive X-ray Spectroscopy (EDS). The results show that the studied stratigraphic interval contains mostly illite and chlorite particles. Moreover, the spatial arrangement of the clays in the rocks, as well as the morphology and size of the particles, suggests that illite is more easily mobilized than chlorite by the flow in the pore network. Based on these results, illite particles were prepared and used in core flooding in order to better understand the factors leading to the aggregation and deposition of this type of clay particle in geothermal reservoirs under various physicochemical and hydrodynamic conditions. First, the stability of illite suspensions under geothermal conditions was investigated using different characterization techniques, including Dynamic Light Scattering (DLS) and Scanning Transmission Electron Microscopy (STEM). Various parameters such as the hydrodynamic radius (around 100 nm) and the morphology and surface area of aggregates were measured. Then, core-flooding experiments were carried out using sand columns to mimic the permeability decline due to the injection of illite-containing fluids in sandstone reservoirs. In particular, the effects of ionic strength, temperature, particle concentration, and flow rate of the injected fluid were investigated. When the ionic strength increases, a permeability decline by more than a factor of 2 was observed for pore velocities representative of in-situ conditions. Further details of the retention of particles in the columns were obtained from Magnetic Resonance Imaging and X-ray Tomography, showing that particle deposition is nonuniform along the column. It is clearly shown that very fine particles as small as 100 nm can generate significant permeability reduction under specific conditions in high-permeability porous media representative of the Triassic reservoirs of the Paris Basin. These retention mechanisms are explained in the general framework of the DLVO theory.
Keywords: geothermal energy, reinjection, clays, colloids, retention, porosity, permeability decline, clogging, characterization, XRD, SEM-EDS, STEM, DLS, NMR, core flooding experiments
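For reference, a commonly quoted sphere-plate form of the DLVO interaction energy, relevant to a colloidal particle approaching a pore wall, under the usual Derjaguin and linearized double-layer assumptions with equal surface potentials, is:

\[ V_T(h) = -\frac{A\,a}{6h} + 2\pi \varepsilon_0 \varepsilon_r\, a\, \psi_0^2 \ln\!\left(1 + e^{-\kappa h}\right), \]

where A is the Hamaker constant, a the particle radius, h the surface separation, ψ₀ the surface potential, and κ⁻¹ the Debye length. Raising the ionic strength increases κ, compressing the double layer and lowering the repulsive barrier, which is consistent with the enhanced aggregation and deposition observed in the columns at higher ionic strength.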
Procedia PDF Downloads 176
13796 Framework for Performance Measure of Super Resolution Imaging
Authors: Varsha Hemant Patil, Swati A. Bhavsar, Abolee H. Patil
Abstract:
Image quality assessment plays an important role in image evaluation. This paper presents an investigation of the classic techniques in use for image quality assessment, especially for super-resolution imaging. Researchers have contributed substantially to the development of super-resolution imaging techniques; however, comparatively little attention has been paid to developing metrics for testing the performance of these techniques. This paper surveys existing image quality measures and classifies the reviewed approaches according to their functionality and suitability for super-resolution imaging. Probable modifications and improvements to suit super-resolution imaging are presented. The prime goal of the paper is to provide a comprehensive reference source for researchers working on super-resolution imaging and to suggest a better framework for measuring the performance of super-resolution imaging techniques.
Keywords: interpolation, MSE, PSNR, SSIM, super resolution
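Two of the metrics named in the keywords have short, standard definitions; the Python sketch below computes them on hypothetical 8-bit images (SSIM is more involved and is typically taken from a library such as scikit-image).

```python
import numpy as np

def mse(ref: np.ndarray, test: np.ndarray) -> float:
    """Mean squared error between a reference and a reconstructed image."""
    return float(np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2))

def psnr(ref: np.ndarray, test: np.ndarray, max_val: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB: 10*log10(MAX^2 / MSE)."""
    err = mse(ref, test)
    return float("inf") if err == 0 else 10.0 * np.log10(max_val**2 / err)

rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)   # stand-in ground truth
test = np.clip(ref + rng.normal(0, 5, ref.shape), 0, 255)   # stand-in SR output
print(f"MSE = {mse(ref, test):.2f}, PSNR = {psnr(ref, test):.2f} dB")
```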
Procedia PDF Downloads 98
13795 Optimal Portfolio of Multi-service Provision based on Stochastic Model Predictive Control
Authors: Yifu Ding, Vijay Avinash, Malcolm McCulloch
Abstract:
With the proliferation of decentralized energy systems, the UK power system allows small-scale entities such as microgrids (MGs) to tender multiple energy services, including energy arbitrage and frequency responses (FRs). However, an MG's operation requires balancing uncertain renewable generation and loads in real time, and it has to fulfill its contracted service provision continuously during the agreed time window; otherwise, it is penalized for under-delivered provision. To hedge against the risks due to uncertainties and maximize the economic benefits, we propose a stochastic model predictive control (SMPC) framework to optimize MG operation for multi-service provision. Distinguished from previous works, we include a detailed economic-degradation model of the lithium-ion battery to quantify the costs of different service provisions and to accurately describe the changing dynamics of the battery. Considering a set of load and generation scenarios and battery aging, we formulate a risk-averse cost function using conditional value at risk (CVaR), which aims to achieve the maximum expected net revenue while avoiding severe losses. The framework will be applied to a case study of a PV-battery grid-tied microgrid in the UK with real-life data. To highlight its performance, the framework will be compared with the case without the degradation model and with a deterministic formulation.
Keywords: model predictive control (MPC), battery degradation, frequency response, microgrids
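For readers unfamiliar with the risk measure, the Rockafellar-Uryasev form of CVaR for a scenario cost C at confidence level α, and a typical risk-averse objective blending expectation and CVaR, are as follows (the weighting λ is generic; the abstract does not state the exact formulation used):

\[ \mathrm{CVaR}_{\alpha}(C) = \min_{\zeta}\left\{ \zeta + \frac{1}{1-\alpha}\,\mathbb{E}\big[(C-\zeta)_{+}\big] \right\}, \qquad J = (1-\lambda)\,\mathbb{E}[C] + \lambda\,\mathrm{CVaR}_{\alpha}(C), \quad \lambda \in [0,1]. \]

Over a finite scenario set, the expectation becomes a probability-weighted sum and the minimization adds one auxiliary variable ζ plus one linear constraint per scenario, which keeps the SMPC problem tractable.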
Procedia PDF Downloads 122
13794 Mobile Agents-Based Framework for Dynamic Resource Allocation in Cloud Computing
Authors: Safia Rabaaoui, Héla Hachicha, Ezzeddine Zagrouba
Abstract:
Nowadays, cloud computing is becoming an increasingly popular technology for companies and consumers, who benefit from its efficiency, cost optimization, data security, unlimited storage capacity, etc. One of the biggest challenges of cloud computing is resource allocation, whose efficiency directly influences the performance of the whole cloud environment; an effective method to address this critical issue and increase cloud performance is therefore needed. This paper proposes a mobile agents-based framework for dynamic resource allocation in cloud computing that minimizes both the cost of using virtual machines and the makespan. Furthermore, its impact on the best response time and power consumption has been studied. The simulation showed that our method gave better results than existing approaches.
Keywords: cloud computing, multi-agent system, mobile agent, dynamic resource allocation, cost, makespan
Procedia PDF Downloads 102
13793 Competitive Advantage Challenges in the Apparel Manufacturing Industries of South Africa: Application of Porter’s Factor Conditions
Authors: Sipho Mbatha, Anne Mastament-Mason
Abstract:
South African manufacturing global competitiveness was ranked 22nd (out of 38 countries), dropped to 24th in 2013, and is expected to drop further to 25th by 2018. This impacts negatively on the industrialisation project of South Africa. For industrialisation to be achieved through labour-intensive industries like the Apparel Manufacturing Industries of South Africa (AMISA), South Africa needs to identify and respond to factors negatively impacting the development of competitive advantage. This paper applied factor conditions from Porter's Diamond Model (1990) to understand the various challenges facing the AMISA. Factor conditions highlighted in Porter's model are grouped into two categories, namely basic and advanced factors. Two AMISA associations representing over 10 000 employees were interviewed, along with the largest Clothing, Textiles and Leather (CTL) apparel retail group and a government department implementing the industrialisation policy. The paper points out that while the AMISA have the basic factor conditions necessary for competitive advantage in the clothing and textiles industries, advanced factor coordination has proven to be a challenging task for the AMISA, Higher Education Institutions (HEIs), and government. Poor infrastructural maintenance has contributed to high manufacturing costs and poor quick response as a result of a lack of advanced technologies. The use of Porter's factor conditions as a tool to analyse the sector's competitive advantage challenges and opportunities has increased knowledge regarding the factors that limit the AMISA's competitiveness. It is therefore argued that studies of Porter's other diamond factors, namely demand conditions; firm strategy, structure, and rivalry; and related and supporting industries, can be used to analyse the situation of the AMISA for the purposes of improving competitive advantage.
Keywords: compliance rule, apparel manufacturing industry, factor conditions, advance skills and South African industrial policy
Procedia PDF Downloads 362
13792 A Goal-Driven Crime Scripting Framework
Authors: Hashem Dehghanniri
Abstract:
Crime scripting is a simple and effective crime modeling technique that aims to improve security analysts' understanding of security and crime incidents. Low-quality scripts provide a wrong, incomplete, or sophisticated understanding of the crime commission process, which defeats the purpose of their application, e.g., identifying effective and cost-efficient situational crime prevention (SCP) measures. One important and overlooked factor in generating quality scripts is the crime scripting method. This study investigates the problems within existing crime scripting practices and proposes a crime scripting approach that contributes to generating quality crime scripts; it was validated by experienced crime scripters. This framework helps analysts develop better crime scripts and contributes to their effective application, e.g., identifying SCP measures or policy-making.
Keywords: attack modelling, crime commission process, crime script, situational crime prevention
Procedia PDF Downloads 126
13791 Identifying Organizational Culture to Implement Knowledge Management: Case Study of BKN, Indonesia
Authors: Maria Margaretha, Elin Cahyaningsih, Dana Indra Sensuse Lukman
Abstract:
One key to an organization's success can be seen in its culture. Employees, environment, and other factors enable an organization to achieve its goals and build a competitive advantage. The type of organizational culture can be a guide to implementing Knowledge Management (KM) in an organization, especially in BKN, because culture determines the behavior of employees and of the environment that supports KM. This paper describes the process of deciding which culture the organization belongs to and of formulating suggestions and strategic moves for implementing KM in the future. The OCAI (Organizational Culture Assessment Instrument) and its underlying Competing Values Framework were used to determine the type of organizational culture. To implement KM in the organization, clan is the appropriate culture, because clan culture represents the cultural values and leadership type needed to implement KM successfully. The measurement results will serve as a reference for BKN to improve its organizational culture so as to achieve its goals and organizational effectiveness.
Keywords: organizational culture, government, knowledge management, OCAI
Procedia PDF Downloads 621
13790 Design of Traffic Counting Android Application with Database Management System and Its Comparative Analysis with Traditional Counting Methods
Authors: Muhammad Nouman, Fahad Tiwana, Muhammad Irfan, Mohsin Tiwana
Abstract:
Traffic congestion has been increasing significantly in major metropolitan areas as a result of increased motorization, urbanization, population growth, and changes in urban density. Traffic congestion compromises the efficiency of transport infrastructure and causes multiple traffic concerns, including but not limited to increased travel time, safety hazards, air pollution, and fuel consumption. Traffic management has become a serious challenge for federal and provincial governments, as well as for exasperated commuters. Effective, flexible, efficient, and user-friendly traffic information/database management systems characterize traffic conditions by making use of traffic counts for storage, processing, and visualization. While emerging data collection technologies continue to proliferate, their accuracy can be guaranteed through comparison of observed data with manual handheld counts. This paper presents the design of a tablet-based manual traffic counting application and a framework for the development of a traffic database management system for Pakistan. The database management system comprises three components: a traffic counting Android application, an online database, and its visualization using Google Maps. An Oracle relational database was chosen to develop the data structure, whereas Structured Query Language (SQL) was adopted to program the system architecture. The GIS application links the data from the database and projects it onto a dynamic map for the visualization of traffic conditions. The traffic counting device and the example of a database application on a real-world problem provide a creative outlet to visualize the uses and advantages of a database management system in real time. Traffic counts collected by means of the handheld tablet/mobile application can also be used for transportation planning and forecasting.
Keywords: manual count, emerging data sources, traffic information quality, traffic surveillance, traffic counting device, android, data visualization, traffic management
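The paper's Oracle schema is not reproduced in the abstract; the sketch below uses a hypothetical table layout (and SQLite, so the example is self-contained) to illustrate the kind of count storage and aggregation such a system performs.

```python
import sqlite3

# Hypothetical schema for illustration; the study's actual Oracle schema is not given.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE traffic_count (
        count_id     INTEGER PRIMARY KEY,
        station_id   TEXT NOT NULL,      -- survey location
        recorded_at  TEXT NOT NULL,      -- ISO-8601 timestamp
        direction    TEXT NOT NULL,      -- e.g. 'NB', 'SB'
        vehicle_type TEXT NOT NULL,      -- car, bus, truck, motorcycle...
        volume       INTEGER NOT NULL,   -- vehicles counted in the interval
        lat          REAL, lon REAL      -- coordinates for the map layer
    )
""")
conn.execute(
    "INSERT INTO traffic_count VALUES "
    "(1, 'LHR-01', '2024-05-01T08:15', 'NB', 'car', 57, 31.5204, 74.3587)"
)

# Hourly volumes per station: the kind of aggregate a GIS layer would plot.
for row in conn.execute("""
        SELECT station_id, substr(recorded_at, 1, 13) AS hour, SUM(volume)
        FROM traffic_count GROUP BY station_id, hour"""):
    print(row)
```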
Procedia PDF Downloads 193
13789 MB-SLAM: A SLAM Framework for Construction Monitoring
Authors: Mojtaba Noghabaei, Khashayar Asadi, Kevin Han
Abstract:
Simultaneous Localization and Mapping (SLAM) technology has recently attracted the attention of construction companies for real-time performance monitoring. To effectively use SLAM for construction performance monitoring, SLAM results should be registered to a Building Information Model (BIM). Registering SLAM to BIM can provide essential insights for construction managers to identify construction deficiencies in real time and ultimately reduce rework. Registering SLAM to BIM in real time can also boost the accuracy of SLAM, since SLAM can then use features from both images and 3D models. However, registering SLAM with BIM in real time is a challenge. In this study, a novel SLAM platform named Model-Based SLAM (MB-SLAM) is proposed, which not only provides automated registration of SLAM and BIM but also improves the localization accuracy of the SLAM system in real time. The framework improves the accuracy of SLAM by aligning perspective features such as depth, vanishing points, and vanishing lines from the BIM to the SLAM system. It extracts depth features from a monocular camera's image and improves the localization accuracy of the SLAM system through a real-time iterative process. Initially, SLAM is used to calculate a rough camera pose for each keyframe. In the next step, each keyframe of the SLAM video sequence is registered to the BIM in real time by aligning the keyframe's perspective with the equivalent BIM view. The alignment method is based on perspective detection, which estimates vanishing lines and points by detecting straight edges in images. This process generates the associated BIM views from the keyframes' views. The calculated poses are later improved by a real-time gradient descent-based iterative method. Two case studies were presented to validate MB-SLAM. The validation process demonstrated promising results: SLAM was accurately registered to BIM, and the SLAM's localization accuracy was significantly improved. Moreover, MB-SLAM achieved real-time performance in both indoor and outdoor environments. The proposed method can fully automate past studies and generate as-built models that are aligned with BIM. The main contribution of this study is a SLAM framework, for both research and commercial usage, that aims to monitor construction progress and performance in a unified way. Through this platform, users can improve the accuracy of SLAM by providing a rough 3D model of the environment. MB-SLAM thereby advances SLAM toward practical usage.
Keywords: perspective alignment, progress monitoring, slam, stereo matching
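The perspective-detection step rests on a standard bit of projective geometry: in homogeneous coordinates, the line through two image points and the intersection of two lines are both cross products. A minimal sketch (not the authors' implementation) of estimating a vanishing point from two straight edges:

```python
import numpy as np

def line_through(p, q):
    """Homogeneous line through two image points (cross product)."""
    return np.cross([p[0], p[1], 1.0], [q[0], q[1], 1.0])

def vanishing_point(seg1, seg2):
    """Intersection of two segments' supporting lines, back in pixel coords."""
    l1 = line_through(*seg1)
    l2 = line_through(*seg2)
    v = np.cross(l1, l2)          # homogeneous intersection point
    return v[:2] / v[2]           # v[2] == 0 would mean parallel image lines

# Two (nearly) parallel building edges as seen under perspective:
vp = vanishing_point(((100, 400), (300, 300)), ((120, 600), (340, 520)))
print(vp)  # their common vanishing point in the image
```

In a full pipeline, many edge segments would vote for vanishing points (e.g., via RANSAC), and the recovered directions would then be matched against the BIM view to refine the camera pose.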
Procedia PDF Downloads 224
13788 How to Enhance Performance of Universities by Implementing Balanced Scorecard with Using FDM and ANP
Authors: Neda Jalaliyoon, Nooh Abu Bakar, Hamed Taherdoost
Abstract:
The present research recommends a balanced scorecard (BSC) framework to appraise the performance of universities. As the original balanced scorecard model has four perspectives, the same model, with the “financial”, “customer”, “internal process”, and “learning and growth” perspectives, is used in the present research as well. By applying the fuzzy Delphi method (FDM) and a questionnaire, sixteen performance measures were identified. Moreover, using the analytic network process (ANP), the weights of the selected indicators were determined. Results indicated that the most important BSC aspects were Internal Process (0.3149), Customer (0.2769), Learning and Growth (0.2049), and Financial (0.2033), respectively. The proposed BSC framework can help universities to enhance their efficiency in a competitive environment.
Keywords: balanced scorecard, higher education, fuzzy delphi method, analytic network process (ANP)
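The ANP weights sum to one, so an overall performance index would follow by the usual additive aggregation. The formula below is a standard illustration, assuming each perspective is scored on a normalized scale s_i; it is not stated in the abstract:

\[ P = 0.3149\, s_{\text{IP}} + 0.2769\, s_{\text{C}} + 0.2049\, s_{\text{LG}} + 0.2033\, s_{\text{F}}, \qquad \sum_i w_i = 1. \]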
Procedia PDF Downloads 426
13787 Removal of Aggregates of Monoclonal Antibodies by Ion Exchange Chromatography
Authors: Ishan Arora, Anurag Rathore
Abstract:
The primary objective of this work was to study the effect of resin chemistry and of the pH and molarity of the binding and elution buffers on aggregate removal using cation exchange chromatography, and to find the optimum conditions that give efficient aggregate removal with minimum loss of yield. Four different resins were used for carrying out the experiments: Fractogel EMD SO3-(S), Fractogel EMD COO-(M), Capto SP ImpRes, and S Ceramic HyperD. Runs were carried out on the AKTA Avant system, and Design of Experiments (DOE) was used for the analysis with the JMP software. The dependence of the yield obtained with the different resins on the operating conditions was studied. Yields greater than 90% were achieved using the Capto SP ImpRes and Fractogel EMD COO-(M) resins. It was also found that a change in the operating conditions generally has different effects on the yields obtained with different resins.
Keywords: aggregates, cation exchange chromatography, design of experiments, monoclonal antibodies
Procedia PDF Downloads 268
13786 Determinants of Mobile Payment Adoption among Retailers in Ghana
Authors: Ibrahim Masud, Yusheng Kong, Adam Diyawu Rahman
Abstract:
Mobile payment, variously referred to as mobile money, mobile money transfer, and mobile wallet, refers to payment services operated under financial regulation and performed from or via a mobile device. Mobile payment systems have come to augment, and to some extent replace, conventional payment methods such as cash, cheques, and credit cards. This study examines mobile payment adoption factors among retailers in Ghana. A conceptual framework was adopted from the extant literature using the Technology Acceptance Model and the Theory of Reasoned Action as the theoretical bases. Data for the study were obtained from a sample of 240 respondents through a structured questionnaire. PLS-SEM was used to analyze the data through SPSS v.22 and SmartPLS v.3. The findings indicate that perceived usefulness, perceived ease of use, perceived security, competitive pressure, and facilitating conditions are the main determinants of mobile payment adoption among retailers in Ghana. The study contributes to the literature on mobile payment adoption from a developing-country context.
Keywords: mobile payment, retailers, structural equation modeling, technology acceptance model
Procedia PDF Downloads 178
13785 Comparison of the Performance of Diesel Engine, Run with Diesel and Safflower Oil Methyl Esters, Using a Piston Which Has Five Grooves on Its Crown
Authors: N. Hiranmai, M. L. S. Deva Kumar
Abstract:
In this project, an experimental investigation is carried out on a four-stroke direct injection diesel engine: a single-cylinder, water-cooled, constant-speed engine capable of developing a power output of 3.7 kW at 1500 rpm, run with diesel fuel and with different proportions of safflower oil methyl esters, using a piston that has five grooves on its crown to create turbulence. Various performance parameters, such as brake power, specific fuel consumption, and thermal efficiency, are calculated. At all load conditions, the best engine performance is obtained for blend B40 (40% safflower oil + 60% diesel). At different load conditions, brake thermal efficiency (η_bth) is comparatively higher for all blends than for diesel, while indicated thermal efficiency (η_ith) is lower for blend B40.
Keywords: four-stroke engine, diesel, safflower oil, engine performance, emissions
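For reference, the standard definitions used to compute these performance parameters, where BP is brake power, ṁ_f the fuel mass flow rate, and CV the calorific value of the fuel, are:

\[ \eta_{bth} = \frac{BP}{\dot{m}_f \, CV}, \qquad BSFC = \frac{\dot{m}_f}{BP}, \]

so a blend's lower calorific value can raise brake-specific fuel consumption even while brake thermal efficiency improves.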
Procedia PDF Downloads 98
13784 NanoSat MO Framework: Simulating a Constellation of Satellites with Docker Containers
Authors: César Coelho, Nikolai Wiegand
Abstract:
The advancement of nanosatellite technology has opened new avenues for cost-effective and faster space missions. The NanoSat MO Framework (NMF) from the European Space Agency (ESA) provides a modular and simpler approach to the development of flight software and the operation of small satellites. This paper presents a methodology that uses the NMF together with Docker for simulating constellations of satellites. By leveraging Docker containers, the software environment of individual satellites can be easily replicated within a simulated constellation. This containerized approach allows for rapid deployment, isolation, and management of satellite instances, facilitating comprehensive testing and development in a controlled setting. By integrating the NMF lightweight simulator into the container, a comprehensive simulation environment is achieved. A significant advantage of using Docker containers is their inherent scalability, enabling the simulation of hundreds or even thousands of satellites with minimal overhead. Docker's lightweight nature ensures efficient resource utilization, allowing for deployment on a single host or across a cluster of hosts. This capability is crucial for large-scale simulations, such as mega-constellations, where multiple traditional virtual machines would be impractical due to their higher resource demands. This easy horizontal scaling with the number of simulated satellites provides tremendous flexibility for different mission scenarios. Our results demonstrate that leveraging Docker containers with the NanoSat MO Framework provides a highly efficient and scalable solution for simulating satellite constellations, offering significant benefits in terms of resource utilization and operational flexibility, and enabling the testing and validation of ground software for constellations. The findings underscore the importance of taking advantage of existing technologies in computer science to create new solutions for future satellite constellations in space.
Keywords: containerization, docker containers, NanoSat MO framework, satellite constellation simulation, scalability, small satellites
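To make the scaling argument concrete, the sketch below uses the Docker SDK for Python to launch N identical containers, one per simulated spacecraft. The image name is hypothetical; it stands in for a container bundling the NMF flight software and its lightweight simulator.

```python
import docker  # pip install docker

IMAGE = "nmf-satellite:latest"   # hypothetical image name
N_SATS = 100

client = docker.from_env()
nodes = []
for i in range(N_SATS):
    # One lightweight container per simulated spacecraft; each gets its own
    # identity via an environment variable the flight software can read.
    c = client.containers.run(
        IMAGE,
        detach=True,
        name=f"sat-{i:03d}",
        environment={"SATELLITE_ID": f"SAT-{i:03d}"},
        labels={"constellation": "demo"},
    )
    nodes.append(c)

print(f"{len(nodes)} simulated satellites running")

# Teardown: stop and remove the whole constellation by label.
for c in client.containers.list(filters={"label": "constellation=demo"}):
    c.remove(force=True)
```

The same labels let an orchestrator (or a plain loop like the one above) treat the constellation as a unit, which is what makes horizontal scaling to hundreds of nodes a one-parameter change.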
Procedia PDF Downloads 49
13783 Programming without Code: An Approach and Environment to Conditions-On-Data Programming
Authors: Philippe Larvet
Abstract:
This paper presents the concept of an object-based programming language in which tests (if... then... else) and control structures (while, repeat, for...) disappear and are replaced by conditions on data. Following the object paradigm, data are still embedded inside objects as variable-value couples, but object methods are expressed in the form of logical propositions (‘conditions on data’, or CODs). For instance: variable1 = value1 AND variable2 > value2 => variable3 = value3. Implementing this approach, a central inference engine runs and examines objects one after another, collecting all the CODs of each object. CODs are treated as rules in a rule-based system: the left part of each proposition (before the ‘=>‘ sign) is the premise, and the right part is the conclusion. Premises are evaluated and conclusions are fired; conclusions modify the variable-value couples of the object, and the engine moves on to examine the next object. The paper develops the principles of writing CODs instead of complex algorithms. Through samples, it also presents several hints for implementing a simple mechanism able to process this ‘COD language’. The proposed approach can be used in the context of simulation, process control, industrial systems validation, etc. By writing simple and rigorous conditions on data, instead of using classical and long-to-learn languages, engineers and specialists can easily simulate and validate the functioning of complex systems.
Keywords: conditions on data, logical proposition, programming without code, object-oriented programming, system simulation, system validation
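A minimal sketch of such a COD interpreter in Python (an illustration, not the paper's environment): objects hold variable-value couples, each COD is a premise-conclusion pair, and the engine sweeps the objects until no conclusion changes any data.

```python
from typing import Callable

Premise = Callable[[dict], bool]
Conclusion = Callable[[dict], None]

class CodObject:
    """An object holding variable-value couples plus its conditions on data."""
    def __init__(self, data: dict, cods: list):
        self.data = data    # variable-value couples
        self.cods = cods    # list of (premise, conclusion) pairs

def inference_engine(objects: list, max_passes: int = 100) -> None:
    """Examine objects one after another, firing every COD whose premise
    holds, until a fixed point is reached (no conclusion changes any data)."""
    for _ in range(max_passes):
        changed = False
        for obj in objects:
            for premise, conclusion in obj.cods:
                if premise(obj.data):
                    before = dict(obj.data)
                    conclusion(obj.data)
                    changed = changed or obj.data != before
        if not changed:
            return

# COD: tank = "open" AND level > 90  =>  valve = "closed"
tank = CodObject(
    {"tank": "open", "level": 95, "valve": "open"},
    [(lambda d: d["tank"] == "open" and d["level"] > 90,
      lambda d: d.update(valve="closed"))],
)
inference_engine([tank])
print(tank.data)   # {'tank': 'open', 'level': 95, 'valve': 'closed'}
```

The fixed-point loop replaces explicit control flow: ordering emerges from which premises happen to hold, which is exactly the trade the paper proposes.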
Procedia PDF Downloads 221
13782 Measuring Entrepreneurial Success through Specific Sustainable Development Goals by Linking Entrepreneurship Attitude and Intentions
Authors: Mohit Taneja, Ravi Kiran, S. C. Bose
Abstract:
Entrepreneurs' role in achieving sustainable development goals is crucial, as the growth potential of any region depends upon the number and success rate of its entrepreneurial firms. This paper examines the relationship of sustainable growth (SG) with entrepreneurial attitude (EA) and entrepreneurial intention (EI) in the context of the Indian economy, considering the mediating effect of EI between EA and SG. Partial least squares structural equation modeling (PLS-SEM) software was used to design the framework. Students enrolled in entrepreneurship courses of higher educational institutes (HEIs) of Punjab, Haryana, and the National Capital Region (NCR) were contacted for data collection. The National Institutional Ranking Framework (NIRF) was used in selecting HEIs, and data collected from 589 students were considered for the analysis. McGee's multi-dimensional scale for measuring ESE and the scale of Linan & Chen for measuring EI and ES (SG) were used. Results highlight that EA has a strong impact on EI (p ≤ 0.001) and that EI has a strong positive relationship with SG (ES), with a β value of 0.683 (p ≤ 0.001). The current study also reflects the mediating effect of EI between EA and ES: the combined β value of EA and EI (i.e., 0.684 × 0.683 = 0.467) exceeds the direct influence of EA on ES (β = 0.265). EA, with the mediating effect of EI, can enhance the opportunity for achieving SG, which suggests that in order to increase the venture success rate and attain SG, emphasis should be given to EI along with EA. The study covered three regions of India; future studies can be extended to other South Asian countries for generalization.
Keywords: entrepreneurship, sustainable growth, entrepreneurship intention, entrepreneurship attitude
Procedia PDF Downloads 94
13781 Removal of Aggregates of Monoclonal Antibodies by Ion Exchange Chromatography
Authors: Ishan Arora, Anurag Rathore
Abstract:
The primary objective of this work was to study the effect of resin chemistry and of the pH and molarity of the binding and elution buffers on aggregate removal using cation exchange chromatography, and to find the optimum conditions that give efficient aggregate removal with minimum loss of yield. Four different resins were used for carrying out the experiments: Fractogel EMD SO3-(S), Fractogel EMD COO-(M), Capto SP ImpRes, and S Ceramic HyperD. Runs were carried out on the AKTA Avant system, and Design of Experiments (DOE) was used for the analysis with the JMP software. The dependence of the yield obtained with the different resins on the operating conditions was studied. Yields greater than 90% were achieved using the Capto SP ImpRes and Fractogel EMD COO-(M) resins. It was also found that a change in the operating conditions generally has different effects on the yields obtained with different resins.
Keywords: aggregates, cation exchange chromatography, design of experiments, monoclonal antibodies
Procedia PDF Downloads 259
13780 Sorghum Resilience and Sustainability under Limiting and Non-limiting Conditions of Water and Nitrogen
Authors: Muhammad Tanveer Altaf, Mehmet Bedir, Waqas Liaqat, Gönül Cömertpay, Volkan Çatalkaya, Celaluddin Barutçular, Nergiz Çoban, Ibrahim Cerit, Muhammad Azhar Nadeem, Tolga Karaköy, Faheem Shehzad Baloch
Abstract:
Food production needs to almost double by 2050 in order to feed around 9 billion people around the globe. Plant production relies heavily on fertilizers, which also play a major role in environmental pollution. In addition, climatic conditions are unpredictable, and the earth is expected to face severe drought conditions in the future. Therefore, water and fertilizers, especially nitrogen, are considered the main constraints on future food security. To face these challenges, developing integrative approaches for germplasm characterization and selecting resilient genotypes that perform under limiting conditions is crucial for effective breeding to meet food requirements under climate change scenarios. This study is part of a European Research Area Network (ERANET) project for the characterization of a diversity panel of 172 sorghum accessions and six hybrids as control cultivars under limiting (+N/-H2O, -N/+H2O) and non-limiting (+N/+H2O) conditions. The study was planned to characterize sorghum diversity in relation to resource use efficiency (RUE), with special attention to harnessing the interaction between genotype and environment (GxE) from physiological and agronomic perspectives. Experiments were conducted at Adana, under a Mediterranean climate, in an augmented design, and data on various agronomic and physiological parameters were recorded. Plentiful diversity was observed in the sorghum panel, and significant variations were seen between the limiting water and nitrogen conditions and the control experiment. Potential genotypes with the best performance under limiting conditions were identified. Whole genome resequencing was performed on the entire germplasm under investigation for diversity analysis. GWAS analysis will be performed using the genotypic and phenotypic data, and linked markers will be identified. The results of this study will inform the adaptation and improvement of sorghum under climate change conditions for future food security.
Keywords: germplasm, sorghum, drought, nitrogen, resources use efficiency, sequencing
Procedia PDF Downloads 77
13779 Radical Web Text Classification Using a Composite-Based Approach
Authors: Kolade Olawande Owoeye, George R. S. Weir
Abstract:
The spread of terrorist and extremist activity on the internet has become a major threat to governments and national security; its potential dangers have necessitated intelligence gathering via the web and real-time monitoring of potential websites for extremist activity. However, manual classification of such content is difficult and time-consuming. In response to this challenge, an automated classification system called the composite technique was developed: a computational framework that combines both semantic and syntactic features of the textual content of web pages. We implemented the framework on a dataset of extremist webpages that had been subjected to a manual classification process. We then developed a classification model on the data using the J48 decision tree algorithm to generate a measure of how well each page can be classified into its appropriate class. The classification results obtained from our method, when compared with the state of the art, indicated a 96% success rate in classifying webpages when matched against the manual classification.
Keywords: extremist, web pages, classification, semantics, posit
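As a schematic of the pipeline (not the study's setup: J48 is Weka's C4.5 implementation, while scikit-learn's decision tree is CART, used here as an approximate stand-in), the sketch below classifies toy page texts from lexical features; in the composite approach, syntactic (posit) features would be concatenated alongside.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

# Toy stand-in corpus; the study's webpage dataset is not reproduced here.
pages = [
    "join the struggle martyrdom glory enemies",
    "community gardening tips for spring vegetables",
    "call to arms against the unbelievers now",
    "local football club announces youth trials",
]
labels = ["extremist", "benign", "extremist", "benign"]

# Lexical (TF-IDF) features feeding a decision tree; syntactic features
# would be appended to the feature matrix in the composite technique.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    DecisionTreeClassifier(max_depth=8, random_state=0),
)
model.fit(pages, labels)
print(model.predict(["gardening club youth trials"]))   # expected: ['benign']
```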
Procedia PDF Downloads 145
13778 Governance Framework for an Emerging Trust Ecosystem with a Blockchain-Based Supply Chain
Authors: Ismael Ávila, José Reynaldo F. Filho, Vasco Varanda Picchi
Abstract:
The ever-growing consumer awareness of food provenance in Brazil is driving the creation of a trusted ecosystem around the animal protein supply chain. The traceability and accountability requirements of such an ecosystem demand a blockchain layer to strengthen the weak links in that chain. For that, direct involvement of the companies in the blockchain transactions, including as validator nodes of the network, implies formalizing a partnership with the consortium behind the ecosystem. Yet, their compliance standards usually require that a formal governance structure be in place before they agree to any membership terms. In light of this strategic role of blockchain governance, the paper discusses a framework for tailoring a governance model for a blockchain-based solution aimed at the meat supply chain and evaluates principles and attributes in terms of their relevance to the development of a robust trust ecosystem.
Keywords: blockchain, governance, trust ecosystem, supply chain, traceability
Procedia PDF Downloads 119
13777 New Approaches to the Determination of the Time Costs of Movements
Authors: Dana Kristalova
Abstract:
This article deals with geographical conditions in terrain and their effect on the movement of vehicles, that is, their effect on the speed and safety of movement of people and vehicles. Finding optimal routes off established roads is studied in the army environment, but it occurs in civilian settings as well, primarily in crisis situations or in the provision of assistance after natural disasters such as floods, fires, and storms. These movements require the optimization of routes in which the effects of geographical factors are included. The most important factor is the surface of the terrain, which depends on several geographical factors such as slope, soil conditions, micro-relief, type of surface, and meteorological conditions. Their mutual impact is expressed by a coefficient of deceleration, which can support a commander's decision. New approaches and methods of terrain testing, mathematical computing, mathematical statistics, and cartometric investigation are necessary parts of this evaluation.
Keywords: surface of a terrain, movement of vehicles, geographical factor, optimization of routes
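One common way such deceleration coefficients enter a time-cost computation is multiplicatively per route segment (an illustrative formulation; the article's exact expression is not reproduced in the abstract):

\[ v_{\text{eff},k} = v_{\max} \prod_{i} c_{i,k}, \qquad t = \sum_{k} \frac{s_k}{v_{\text{eff},k}}, \]

where c_{i,k} ∈ (0, 1] is the deceleration coefficient of geographical factor i (slope, soil, micro-relief, surface type, weather) on segment k, s_k is the segment length, and t is the total time cost that a route search would minimize.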
Procedia PDF Downloads 462
13776 Mastering Test Automation: Bridging Gaps for Seamless QA
Authors: Rohit Khankhoje
Abstract:
The rapid evolution of software development practices has given rise to an increasing demand for efficient and effective test automation. The paper titled "Mastering Test Automation: Bridging Gaps for Seamless QA" delves into the crucial aspects of test automation, addressing the obstacles faced by organizations in achieving flawless quality assurance. The paper highlights the importance of bridging knowledge gaps within organizations, emphasizing the necessity for management to acquire a deeper comprehension of test automation scenarios, coverage, report trends, and the importance of communication. To tackle these challenges, this paper introduces innovative solutions, including the development of an automation framework that seamlessly integrates with test cases and reporting tools such as TestRail and Jira. This integration facilitates the automatic recording of bugs in Jira, improving bug reporting and communication between manual QA and automation teams; likewise, TestRail receives all newly added automated test cases as soon as they become part of the automation suite. The paper demonstrates how this framework empowers management by providing clear insights into ongoing automation activities, bug origins, trend analysis, and test case specifics. "Mastering Test Automation" serves as a comprehensive guide for organizations aiming to enhance their quality assurance processes through effective test automation. It not only identifies the common pitfalls and challenges but also offers practical solutions to bridge the gaps, resulting in a more streamlined and efficient QA process.
Keywords: automation framework, API integration, test automation, test management tools
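A minimal sketch of the Jira side of such an integration, using Jira's documented REST endpoint for issue creation; the URL, credentials, and project key below are placeholders, and a real framework would call this from its test-failure hook.

```python
import requests

JIRA_URL = "https://example.atlassian.net"   # placeholder instance
AUTH = ("bot@example.com", "api-token")      # placeholder credentials

def record_failure_in_jira(test_name: str, error: str) -> str:
    """File a bug for a failed automated test via Jira's REST API."""
    payload = {
        "fields": {
            "project": {"key": "QA"},        # placeholder project key
            "summary": f"Automated test failed: {test_name}",
            "description": error,
            "issuetype": {"name": "Bug"},
            "labels": ["test-automation"],
        }
    }
    resp = requests.post(
        f"{JIRA_URL}/rest/api/2/issue", json=payload, auth=AUTH, timeout=30
    )
    resp.raise_for_status()
    return resp.json()["key"]                # e.g. "QA-123"

# A failure handler in the framework would call something like:
# bug_key = record_failure_in_jira("test_login", traceback_text)
```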
Procedia PDF Downloads 73
13775 Developing an Out-of-Distribution Generalization Model Selection Framework through Impurity and Randomness Measurements and a Bias Index
Authors: Todd Zhou, Mikhail Yurochkin
Abstract:
Out-of-distribution (OOD) detection is receiving increasing attention in the machine learning research community, boosted by recent technologies such as autonomous driving and image processing. This burgeoning field calls for more effective and efficient out-of-distribution generalization methods. Without access to label information, deploying machine learning models to out-of-distribution domains becomes extremely challenging, since it is impossible to evaluate model performance on unseen domains. To tackle this difficulty, we designed a model selection pipeline algorithm and developed a model selection framework with different impurity and randomness measurements to evaluate and choose the best-performing models for out-of-distribution data. By exploring different randomness scores based on predicted probabilities, we adopted the out-of-distribution entropy and developed a custom-designed score, "CombinedScore," as the evaluation criterion. This proposed score was created by adding labeled source information into the judging space of the uncertainty entropy score using the harmonic mean. Furthermore, prediction bias was explored through the equality-of-opportunity violation measurement. We also improved machine learning model performance through model calibration. The effectiveness of the framework with the proposed evaluation criteria was validated on the Folktables American Community Survey (ACS) datasets.
Keywords: model selection, domain generalization, model fairness, randomness measurements, bias index
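The sketch below gives one plausible reading of the construction: a harmonic-mean blend of a labeled-source signal with an entropy-based confidence on unlabeled target data. The paper's exact definition of CombinedScore is not reproduced in the abstract, so treat the formula as an assumption.

```python
import numpy as np

def predictive_entropy(probs: np.ndarray) -> float:
    """Mean Shannon entropy of predicted class probabilities (rows sum to 1)."""
    eps = 1e-12
    return float(-(probs * np.log(probs + eps)).sum(axis=1).mean())

def combined_score(ood_entropy: float, source_accuracy: float) -> float:
    """Assumed form of 'CombinedScore': harmonic mean of a labeled-source
    signal and a target-confidence signal (1 - normalized entropy)."""
    confidence = 1.0 - ood_entropy   # assumes entropy normalized to [0, 1]
    return 2 * source_accuracy * confidence / (source_accuracy + confidence + 1e-12)

probs = np.array([[0.7, 0.2, 0.1], [0.4, 0.4, 0.2]])  # toy OOD predictions
h = predictive_entropy(probs) / np.log(3)             # normalize by log(#classes)
print(combined_score(h, source_accuracy=0.85))
```

The harmonic mean penalizes a model that scores well on only one of the two signals, which matches the stated intent of folding source information into the entropy-based judging space.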
Procedia PDF Downloads 124
13774 The Use of Artificial Intelligence in Digital Forensics and Incident Response in a Constrained Environment
Authors: Dipo Dunsin, Mohamed C. Ghanem, Karim Ouazzane
Abstract:
Digital investigators often have a hard time spotting evidence in digital information, and it has become hard to determine which source of proof relates to a specific investigation. A growing concern is that the various processes, technologies, and specific procedures used in digital investigation are not keeping up with criminal developments, and criminals are taking advantage of these weaknesses to commit further crimes. In digital forensics investigations, artificial intelligence (AI) is invaluable in identifying crime: it has been observed that AI-based algorithms are highly effective in detecting risks, preventing criminal activity, and forecasting illegal activity. Providing objective data and conducting an assessment is the goal of digital forensics and digital investigation, which assists in developing a plausible theory that can be presented as evidence in court; researchers and other authorities have used such data as evidence in court to convict offenders. This research paper aims at developing a multiagent framework for digital investigations using specific intelligent software agents (ISAs). The agents communicate to address particular tasks jointly and keep the same objectives in mind during each task; the rules and knowledge contained within each agent depend on the investigation type, and a criminal investigation is classified quickly and efficiently using the case-based reasoning (CBR) technique. The resulting framework, MADIK, is implemented using the Java Agent Development Framework (JADE), together with Eclipse, a Postgres repository, and a rule engine for agent reasoning. The framework was tested using the Lone Wolf image files and datasets, with experiments conducted on various sets of ISAs and VMs. There was a significant reduction in the time taken for the Hash Set Agent to execute; as a result of loading the agents, 5 percent of the time was lost, as the File Path Agent prescribed deleting 1,510 while the Timeline Agent found multiple executable files. In comparison, the integrity check carried out on the Lone Wolf image file using a digital forensic toolkit took approximately 48 minutes (2,880 s), whereas the MADIK framework accomplished this in 16 minutes (960 s). The framework is integrated with Python, allowing for further integration of other digital forensic tools such as AccessData Forensic Toolkit (FTK), Wireshark, Volatility, and Scapy.
Keywords: artificial intelligence, computer science, criminal investigation, digital forensics
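As a schematic of the agent division of labour (an illustration in Python; MADIK itself is built on JADE in Java), two simplified ISAs examine a piece of evidence and pool their findings. The hash set below contains one well-known MD5 digest purely so the demonstration fires.

```python
import hashlib
from dataclasses import dataclass, field

@dataclass
class Evidence:
    path: str
    data: bytes
    findings: list = field(default_factory=list)

class HashSetAgent:
    # Hypothetical known-bad hash set; this entry is md5(b"hello").
    KNOWN_BAD = {"5d41402abc4b2a76b9719d911017c592"}
    def examine(self, ev: Evidence) -> None:
        digest = hashlib.md5(ev.data).hexdigest()
        if digest in self.KNOWN_BAD:
            ev.findings.append(("hash-hit", ev.path, digest))

class FilePathAgent:
    def examine(self, ev: Evidence) -> None:
        if ev.path.endswith((".exe", ".dll")):
            ev.findings.append(("executable", ev.path))

evidence = Evidence("C:/temp/tool.exe", b"hello")
for agent in (HashSetAgent(), FilePathAgent()):  # agents share one objective
    agent.examine(evidence)
print(evidence.findings)
```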
Procedia PDF Downloads 212
13772 Effect of the Tidal Charge Parameter on CMBR Temperature Anisotropies
Authors: Evariste Boj, Jan Schee
Abstract:
We present the temperature anisotropy of the cosmic microwave background radiation (CMBR) due to an inhomogeneity region constructed on a 3-brane in the framework of a Randall-Sundrum single-brane model immersed in a 5D bulk AdS₅ spacetime. We employ the Brane-World Friedmann-Lemaitre-Robertson-Walker (FLRW) cosmological model to describe the cosmic expansion on the brane. The inhomogeneity is modeled by a static, spherically symmetric spacetime that replaces the spherically symmetric part of a dust-filled universe and is connected to the FLRW spacetime through the junction conditions. As the vacuum region expands, it induces an additional frequency shift in a CMBR photon passing through this inhomogeneity, in comparison to a photon propagating through a pure FLRW spacetime. This frequency shift is associated with an effective temperature change of the CMBR in the corresponding direction. We estimate how the CMBR effective temperature changes with the value of the tidal charge parameter.
Keywords: CMBR, Randall-Sundrum model, Rees-Sciama effect, Braneworld
Procedia PDF Downloads 214
13771 The Urban Project and the Urban Improvement to the Test of the Participation, Case: Project of Modernization of Constantine
Authors: Mouhoubi Nedjima, Sassi Boudemagh Souad
Abstract:
In the framework of the modernization of the city of Constantine, and in order to restore its status as a regional metropolis and introduce it into the network of international metropolises, a major urban project was launched: the project of modernization and metropolitanization of the city of Constantine (PMMC). Our research focuses on the management of this project, and in particular on certain aspects of the urban project such as participation, with the objective of assessing the managerial approach employed. Among the revealing cases taken into account in our research on the participation of actors and their organizations is the operation concerning "urban improvement in the Brothers FERRAD housing estate in the district of Zouaghi". This operation, whose objective was to improve the living conditions of citizens, faced several challenges and obstacles that were in large part the factors of its failure. Through this study, we examine the management process and the mode of organization of the actors of the project, as well as the level of citizen participation, and finally propose managerial solutions to the conflict situations observed.
Keywords: the urban project, the urban improvement, participation, Constantine
Procedia PDF Downloads 400
13770 A Systematic Review on Development of a Cost Estimation Framework: A Case Study of Nigeria
Authors: Babatunde Dosumu, Obuks Ejohwomu, Akilu Yunusa-Kaltungo
Abstract:
Cost estimation in construction is often difficult, particularly when dealing with risks and uncertainties, which are inevitable and peculiar to developing countries like Nigeria. Their direct consequences are major deviations in cost, duration, and quality. The fundamental aim of this study is to develop a framework for assessing the impacts of risk on cost estimation, which in turn causes variability between the contract sum and the final account. This is very important, as initial estimates given to clients should reflect a certain magnitude of consistency and accuracy, upon which the client builds other planning-related activities; it also enhances the capability of construction industry professionals to better predict the final account from the contract sum. To this end, a systematic literature review was conducted with cost variability and construction projects as search strings within three databases: Scopus, Web of Science, and EBSCO (Business Source Premier); the results were further analyzed and gaps in knowledge or research identified. From the extensive review, it was found that the factors causing deviation between final accounts and contract sums ranged between 1 and 45. Besides, it was discovered that a cost estimation framework similar to the Building Cost Information Service (BCIS) is unavailable in Nigeria, which is a major reason why initial estimates are very often inconsistent, leading to project delay, abandonment, or determination, at the expense of the huge sums of money invested. It was concluded that the development of a cost estimation framework, adjudged an important tool in risk shedding rather than risk sharing in project risk management, would be a panacea for the cost estimation problems that lead to cost variability in the Nigerian construction industry, by the time this ongoing Ph.D. research is completed. It was recommended that practitioners in the construction industry always take risk into account in order to facilitate the rapid development of the construction industry in Nigeria, which should give stakeholders in both the private and public sectors a more in-depth understanding of estimation effectiveness and efficiency.
Keywords: cost variability, construction projects, future studies, Nigeria
Procedia PDF Downloads 209
13769 Exploring Open Process Innovation: Insights from a Systematic Review and Framework Development
Authors: Saeed Nayeri
Abstract:
This paper explores the feasibility of openness within firms' boundaries during process innovation and identifies the key determinants of open process innovation (OPI). Through a systematic review of 78 research studies published between 2001 and 2024, the author synthesized diverse findings into a comprehensive framework detailing OPI attributes and pillars. The identified OPI attributes encompass themes such as technology intensity, significance, magnitude, and locus of exploitation, while the OPI pillars include mechanisms, partners, achievements, and antecedents. Additionally, the author critically analysed gaps in the literature, proposing future research directions that advocate for a broader methodological approach, increased emphasis on theory development and testing, and more cross-national and cross-sectoral studies to advance understanding in this field.
Keywords: open innovation, process innovation, OPI attributes, systematic literature review, organizational openness
Procedia PDF Downloads 67
13768 Automated Process Quality Monitoring and Diagnostics for Large-Scale Measurement Data
Authors: Hyun-Woo Cho
Abstract:
Continuous monitoring of industrial plants is one of the necessary tasks when it comes to ensuring high-quality final products. In terms of monitoring and diagnosis, it is critical to detect incipient abnormal events in manufacturing processes in order to improve the safety and reliability of the operations involved and to reduce related losses. In this work, a new multivariate statistical online diagnostic method is presented using a case study. To build reference models, an empirical discriminant model is constructed based on various past operation runs. When a fault is detected online, an online diagnostic module is initiated. Finally, the status of the current operating conditions is compared with the reference model to make a diagnostic decision. The performance of the presented framework is evaluated using a dataset from complex industrial processes. It is shown that the proposed diagnostic method outperforms other techniques, especially in the early detection of incipient faults.
Keywords: data mining, empirical model, on-line diagnostics, process fault, process monitoring
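A minimal sketch of the discriminant-based diagnosis loop (illustrative only; the paper's empirical model and plant data are not reproduced here), using linear discriminant analysis over labeled historical runs:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Toy stand-ins for past operation runs: rows are samples of process
# measurements; labels name the known condition of each historical run.
X_normal = rng.normal(0.0, 1.0, size=(200, 5))
X_fault = rng.normal(1.5, 1.0, size=(200, 5))
X_hist = np.vstack([X_normal, X_fault])
y_hist = np.array(["normal"] * 200 + ["fault_A"] * 200)

# Reference model: an empirical discriminant trained on past runs.
model = LinearDiscriminantAnalysis().fit(X_hist, y_hist)

def diagnose(sample: np.ndarray) -> str:
    """Compare the current operating conditions against the reference model."""
    probs = model.predict_proba(sample.reshape(1, -1))[0]
    label = model.classes_[np.argmax(probs)]
    return f"{label} (p={probs.max():.2f})"

print(diagnose(rng.normal(1.4, 1.0, size=5)))   # likely flagged as fault_A
```

In an online setting, this diagnosis call would run only after a separate detection statistic flags a fault, mirroring the two-stage monitor-then-diagnose structure described above.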
Procedia PDF Downloads 401