Search results for: real time data processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 40633

38323 Knowledge Graph Development to Connect Earth Metadata and Standard English Queries

Authors: Gabriel Montague, Max Vilgalys, Catherine H. Crawford, Jorge Ortiz, Dava Newman

Abstract:

There has never been so much publicly accessible atmospheric and environmental data. The possibilities of these data are exciting, but the sheer volume of available datasets represents a new challenge for researchers. The task of identifying and working with a new dataset has become more difficult with the amount and variety of available data. Datasets are often documented in ways that differ substantially from the common English used to describe the same topics. This presents a barrier not only for new scientists, but also for researchers looking to draw comparisons across multiple datasets and for specialists from other disciplines hoping to collaborate. This paper proposes a method for addressing this obstacle: creating a knowledge graph to bridge the gap between everyday English and the technical language surrounding these datasets. Knowledge graph generation is already a well-established field, although Earth data pose some unique challenges. One is the sheer size of the databases – it would be infeasible to replicate or analyze all the data stored by an organization like the National Aeronautics and Space Administration (NASA) or the European Space Agency. Instead, this approach identifies topics from the metadata available for datasets in NASA’s Earthdata database, which can then be used to directly request and access the raw data from NASA. By starting with a single metadata standard, this paper establishes an approach that can be generalized to different databases, but leaves the challenge of metadata harmonization for future work. Topics generated from the metadata are then linked to topics from a collection of English queries through a variety of standard and custom natural language processing (NLP) methods. The results from this method are then compared to a baseline of Elasticsearch applied to the metadata. This comparison shows the benefits of the proposed knowledge graph system over existing methods, particularly in interpreting natural language queries and in resolving topics in the metadata. For the research community, this work introduces an application of NLP to the ecological and environmental sciences, expanding the possibilities of how machine learning can be applied in this discipline. But perhaps more importantly, it establishes the foundation for a platform that can let common English unlock knowledge that previously required considerable effort and experience. By making these public data accessible to the general public, this work has the potential to transform environmental understanding, engagement, and action.
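
As a rough illustration of the linking step described above, here is a minimal sketch that matches a plain-English query against dataset metadata using TF-IDF cosine similarity, a simple stand-in for the paper's knowledge-graph linking; the metadata records and query are hypothetical:

```python
# Minimal sketch: rank dataset metadata against an everyday-English query.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

metadata = [
    "MODIS Terra aerosol optical depth daily global product",
    "GPM IMERG precipitation estimates, 30-minute resolution",
    "AIRS atmospheric temperature and humidity profiles",
]
query = ["how much rain fell over the Amazon last month"]

vec = TfidfVectorizer(stop_words="english")
doc_matrix = vec.fit_transform(metadata)     # vectorize the metadata corpus
query_vec = vec.transform(query)             # project the query into it

scores = cosine_similarity(query_vec, doc_matrix).ravel()
for i in scores.argsort()[::-1]:             # best match first
    print(f"{scores[i]:.3f}  {metadata[i]}")
```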

Keywords: earth metadata, knowledge graphs, natural language processing, question-answer systems

Procedia PDF Downloads 149
38322 Analysis of Maternal Death Surveillance and Response: Causes and Contributing Factors in Addis Ababa, Ethiopia, 2022

Authors: Sisay Tiroro Salato

Abstract:

Background: Ethiopia has been implementing the maternal death surveillance and response system to provide real-time actionable information, including causes of death and contributing factors. An analysis of maternal mortality surveillance data was conducted to identify the causes and underlying factors in Addis Ababa, Ethiopia. Methods: We carried out a retrospective analysis of surveillance data on 324 maternal deaths reported in Addis Ababa, Ethiopia, from 2017 to 2021. The data were extracted from the national maternal death surveillance and response database, including information from case investigation, verbal autopsy, and facility extraction forms. The data were analyzed by computing frequencies and presented as numbers, proportions, and ratios. Results: Of the 324 maternal deaths, 92% occurred in health facilities, 6.2% in transit, and 1.5% at home. The mean age at death was 28 years, ranging from 17 to 45. The maternal mortality ratio was 77 per 100,000 live births for the five-year period, ranging from 126 in 2017 to 21 in 2021. Direct and indirect causes were responsible for 87% and 13% of deaths, respectively. The direct causes included obstetric haemorrhage, hypertensive disorders in pregnancy, puerperal sepsis, embolism, obstructed labour, and abortion. The third delay (delay in receiving care after reaching health facilities) accounted for 57% of deaths, while the first delay (delay in deciding to seek health care) and the second delay (delay in reaching health facilities) accounted for 34% and 24%, respectively. Late arrival at the referral facility, delayed management after admission, and non-recognition of danger signs were underlying factors. Conclusion: Over 86% of maternal deaths were attributable to avoidable direct causes. The majority of women do try to reach health services when an emergency occurs, but the third delay presents a major problem. Improving the quality of care at the healthcare facility level will help to reduce maternal deaths.
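
A quick back-of-envelope check of the summary statistics above (the live-birth denominator is a hypothetical value chosen to reproduce the reported 77 per 100,000; a death can involve more than one delay, so the shares need not sum to 100%):

```python
maternal_deaths = 324
live_births = 420_000  # hypothetical 2017-2021 total, not from the abstract

mmr = maternal_deaths / live_births * 100_000
print(f"Maternal mortality ratio: ~{mmr:.0f} per 100,000 live births")

# Share of deaths involving each delay (a death can involve several)
for name, share in {"first": 0.34, "second": 0.24, "third": 0.57}.items():
    print(f"{name} delay: ~{share * maternal_deaths:.0f} of {maternal_deaths} deaths")
```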

Keywords: maternal death, surveillance, delays, factors

Procedia PDF Downloads 113
38321 Optimal and Best Timing for Capturing Satellite Thermal Images of Concrete Object

Authors: Toufic Abd El-Latif Sadek

Abstract:

The term 'concrete object' refers to concrete areas, such as buildings. Extraction of concrete objects from satellite thermal images is easiest and most efficient at specific times of day across the year, when the times at which different objects show nearly the same brightness are avoided. The aim of the study is thus to identify the times that yield the best original data, and in turn better extraction of concrete objects and better analysis. The study used seven sample objects (asphalt, concrete, metal, rock, dry soil, vegetation, and water) located at one carefully investigated site, so that data for all objects were acquired at the same time and under the same weather conditions. The samples were placed on the roof of a building at a position fixed by global positioning system (GPS), with geographical coordinates latitude 33 degrees 37 minutes, longitude 35 degrees 28 minutes, and height 600 m. It was found that the first choice and best time in February is 2:00 pm, in March 4:00 pm, in April and May 12:00 pm, in August 5:00 pm, and in October 11:00 am. The best time in June and November is 2:00 pm.

Keywords: best timing, concrete areas, optimal, satellite thermal images

Procedia PDF Downloads 354
38320 Impact of Safety and Quality Considerations of Housing Clients on the Construction Firms’ Intention to Adopt Quality Function Deployment: A Case of Construction Sector

Authors: Saif Ul Haq

Abstract:

The current study examines the safety and quality considerations of clients of housing projects and their impact on the adoption of Quality Function Deployment (QFD) by construction firms. A mixed-methods research design was used to collect and analyze the data: a survey gathered data from 220 clients of housing projects in Saudi Arabia, and telephone and Skype interviews were then conducted with 15 professionals working in the top ten real estate companies of Saudi Arabia. Data were analyzed using partial least squares (PLS) and thematic analysis techniques. Findings reveal that today’s customers prioritize the safety and quality requirements of their houses and, as a result, construction firms adopt QFD to address the needs of customers. The findings are of great importance for the clients of housing projects as well as for construction firms, which could apply QFD in housing projects to address the safety and quality concerns of their clients.

Keywords: construction industry, quality considerations, quality function deployment, safety considerations

Procedia PDF Downloads 125
38319 Geological Structure Identification in Semilir Formation: Correlating Geological and Geophysical (Very Low Frequency) Data for Disaster Zonation with Current Density Parameters and Geological Surface Information

Authors: E. M. Rifqi Wilda Pradana, Bagus Bayu Prabowo, Meida Riski Pujiyati, Efraim Maykhel Hagana Ginting, Virgiawan Arya Hangga Reksa

Abstract:

The VLF (Very Low Frequency) method is an electromagnetic method that uses low frequencies between 10–30 kHz, which results in fairly deep penetration. In this study, the VLF method was used for zonation of disaster-prone areas by identifying geological structures in the form of faults. Data acquisition was carried out in the Trimulyo Region, Jetis District, Bantul Regency, Special Region of Yogyakarta, Indonesia, along 8 measurement paths. This study uses wave transmitters from Japan and Australia to obtain Tilt and Elipt (ellipticity) values that can be used to create RAE (Rapat Arus Ekuivalen, or equivalent current density) sections identifying areas that are easily crossed by electric current. Such a section indicates the existence of a geological structure in the form of a fault in the study area, characterized by a high RAE value. Processing of the VLF data yields a Tilt vs. Elipt graph and a Moving Average (MA) Tilt vs. MA Elipt graph for each path, which show a fluctuating pattern and no intersections at all. Data processing used Matlab software. Areas with low RAE values of 0%–6% indicate a medium with low conductivity and high resistivity and can be interpreted as sandstone, claystone, and tuff lithologies belonging to the Semilir Formation, whereas high RAE values of 10%–16% indicate a medium with high conductivity and low resistivity that can be interpreted as a fault zone filled with fluid. The existence of the fault zone is corroborated by the discovery of a normal fault on the surface with strike N55°W and dip 63°E at coordinates X = 433256 and Y = 9127722, so that residents' activities in the zone, such as housing and mining, can be avoided to reduce the risk of natural disasters.
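
A minimal sketch of the moving-average (MA) smoothing step mentioned above, applied to Tilt and Elipt profiles along one measurement path; the window length and sample values are assumptions, not the study's data:

```python
import numpy as np

def moving_average(profile, window=5):
    """Centered moving average along a VLF measurement path."""
    kernel = np.ones(window) / window
    return np.convolve(profile, kernel, mode="same")

tilt = np.array([2.1, 3.4, 1.8, -0.5, -2.2, 4.9, 6.3, 5.1, 2.0, 0.4])   # %
elipt = np.array([0.8, 1.1, 0.6, -0.2, -1.0, 1.9, 2.4, 2.0, 0.7, 0.1])  # %

print(np.round(moving_average(tilt), 2))   # MA Tilt profile
print(np.round(moving_average(elipt), 2))  # MA Elipt profile
```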

Keywords: current density, faults, very low frequency, zonation

Procedia PDF Downloads 175
38318 Satellite Interferometric Investigations of Subsidence Events Associated with Groundwater Extraction in Sao Paulo, Brazil

Authors: B. Mendonça, D. Sandwell

Abstract:

The Metropolitan Region of Sao Paulo (MRSP) has suffered from serious water scarcity. Consequently, the most convenient solution has been drilling wells to extract groundwater from local aquifers. However, this requires constant vigilance to prevent over-extraction and future events that could pose a serious threat to the population, such as subsidence. Radar imaging techniques (InSAR) allow continuous investigation of such phenomena. The analysis in the present study covers 23 SAR images dated from October 2007 to March 2011, obtained by the ALOS-1 spacecraft. Data processing was done with the software GMTSAR, using the InSAR technique to create pairs of interferograms capturing ground displacement over different time spans. First results show a correlation between the locations of 102 wells registered in 2009 and signals of ground displacement equal to or lower than -90 millimeters (mm) in the region. The longest-time-span interferogram obtained covers October 2007 to March 2010. From that interferogram, it was possible to estimate the average displacement velocity in millimeters per year (mm/y) and to identify the areas of the MRSP where strong signals persisted. Four specific areas with subsidence signals of 28 mm/y to 40 mm/y were chosen to investigate the phenomenon: Guarulhos (Sao Paulo International Airport), the Greater Sao Paulo, Itaquera, and Sao Caetano do Sul. The signals covered areas between 0.6 km and 1.65 km in length. All areas are located above a sedimentary type of aquifer. Itaquera and Sao Caetano do Sul showed signals varying from 28 mm/y to 32 mm/y. On the other hand, the places most likely to be suffering from stronger subsidence are the ones in the Greater Sao Paulo and Guarulhos, right beside the International Airport of Sao Paulo. The rate of displacement observed in both regions goes from 35 mm/y to 40 mm/y. Previous investigations of water use at the International Airport highlight the risks of the excessive water extraction that was being done through 9 deep wells. Therefore, subsidence events are likely to occur and to cause serious damage in the area. This study reveals a situation that has not been given proper attention in the city, despite its social and economic consequences. Since the data were only available until 2011, the question that remains is whether the situation still persists. It can be reaffirmed, however, that there is a scenario of risk at the International Airport of Sao Paulo that needs further investigation.
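
For context, a minimal sketch of the core InSAR conversion from unwrapped interferometric phase to line-of-sight displacement and annual velocity, assuming the approximate L-band wavelength of ALOS-1 PALSAR (about 23.6 cm); the phase samples are hypothetical:

```python
import numpy as np

WAVELENGTH_M = 0.236          # ALOS-1 PALSAR L-band wavelength, approx.
span_years = 2.4              # Oct 2007 - Mar 2010 interferogram span

phase_rad = np.array([1.2, 2.5, 4.8])              # unwrapped phase samples
los_disp_mm = -(WAVELENGTH_M / (4 * np.pi)) * phase_rad * 1000
velocity_mm_yr = los_disp_mm / span_years

print(np.round(los_disp_mm, 1))     # displacement over the whole span
print(np.round(velocity_mm_yr, 1))  # average velocity in mm/y
```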

Keywords: ground subsidence, Interferometric Satellite Aperture Radar (InSAR), metropolitan region of Sao Paulo, water extraction

Procedia PDF Downloads 354
38317 Development of an in vitro Fermentation Chicken Ileum Microbiota Model

Authors: Bello Gonzalez, Setten Van M., Brouwer M.

Abstract:

The chicken small intestine represents a dynamic and complex organ in which the enzymatic digestion and absorption of nutrients take place. The development of an in vitro fermentation chicken small intestinal model could be used as an alternative to explore the interaction between the microbiota and nutrient metabolism and to enhance the efficacy of targeted interventions to improve animal health. In the present study we have developed an in vitro fermentation chicken ileum microbiota model for unraveling the complex interactions of the ileum microbial community under physiological conditions. A two-vessel continuous fermentation process simulating in real time the physiological conditions of the ileum content (pH, temperature, microaerophilic/anoxic conditions, and peristaltic movements) has been standardized as a proof of concept. As inoculum, we used a pooled ileum microbial community obtained from broiler chickens at the age of 14 days. The development and validation of the model provide insight into the initial characterization of the ileum microbial community and its dynamics over time related to nutrient assimilation and fermentation. Samples can be collected at different time points and used to determine the microbial compositional structure, dynamics, and diversity over time. The results of studies using this in vitro model will serve as the foundation for the development of a whole small intestine in vitro fermentation chicken gastrointestinal model to complement our already established in vitro fermentation chicken caeca model. The insight gained from this model could provide information about nutritional strategies to restore and maintain chicken gut homeostasis. Moreover, the in vitro fermentation model will also allow us to study relationships between gut microbiota composition and its dynamics over time in association with nutrients, antimicrobial compounds, and disease modelling.

Keywords: broilers, in vitro model, ileum microbiota, fermentation

Procedia PDF Downloads 58
38316 R Data Science for Technology Management

Authors: Sunghae Jun

Abstract:

Technology management (TM) is an important issue for a company seeking to improve its competitiveness. Among the many activities of TM, technology analysis (TA) is an important factor, because most decisions in the management of technology are based on the results of TA. TA analyzes the developed results of a target technology using statistics or Delphi. TA based on Delphi depends on experts’ domain knowledge; in comparison, TA based on statistics and machine learning algorithms uses objective data, such as patents or papers, instead of experts’ knowledge. Many quantitative TA methods based on statistics and machine learning have been studied, and these have been used for technology forecasting, technological innovation, and management of technology. They apply diverse computing tools and many analytical methods case by case, and it is not easy to select the suitable software and statistical method for a given TA task. So, in this paper, we propose a methodology for quantitative TA using the statistical computing software R and data science to construct a general framework for TA. With the results of a case study, we also show how our methodology is applied in the real field. This research contributes to R&D planning and technology valuation in TM areas.

Keywords: technology management, R system, R data science, statistics, machine learning

Procedia PDF Downloads 458
38315 Time-dependent Association between Recreational Cannabinoid Use and Memory Performance in Healthy Adults: A Neuroimaging Study of Human Connectome Project

Authors: Kamyar Moradi

Abstract:

Background: There is mixed evidence regarding the association between recreational cannabinoid use and memory performance. One of the major reasons for the present controversy is the set of cannabinoid use-related covariates that influence the cognitive status of an individual. Adjusting for these confounding variables provides more accurate insight into the real effects of cannabinoid use on memory status. In this study, we sought to investigate the association between recent recreational cannabinoid use and memory performance while correcting the model for other possible covariates such as demographic characteristics and the duration and amount of cannabinoid use. Methods: Cannabinoid users were assigned to two groups based on the results of a THC urine drug screen test (THC+ group: n = 110, THC- group: n = 410). The THC urine drug screen test has high sensitivity and specificity in detecting cannabinoid use in the last 3–4 weeks. The memory domain of the NIH Toolbox battery and brain MRI volumetric measures were compared between the groups while adjusting for confounding variables. Results: After Benjamini-Hochberg p-value correction, performance in all of the measured memory outcomes, including vocabulary comprehension, episodic memory, executive function/cognitive flexibility, processing speed, reading skill, working memory, and fluid cognition, was significantly weaker in the THC+ group (p values less than 0.05). Also, the volumes of gray matter and of the left supramarginal, right precuneus, right inferior/middle temporal, right hippocampus, left entorhinal, and right pars orbitalis regions were significantly smaller in the THC+ group. Conclusions: This study provides evidence regarding the acute effect of recreational cannabis use on memory performance. Further studies are warranted to confirm the results.
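
A minimal sketch of the Benjamini-Hochberg correction step mentioned above, using statsmodels; the p-values are hypothetical placeholders, not the study's results:

```python
from statsmodels.stats.multitest import multipletests

p_values = [0.001, 0.004, 0.012, 0.021, 0.030, 0.047, 0.080]
reject, p_adjusted, _, _ = multipletests(p_values, alpha=0.05, method="fdr_bh")

for p, p_adj, sig in zip(p_values, p_adjusted, reject):
    print(f"raw={p:.3f}  BH-adjusted={p_adj:.3f}  significant={sig}")
```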

Keywords: brain MRI, cannabis, memory, recreational use, THC urine test

Procedia PDF Downloads 197
38314 A Reduced Ablation Model for Laser Cutting and Laser Drilling

Authors: Torsten Hermanns, Thoufik Al Khawli, Wolfgang Schulz

Abstract:

In laser cutting, as well as in long-pulsed laser drilling of metals, it can be demonstrated that the ablation shape (the shape of the cut faces or of the hole, respectively) approaches a so-called asymptotic shape, such that it changes only slightly or not at all with further irradiation. These findings are already known from the ultrashort pulse (USP) ablation of dielectric and semiconducting materials. The explanation for the occurrence of an asymptotic shape in laser cutting and long-pulse drilling of metals is identified, and its underlying mechanism is numerically implemented, tested, and clearly confirmed by comparison with experimental data. In detail, there now is a model that allows the simulation of the temporal (pulse-resolved) evolution of the hole shape in laser drilling as well as of the final (asymptotic) shape of the cut faces in laser cutting. This simulation requires far fewer resources, such that it can even run on common desktop PCs or laptops. Individual parameters can be adjusted using sliders – the simulation result appears in an adjacent window and changes in real time. This is made possible by an application-specific reduction of the underlying ablation model. Because this reduction dramatically decreases the complexity of calculation, it produces a result much more quickly. This means that the simulation can be carried out directly at the laser machine. Time-intensive experiments can be reduced and set-up processes can be completed much faster. The high speed of simulation also opens up a range of entirely different options, such as metamodeling. Suitable for complex applications with many parameters, metamodeling involves generating high-dimensional data sets with the parameters and several evaluation criteria for process and product quality. These sets can then be used to create individual process maps that show the dependency of individual parameter pairs. This advanced simulation makes it possible to find global and local extreme values through mathematical manipulation. Such simultaneous optimization of multiple parameters is scarcely possible by experimental means. This means that new methods in manufacturing, such as self-optimization, can be executed much faster. However, the software’s potential does not stop there; time-intensive calculations exist in many areas of industry. In laser welding or laser additive manufacturing, for example, the simulation of thermally induced residual stresses still uses up considerable computing capacity or is not even possible. Transferring the principle of reduced models promises substantial savings there, too.
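
As a sketch of the metamodeling idea described above, a surrogate model can be fitted to a data set generated by the reduced model and then queried almost instantly; the two parameters and the quality metric below are hypothetical placeholders:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)
# Hypothetical process parameters: e.g. laser power and feed speed
X = rng.uniform([1.0, 0.1], [5.0, 2.0], size=(50, 2))
# Mock quality criterion standing in for the reduced model's output
y = np.sin(X[:, 0]) * X[:, 1] + 0.01 * rng.standard_normal(50)

surrogate = GaussianProcessRegressor().fit(X, y)

# The fitted surrogate answers "what if" queries in milliseconds, which
# is what enables interactive sliders and process-map exploration.
print(surrogate.predict([[3.0, 1.0]]))
```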

Keywords: asymptotic ablation shape, interactive process simulation, laser drilling, laser cutting, metamodeling, reduced modeling

Procedia PDF Downloads 214
38313 Mindfulness, Reinvestment, and Rowing under Pressure: Evidence for Moderated Moderation of the Anxiety-Performance Relationship

Authors: Katherine Sparks, Christopher Ring

Abstract:

This study investigated whether dispositional sport-specific mindfulness moderates the moderating effect of conscious processing on the relationship between anxiety and rowing race performance. Using a Rowing-Specific Reinvestment Scale (RSRS) to measure state conscious processing, we examined the effects of trait sport-related mindfulness on the conscious processing of rowers under competitive racing pressure at a number of UK regattas, where 276 rowers completed a post-race survey. The survey included the RSRS, a mindfulness measure, a perceived performance rating scale, and demographic and race information used to identify and record each rower’s actual race performance. Results demonstrated that high levels of dispositional mindfulness are associated with superior performance under pressure. With respect to the moderated moderation effect, conscious processing amplifies the detrimental effects of anxiety on performance; however, mindfulness, mindful awareness, and mindful non-judgement all attenuated this amplification by moderating the conscious-processing moderation of the anxiety-performance relationship. This study therefore provides initial support for the speculation that dispositional mindfulness can help prevent the deleterious effects of rowing-specific reinvestment under pressure.
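
Statistically, a moderated moderation of this kind is commonly tested as a three-way interaction in a regression model. A minimal sketch under that assumption (column names and the simulated data are hypothetical, not the study's):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 276
df = pd.DataFrame({
    "anxiety": rng.normal(size=n),
    "reinvestment": rng.normal(size=n),   # conscious processing (RSRS)
    "mindfulness": rng.normal(size=n),
})
# Simulated outcome: mindfulness dampens the anxiety x reinvestment effect
df["performance"] = (-0.3 * df.anxiety
                     - 0.2 * df.anxiety * df.reinvestment
                     + 0.15 * df.anxiety * df.reinvestment * df.mindfulness
                     + rng.normal(scale=0.5, size=n))

model = smf.ols("performance ~ anxiety * reinvestment * mindfulness", df).fit()
print(model.summary().tables[1])  # the three-way term carries the hypothesis
```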

Keywords: mindful, reinvestment, under pressure, performance, rowing

Procedia PDF Downloads 156
38312 Analysis of Big Data

Authors: Sandeep Sharma, Sarabjit Singh

Abstract:

Given user demand and the growth trends of large volumes of freely generated data, storage solutions are becoming ever more challenging with respect to protecting, storing, and retrieving data. The day is not far off when storage companies and organizations will start saying 'no' to storing our valuable data, or will start charging a huge amount for its storage and protection. On the other hand, environmental conditions make it challenging to establish and maintain new data warehouses and data centers in the face of global warming threats. The challenge of small data is over; the big challenge now is how to manage the exponential growth of data. In this paper we have analyzed the growth trend of big data and its future implications. We have also focused on the impact of unstructured data on various concerns, and we have suggested some possible remedies to streamline big data.

Keywords: big data, unstructured data, volume, variety, velocity

Procedia PDF Downloads 548
38311 Duality of Leagility and Governance: A New Normal Demand Network Management Paradigm under Pandemic

Authors: Jacky Hau

Abstract:

The prevalence of emerging technologies disrupts various industries as well as consumer behavior. Data collection is now at our fingertips through enabled Internet-of-Things (IoT) devices. Big data analytics (BDA) has become possible and allows real-time demand network management (DNM) through a leagile supply chain. To further enhance its resilience and predictability, governance is examined as a means to promote supply chain transparency and trust in an efficient manner. Leagility combines lean thinking and agile techniques in supply chain management. It aims at reducing costs and waste, as well as maintaining responsiveness to volatile consumer demand, by adjusting the decoupling point where the product flow changes from push to pull. Leagility can only succeed when a collaborative planning, forecasting, and replenishment (CPFR) process or the like is in place throughout the supply chain's business entities. Governance and procurement of the supply chain, however, are crucial and challenging for the execution of CPFR, as every entity has to walk the talk generously for the sake of the overall benefits of supply chain performance, not to mention the complexity of exercising the policies both within and across the various supply chain business entities on account of organizational behavior and mutual trust. Empirical survey results showed that the effective timespan of demand forecasting has been shortening drastically, from a planning horizon of months to one of weeks; thus agility should come first, preferably followed by a lean approach in a timely manner.

Keywords: governance, leagility, procure-to-pay, source-to-contract

Procedia PDF Downloads 111
38310 Influence of Ball Milling Time on Mechanical Properties of Porous Ti-20Nb-5Ag Alloy

Authors: M. J. Shivaram, Shashi Bhushan Arya, Jagannath Nayak, Bharat Bhooshan Panigrahi

Abstract:

Titanium and its alloys have become significant implant materials due to their mechanical properties, excellent biocompatibility, and high corrosion resistance. Biomaterials can be produced using powder metallurgy (PM) methods, and the required structural and mechanical properties can be tailored by varying processing parameters such as ball milling time, space-holder particles, and sintering temperature. The present study deals with the fabrication of solid and porous Ti-20Nb-5Ag alloy using high-energy ball milling for different times (5 and 20 h). The resultant powder particles were used to fabricate solid and porous Ti-20Nb-5Ag alloy by adding space-holder particles (NH4HCO3). The resultant powders and the fabricated solid and porous samples were characterized by scanning electron microscopy (SEM), and the compressive strength, elastic modulus, and microhardness were investigated. Solid and porous Ti-20Nb-5Ag alloy samples showed better mechanical properties for the 20 h ball milling time compared to 5 h ball milling.

Keywords: ball milling, compressive strengths, microstructure, porous titanium alloy

Procedia PDF Downloads 300
38309 Prevention of Student Radicalism in School through Civic Education

Authors: Triyanto

Abstract:

Radicalism poses a real threat to Indonesia's future. The target of radicalism is the youth of Indonesia, as evidenced by the fact that the majority of terrorists are young people. Countering radicalization requires not only repressive measures but also educational action. One such educational effort is civic education. This study discusses the prevention of radicalism among students through civic education and the constraints involved. This is a qualitative study. Data were collected through literature studies, observations, and in-depth interviews, and were validated by triangulation. The sample of this research is 30 high school students in Surakarta. Data were analyzed using the interactive model of analysis of Miles & Huberman. The results show that (1) civic education can be a way of preventing student radicalism in schools by cultivating educational values through learning inside and outside the classroom; (2) the obstacles encountered include a lack of learning facilities, the limited ability of teachers, and the low attention of students to civic education.

Keywords: prevention, radicalism, senior high school student, civic education

Procedia PDF Downloads 232
38308 A Novel Approach of NPSO on Flexible Logistic (S-Shaped) Model for Software Reliability Prediction

Authors: Pooja Rani, G. S. Mahapatra, S. K. Pandey

Abstract:

In this paper, we propose a novel approach combining neural network and particle swarm optimization methods for software reliability prediction. We first explain how to apply a compound function in a neural network so that we can derive a Flexible Logistic (S-shaped) Growth Curve (FLGC) model. This model mathematically represents software failure as a random process and can be used to evaluate software development status during testing. To avoid trapping in local minima, we apply the particle swarm optimization method to train the proposed model using failure test data sets. We derive our proposed model using computation-based intelligence modeling; thus, the proposed model becomes a Neuro-Particle Swarm Optimization (NPSO) model. We test results with different inertia weights used in the particle position and velocity updates, and we compare the results obtained with the best inertia weight against a personal-best-oriented PSO (pPSO), which helps to choose the local best in the network neighborhood. The applicability of the proposed model is demonstrated on a real failure test data set. The results obtained from experiments show that the proposed model has fairly accurate prediction capability for software reliability.
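
To make the optimization step concrete, here is a minimal sketch of fitting an S-shaped (logistic) growth curve to cumulative failure counts with a basic PSO; the data, parameter bounds, and inertia weight are illustrative assumptions rather than the paper's settings:

```python
import numpy as np

t = np.arange(1, 11, dtype=float)                 # test weeks
failures = np.array([3, 7, 14, 25, 38, 50, 58, 63, 66, 68], dtype=float)

def logistic(params, t):
    a, b, c = params                              # capacity, rate, offset
    return a / (1.0 + c * np.exp(-b * t))

def cost(params):
    return np.mean((failures - logistic(params, t)) ** 2)

rng = np.random.default_rng(0)
n_particles, w, c1, c2 = 30, 0.7, 1.5, 1.5        # inertia, cognitive, social
pos = rng.uniform([50, 0.1, 1], [100, 2, 50], size=(n_particles, 3))
vel = np.zeros_like(pos)
pbest, pbest_cost = pos.copy(), np.array([cost(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(200):
    r1, r2 = rng.random((2, n_particles, 3))      # stochastic pull factors
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos += vel
    costs = np.array([cost(p) for p in pos])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print("fitted (a, b, c):", np.round(gbest, 3), " MSE:", round(cost(gbest), 2))
```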

Keywords: software reliability, flexible logistic growth curve model, software cumulative failure prediction, neural network, particle swarm optimization

Procedia PDF Downloads 344
38307 Improving the Logistic System to Secure Effective Food Fish Supply Chain in Indonesia

Authors: Atikah Nurhayati, Asep A. Handaka

Abstract:

Indonesia is a major world fish producer, which can feed not only its own citizens but also the people of the world. Currently, total annual production is about 11 million tons and is expected to double by the year 2050. Given this potential, fishery has been an important part of the national food security system in Indonesia. Despite such potential, a big challenge faces Indonesians in making fish a reliable source of food, more specifically of protein intake. The long geographic distance between the fish production centers and the concentrations of consumers has prevented an effective supply chain from producers to consumers and therefore demands a good logistic system. This paper is based on our research, which aimed at analyzing the fish supply chain and suggesting relevant improvements to the chain. The research was conducted in 2016 in selected locations on Java Island, where intensive transactions in fishery commodities occur. Data used in this research comprise secondary time-series data from reports on production and distribution, and primary data regarding distribution aspects collected through interviews with 100 purposively selected respondents representing fishers, traders, and processors. The data were analyzed following the supply chain management framework and processed using logistic regression and validity tests. The main findings are as follows. Firstly, improperly managed connectivity and logistic chains are the main cause of insecure availability and affordability for consumers. Secondly, the lack of quality of most local processed products is a major obstacle to improving affordability and connectivity. The paper concludes with a number of recommended strategies to tackle the problem, including rationalization of the length of the existing supply chain, intensification of processing activities, and improvement of distribution infrastructure and facilities.

Keywords: fishery, food security, logistic, supply chain

Procedia PDF Downloads 241
38306 A Data-Driven Optimal Control Model for the Dynamics of Monkeypox in a Variable Population with a Comprehensive Cost-Effectiveness Analysis

Authors: Martins Onyekwelu Onuorah, Jnr Dahiru Usman

Abstract:

Introduction: In the realm of public health, the threat posed by Monkeypox continues to elicit concern, prompting rigorous studies to understand its dynamics and devise effective containment strategies. Particularly significant is its recurrence in variable populations, such as the observed outbreak in Nigeria in 2022. In light of this, our study undertakes a meticulous analysis, employing a data-driven approach to explore, validate, and propose optimized intervention strategies tailored to the distinct dynamics of Monkeypox within varying demographic structures. Utilizing a deterministic mathematical model, we delved into the intricate dynamics of Monkeypox, with a particular focus on a variable-population context. Our qualitative analysis provided insights into the disease-free equilibrium, revealing its stability when R0 is less than one and discounting the possibility of backward bifurcation, as substantiated by the presence of a single stable endemic equilibrium. The model was rigorously validated using real-time data from Nigeria's recorded 2022 cases for epidemiological weeks 1–52. Transitioning from qualitative to quantitative, we augmented our deterministic model with optimal control, introducing three time-dependent interventions to scrutinize their efficacy and influence on the epidemic's trajectory. Numerical simulations unveiled a pronounced impact of the interventions, offering a data-supported blueprint for informed decision-making in containing the disease. A comprehensive cost-effectiveness analysis employing the Infection Averted Ratio (IAR), Average Cost-Effectiveness Ratio (ACER), and Incremental Cost-Effectiveness Ratio (ICER) facilitated a balanced evaluation of the interventions' economic and health impacts. In essence, our study epitomizes a holistic approach to understanding and mitigating Monkeypox, intertwining rigorous mathematical modeling, empirical validation, and economic evaluation. The insights derived not only bolster our comprehension of Monkeypox's intricate dynamics but also unveil optimized, cost-effective interventions. This integration of methodologies and findings underscores a pivotal stride towards aligning public health imperatives with economic sustainability, marking a significant contribution to global efforts in combating infectious diseases.
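
For reference, the three cost-effectiveness metrics named above are commonly defined as follows in optimal-control epidemic studies (standard forms assumed here, not quoted from the paper), where C_i and E_i denote the total cost and total infections averted of strategy i:

```latex
\begin{align}
  \mathrm{IAR}  &= \frac{\text{number of infections averted}}
                        {\text{number of recovered individuals}} \\[4pt]
  \mathrm{ACER} &= \frac{\text{total cost of the strategy}}
                        {\text{total infections averted}} \\[4pt]
  \mathrm{ICER}_{ij} &= \frac{C_{i} - C_{j}}{E_{i} - E_{j}}
\end{align}
```

A strategy with a lower ACER averts infections more cheaply overall, while the ICER compares each strategy incrementally against the next-less-effective alternative.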

Keywords: monkeypox, equilibrium states, stability, bifurcation, optimal control, cost-effectiveness

Procedia PDF Downloads 86
38305 Mechanical Analysis and Characterization of Friction Stir Processed Aluminium Alloy

Authors: Jaswinder Kumar, Kulbir Singh Sandhu

Abstract:

Friction stir processing (FSP) is a solid-state surface processing technique. Single-pass FSP was performed on an aluminium alloy at different tool rotational speeds using a tool with a cylindrical threaded pin profile, and the effect of these parameters on tribological properties was studied. Wear resistance was found to increase from the base metal to the single-pass FSP sample. The results revealed that with an increase in tool rotational speed, the wear rate increases: high heat generation causes matrix softening, which results in an increased wear rate, and it also leads to coarse grains, which likewise affect tribological properties. Furthermore, microstructure results showed that the FSPed alloy has a more refined grain structure compared to the base material, which may result in enhanced hardness and wear resistance after FSP.

Keywords: friction stir processing, aluminium alloy, microhardness, microstructure

Procedia PDF Downloads 109
38304 The Effect of User Comments on Traffic Application Usage

Authors: I. Gokasar, G. Bakioglu

Abstract:

With the unprecedented pace of technological improvement, people have started solving their problems with the help of technological tools. According to the application stores and websites on which people evaluate and comment on traffic apps, there are more than 100 traffic applications with different features depending on their purpose, ranging from features of traffic apps for public transit modes to features of traffic apps for private cars. This study focuses on the top 30 traffic applications, chosen by their download counts. All data about the traffic applications were obtained from the related websites. The purpose of this study is to analyze traffic applications in terms of their categorical attributes by developing a regression model. The analysis results suggest that negative interpretations (e.g., being deficient) do not lead to lower star ratings of the applications; however, those negative interpretations result in a smaller increase in the star rating. In addition, women give higher star ratings than men when evaluating traffic applications.
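
A minimal sketch of the kind of regression with dummy variables described above, with the star rating modeled on app attributes; the column names and data are hypothetical:

```python
import pandas as pd
import statsmodels.formula.api as smf

apps = pd.DataFrame({
    "star_rate": [4.5, 3.8, 4.1, 2.9, 4.7, 3.5],
    "downloads_m": [50, 5, 12, 1, 80, 3],           # millions of downloads
    "has_negative_comments": [0, 1, 0, 1, 0, 1],    # dummy variable
    "mode": ["car", "transit", "car", "transit", "car", "car"],
})

# C() expands the categorical column into dummy variables automatically
model = smf.ols("star_rate ~ downloads_m + has_negative_comments + C(mode)",
                apps).fit()
print(model.params)
```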

Keywords: traffic app, real–time information, traffic congestion, regression analysis, dummy variables

Procedia PDF Downloads 429
38303 Towards a Distributed Computation Platform Tailored for Educational Process Discovery and Analysis

Authors: Awatef Hicheur Cairns, Billel Gueni, Hind Hafdi, Christian Joubert, Nasser Khelifa

Abstract:

Given the ever-changing needs of the job markets, education and training centers are increasingly held accountable for student success. Therefore, education and training centers have to focus on ways to streamline their offerings and educational processes in order to achieve the highest level of quality in curriculum contents and managerial decisions. Educational process mining is an emerging field in the educational data mining (EDM) discipline, concerned with developing methods to discover, analyze, and provide a visual representation of complete educational processes. In this paper, we present our distributed computation platform, which allows different education centers and institutions to load their data and access advanced data mining and process mining services. To this end, we also present a comparative study of the different clustering techniques developed in the context of process mining to efficiently partition educational traces. Our goal is to find the best strategy for distributing heavy analysis computations over the many processing nodes of our platform.
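
As a small illustration of the trace-partitioning step discussed above, a common baseline encodes each educational trace as an activity-frequency vector and clusters the vectors with k-means; the activity names and traces below are hypothetical:

```python
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import CountVectorizer

traces = [
    "enroll lecture lecture quiz exam",
    "enroll lecture quiz quiz exam",
    "enroll project review project exam",
    "enroll project project review exam",
]
X = CountVectorizer().fit_transform(traces)   # activity-frequency profiles

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)  # traces in the same cluster go to the same discovery node
```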

Keywords: educational process mining, distributed process mining, clustering, distributed platform, educational data mining, ProM

Procedia PDF Downloads 454
38302 Parameters Identification and Sensitivity Study for Abrasive WaterJet Milling Model

Authors: Didier Auroux, Vladimir Groza

Abstract:

This work is part of the STEEP Marie Curie ITN project, and it focuses on the identification of unknown parameters of the proposed generic Abrasive WaterJet Milling (AWJM) PDE model, which appears as an ill-posed inverse problem. The necessity of studying this problem comes from industrial milling applications, where the ability to predict and model the final surface with high accuracy is one of the primary tasks in the absence of any knowledge of the model parameters that should be used. In this framework, we propose to identify the model parameters by minimizing a cost function measuring the difference between experimental and numerical solutions. The adjoint approach, based on the corresponding Lagrangian, makes it possible to find the unknowns of the AWJM model and their optimal values, which could be used to reproduce the required trench profile. Due to the complexity of the nonlinear problem and the large number of model parameters, we use an automatic differentiation software tool (TAPENADE) for the adjoint computations. By adding noise to the artificial data, we show that the parameter identification problem is in fact highly unstable and strictly depends on the input measurements. Regularization terms can be used effectively to deal with the presence of data noise and to improve the correctness of the identification. Based on this approach, we present 2D and 3D results for the identification of the model parameters and for the surface prediction, both with self-generated data and with measurements obtained from real production. Considering different types of model and measurement errors allows us to obtain acceptable results for manufacturing and to expect proper identification of the unknowns. This approach also allows us to extend the research to more complex cases, such as a 3D time-dependent model with variations of the jet feed speed.
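
A generic form of the cost functional described above, written as a plausible sketch (notation assumed, not taken from the paper): a misfit term between simulated and measured surfaces plus a Tikhonov regularization term that counteracts the instability noted above:

```latex
\begin{equation}
  J(\theta) = \frac{1}{2} \int_{\Omega}
      \left( u_{\mathrm{num}}(x;\theta) - u_{\mathrm{exp}}(x) \right)^{2} dx
      + \frac{\alpha}{2} \left\lVert \theta - \theta_{0} \right\rVert^{2}
\end{equation}
```

Here u_num(x; theta) is the surface predicted by the AWJM model with parameters theta, u_exp the measured trench profile, theta_0 a prior guess, and alpha the regularization weight; the adjoint (Lagrangian) approach supplies the gradient of J at the cost of roughly one extra PDE solve per iteration.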

Keywords: Abrasive Waterjet Milling, inverse problem, model parameters identification, regularization

Procedia PDF Downloads 316
38301 Application of Granular Computing Paradigm in Knowledge Induction

Authors: Iftikhar U. Sikder

Abstract:

This paper illustrates an application of the granular computing approach, namely rough set theory, in data mining. The paper outlines the formalism of granular computing and elucidates the mathematical underpinnings of rough set theory, which has been widely used by the data mining and machine learning communities. A real-world application is illustrated, and the classification performance is compared with other contending machine learning algorithms. The predictive performance of the rough set rule induction model shows comparable success with respect to the other contending algorithms.
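
A minimal sketch of the rough-set machinery behind such rule induction: lower and upper approximations of a target concept computed from the equivalence classes induced by condition attributes; the toy decision table is hypothetical:

```python
from collections import defaultdict

# Toy decision table: object id -> tuple of condition-attribute values
table = {
    1: ("high", "yes"), 2: ("high", "yes"), 3: ("high", "yes"),
    4: ("low", "no"),   5: ("low", "no"),   6: ("mid", "yes"),
}
target = {1, 2, 6}  # objects belonging to the concept of interest

# Equivalence classes: objects indiscernible on the condition attributes
classes = defaultdict(set)
for obj, attrs in table.items():
    classes[attrs].add(obj)

lower = set().union(*(c for c in classes.values() if c <= target))
upper = set().union(*(c for c in classes.values() if c & target))

print("lower approximation:", lower)       # certainly in the concept
print("upper approximation:", upper)       # possibly in the concept
print("boundary region:", upper - lower)   # where rules remain uncertain
```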

Keywords: concept approximation, granular computing, reducts, rough set theory, rule induction

Procedia PDF Downloads 531
38300 Design and Evaluation of a Prototype for Non-Invasive Screening of Diabetes – Skin Impedance Technique

Authors: Pavana Basavakumar, Devadas Bhat

Abstract:

Diabetes is a disease which often goes undiagnosed until its secondary effects are noticed. Early detection of the disease is necessary to avoid serious consequences which could lead to the death of the patient. Conventional invasive tests for the screening of diabetes are mostly painful, time-consuming, and expensive, and a risk of infection is also involved; it is therefore essential to develop non-invasive methods to screen for diabetes and estimate the level of blood glucose. Extensive research is going on from this perspective, involving various techniques that explore optical, electrical, chemical, and thermal properties of the human body that directly or indirectly depend on the blood glucose concentration. Thus, non-invasive blood glucose monitoring has grown into a vast field of research. In this project, an attempt was made to devise a prototype for the screening of diabetes by measuring the electrical impedance of the skin and building a model to predict a patient’s condition based on the measured impedance. The developed prototype passes a negligible constant current (0.5 mA) across a subject’s index finger through tetrapolar silver electrodes and measures the output voltage across a wide range of frequencies (10 kHz–4 MHz). The measured voltage is proportional to the impedance of the skin. The impedance was acquired in real time for further analysis. The study was conducted on over 75 subjects with permission from the institutional ethics committee; along with impedance, each subject’s blood glucose values were also noted using the conventional method. Nonlinear regression analysis was performed on the features extracted from the impedance data to obtain a model that predicts blood glucose values for a given set of features. When the predicted data were depicted on Clarke’s Error Grid, only 58% of the predicted values were clinically acceptable. Since the objective of the project was to screen for diabetes rather than to estimate blood glucose exactly, the data were classified into three classes, ‘NORMAL FASTING’, ‘NORMAL POSTPRANDIAL’, and ‘HIGH’, using a linear Support Vector Machine (SVM). The classification accuracy obtained was 91.4%. The developed prototype is economical, fast, and pain-free, and can thus be used for mass screening of diabetes.
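
A minimal sketch of the final three-class screening step with a linear SVM; the impedance-derived features and labels below are synthetic stand-ins for the study's measurements:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
# Hypothetical features, e.g. impedance at low/high frequency plus slope
X = rng.normal(size=(75, 3)) + np.repeat(
    [[0, 0, 0], [1, 1, 0], [2, 2, 1]], 25, axis=0)
y = np.repeat(["NORMAL FASTING", "NORMAL POSTPRANDIAL", "HIGH"], 25)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = make_pipeline(StandardScaler(), LinearSVC()).fit(X_train, y_train)
print(f"accuracy: {clf.score(X_test, y_test):.2f}")
```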

Keywords: Clarke’s error grid, electrical impedance of skin, linear SVM, nonlinear regression, non-invasive blood glucose monitoring, screening device for diabetes

Procedia PDF Downloads 325
38299 Design and Implementation of an AI-Enabled Task Assistance and Management System

Authors: Arun Prasad Jaganathan

Abstract:

In today's dynamic industrial world, traditional task allocation methods often fall short in adapting to evolving operational conditions. This paper introduces an AI-enabled task assistance and management system designed to overcome the limitations of conventional approaches. By using artificial intelligence (AI) and machine learning (ML), the system intelligently interprets user instructions, analyzes tasks, and allocates resources based on real-time data and environmental factors. Additionally, geolocation tracking enables proactive identification of potential delays, ensuring timely interventions. With its transparent reporting mechanisms, the system provides stakeholders with clear insights into task progress, fostering accountability and informed decision-making. The paper presents a comprehensive overview of the system architecture, algorithm, and implementation, highlighting its potential to revolutionize task management across diverse industries.

Keywords: artificial intelligence, machine learning, task allocation, operational efficiency, resource optimization

Procedia PDF Downloads 59
38298 Optical Variability of Faint Quasars

Authors: Kassa Endalamaw Rewnu

Abstract:

The variability properties of a quasar sample, spectroscopically complete to magnitude J = 22.0, are investigated on a time baseline of 2 years using three different photometric bands (U, J and F). The original sample was obtained using a combination of different selection criteria: colors, slitless spectroscopy and variability, based on a time baseline of 1 yr. The main goals of this work are two-fold: first, to derive the percentage of variable quasars on a relatively short time baseline; secondly, to search for new quasar candidates missed by the other selection criteria; and, thus, to estimate the completeness of the spectroscopic sample. In order to achieve these goals, we have extracted all the candidate variable objects from a sample of about 1800 stellar or quasi-stellar objects with limiting magnitude J = 22.50 over an area of about 0.50 deg². We find that > 65% of all the objects selected as possible variables are either confirmed quasars or quasar candidates on the basis of their colors. This percentage increases even further if we exclude from our lists of variable candidates a number of objects equal to that expected on the basis of 'contamination' induced by our photometric errors. The percentage of variable quasars in the spectroscopic sample is also high, reaching about 50%. On the basis of these results, we can estimate that the incompleteness of the original spectroscopic sample is < 12%. We conclude that variability analysis of data with small photometric errors can be successfully used as an efficient and independent (or at least auxiliary) selection method in quasar surveys, even when the time baseline is relatively short. Finally, when corrected for the different intrinsic time lags corresponding to a fixed observed time baseline, our data do not show a statistically significant correlation between variability and either absolute luminosity or redshift.

Keywords: nuclear activity, galaxies, active quasars, variability

Procedia PDF Downloads 82
38297 Impact of Air Flow Structure on Distinct Shape of Differential Pressure Devices

Authors: A. Bertašienė

Abstract:

Energy harvesting from any structure poses a challenge. The different structures of air/wind flows in industrial, environmental, and residential applications call for detailed investigation of the real flows. Many application fields can hardly be described in detail due to the lack of up-to-date statistical data analysis. In situ measurements require substantial investment, so simulation methods are used to implement structural analysis of the flows. Different configurations of the testing environment give an overview of how important the simple structure of the field in a limited area is for the efficiency of system operation and the energy output. Several configurations of modeled working sections in an air flow test facility were implemented in the ANSYS CFD environment to compare, experimentally and numerically, the air flow development stages and forms that affect the efficiency of devices and processes. The effective form and amount of these flows under different geometries determine the choice of instruments/devices that measure fluid flow parameters for effective system operation and for quantifying emission flows. Different fluid flow regimes were examined to show the impact of fluctuations on the development of the whole volume of the flow in a specific environment. The obtained results raise the discussion of how similar these simulated flow fields are to real application ones. Experimental results show some discrepancies from the simulations, owing to the models applied to flow analysis in the initial (not fully developed) region and to the difficulty the models have in covering transitional regimes. Recommendations are essential for energy harvesting systems in both indoor and outdoor cases. Further investigations will shift to experimental analysis of flows under laboratory conditions using state-of-the-art techniques such as flow visualization and, later, to in situ cases, which are complicated, costly, and time-consuming to study.

Keywords: fluid flow, initial region, tube coefficient, distinct shape

Procedia PDF Downloads 337
38296 Expression of PGC-1 Alpha Isoforms in Response to Eccentric and Concentric Resistance Training in Healthy Subjects

Authors: Pejman Taghibeikzadehbadr

Abstract:

Background and Aim: PGC-1 alpha is a transcriptional coactivator that was first detected in brown adipose tissue. Since its discovery, PGC-1 alpha has been known to facilitate beneficial adaptations such as mitochondrial biogenesis and increased angiogenesis in skeletal muscle following aerobic exercise. The purpose of this study was therefore to investigate the expression of PGC-1 alpha isoforms in response to eccentric and concentric resistance training in healthy subjects. Materials and Methods: Ten healthy men were randomly divided into two groups (5 in the eccentric group, 5 in the concentric group). Isokinetic contraction protocols included eccentric and concentric knee extension at maximum effort and an angular velocity of 60 degrees per second. The torques assigned to each subject were chosen to match the workload in both protocols, with a rotational speed of 60 degrees per second. Contractions consisted of a maximum of 12 sets of 10 repetitions for the right leg, with a rest time of 30 seconds between sets. At the beginning and end of the study, biopsies of the vastus lateralis muscle were performed, in both the distal and proximal directions of the lateral thigh. To evaluate the expression of the PGC1α-1 and PGC1α-4 genes, tissue analysis was performed in each group using the Real-Time PCR technique. Data were analyzed using the dependent t-test and analysis of covariance; SPSS 21 and Excel 2013 were used for data analysis. Results: Intra-group changes of PGC1α-1 after one session of activity were not significant in the eccentric (p = 0.168) or concentric (p = 0.959) group, and inter-group changes showed no difference between the two groups (p = 0.681). Intra-group changes of PGC1α-4 after one session of activity were significant in the eccentric group (p = 0.012) and the concentric group (p = 0.02), while inter-group changes showed no difference between the two groups (p = 0.362). Conclusion: It seems that for the variables showing no significant change, the exercise load was not sufficient to stimulate an increase in PGC1α-1 and PGC1α-4. With regard to the adaptation response, the results differ and need to be addressed in further research.

Keywords: eccentric contraction, concentric contraction, PGC1α-1, PGC1α-4, human subject

Procedia PDF Downloads 78
38295 Preliminary WRF SFIRE Simulations over Croatia during the Split Wildfire in July 2017

Authors: Ivana Čavlina Tomašević, Višnjica Vučetić, Maja Telišman Prtenjak, Barbara Malečić

Abstract:

The Split wildfire on the mid-Adriatic coast in July 2017 was one of the most severe wildfires in Croatian history, given its size and unexpected fire behavior, and it is used in this research as a case study for running the Weather Research and Forecasting Spread Fire (WRF SFIRE) model. This coupled fire-atmosphere model was successfully run for the first time for a Croatian wildfire case. Verification of the coupled simulations was possible thanks to a detailed reconstruction of the Split wildfire: precise information on ignition time and location, together with mapped fire progressions and spotting within the first 30 hours of the wildfire, was used both to initialize the simulations and to evaluate the model's ability to simulate the fire's propagation and final fire scar. The preliminary simulations were obtained using high-resolution vegetation and topography data for the fire area, additionally interpolated to a fire grid spacing of 33.3 m. The results demonstrated that the WRF SFIRE model is able to work with real data from Croatia and produce adequate results for forecasting fire spread. Because the model setup allows the energy fluxes between the fire and the atmosphere to be included or excluded, this capability was used to investigate possible fire-atmosphere interactions during the Split wildfire. Finally, the successfully coupled simulations provided the first numerical evidence that a wildfire from the Adriatic coast region can modify the dynamical structure of the surrounding atmosphere, which agrees with observations from the fire grounds. This study demonstrates that the WRF SFIRE model has potential for operational application in Croatia, with more accurate fire predictions achievable in the future by feeding higher-resolution input data into the model without interpolation. Possible uses for fire management in Croatia include predicting fire spread and intensity that may vary under changing weather conditions, available fuels, and topography; planning effective and safe deployment of ground and aerial firefighting forces; preventing wildland-urban interface fires; and effective planning of evacuation routes. In addition, the WRF SFIRE model results from this research show that the model is valuable for fire weather research and education, helping to better understand this hazardous phenomenon in Croatia.

Keywords: meteorology, agrometeorology, fire weather, wildfires, coupled fire-atmosphere model

Procedia PDF Downloads 89
38294 Design and Realization of Double-Delay Line Canceller (DDLC) Using FPGA

Authors: A. E. El-Henawey, A. A. El-Kouny, M. M. Abd-El-Halim

Abstract:

Moving target indication (MTI) is an anti-clutter technique that limits the display of clutter echoes, using the received radar information primarily to display moving targets only. The purpose of MTI is to discriminate moving targets from a background of clutter or slowly moving chaff particles, as shown in this paper. The processing system in these radars is massive and complex, since it must perform a great amount of processing in a very short time. In most radar applications, the response of a single canceler is not acceptable, since it does not have a wide notch in the stop-band. A double-delay canceler is an MTI delay-line canceler employing the two-delay-line configuration to improve performance by widening the clutter-rejection notches, as compared with single-delay cancelers. This canceler is also called a double canceler, dual-delay canceler, or three-pulse canceler. In this paper, a double-delay-line canceler is chosen for study due to its simplicity in both concept and implementation. We discuss the implementation of a simple digital moving target indicator (DMTI) using an FPGA, which has distinct advantages over an application-specific integrated circuit (ASIC) for the purposes of this work: the FPGA provides flexibility and stability, which are important factors in the radar application.
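
For illustration, a minimal sketch of the double-delay-line (three-pulse) canceller's pulse-domain behavior, assuming the standard form y[n] = x[n] - 2x[n-1] + x[n-2], i.e. H(z) = (1 - z^-1)^2, whose stop-band notch is wider than a single canceller's; the sample returns are hypothetical:

```python
import numpy as np

def ddlc(pulses):
    """Three-pulse canceller applied across successive radar returns."""
    x = np.asarray(pulses, dtype=float)
    return x[2:] - 2.0 * x[1:-1] + x[:-2]

# Stationary clutter (a constant return) is cancelled exactly ...
print(ddlc([5.0, 5.0, 5.0, 5.0, 5.0]))             # -> [0. 0. 0.]

# ... while a moving target's pulse-to-pulse variation survives.
fd_over_fr = 0.25                                   # Doppler / PRF ratio
n = np.arange(8)
target_returns = np.cos(2 * np.pi * fd_over_fr * n)
print(np.round(ddlc(target_returns), 3))
```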

Keywords: FPGA, MTI, double delay line canceler, Doppler shift

Procedia PDF Downloads 644