Search results for: temporal filter
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1878

678 Neural Graph Matching for Modification Similarity Applied to Electronic Document Comparison

Authors: Po-Fang Hsu, Chiching Wei

Abstract:

In this paper, we present a novel neural graph matching approach applied to document comparison. Document comparison is a common task in the legal and financial industries. In some cases, the most important differences may be the addition or omission of words, sentences, clauses, or paragraphs. However, it is a challenging task without recording or tracing the whole editing process. Under many temporal uncertainties, we explore the potential of our approach to approximate an accurate comparison and determine which element blocks are related to others by editing. First, we apply document layout analysis that combines traditional and modern techniques to segment the layout into blocks of various types. We then transform the issue into a problem of layout graph matching with textual awareness. Graph matching is a long-studied problem with a broad range of applications. However, unlike previous works focusing on visual images or structural layout, we also bring textual features into our model to adapt it to this domain. Specifically, based on the electronic document, we introduce an encoder to handle the visual presentation decoded from PDF. Additionally, because modifications can cause inconsistent layout analysis between document versions, and blocks can be merged or split, Sinkhorn divergence is adopted in our neural graph approach, which tries to overcome both these issues with many-to-many block matching. We demonstrate this on two categories of layouts, legal agreements and scientific articles, collected from our real-case datasets.
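
The many-to-many block matching via Sinkhorn normalisation mentioned above can be illustrated with a minimal sketch. The code below is a simplified illustration and not the authors' implementation; the random similarity matrix, the block counts, and the iteration settings are assumptions made for the example.

```python
import numpy as np

def sinkhorn(scores, n_iters=50, eps=0.1):
    """Alternately normalise rows and columns of a similarity matrix to
    obtain a soft many-to-many assignment between blocks."""
    K = np.exp(scores / eps)                     # temperature-scaled similarities
    for _ in range(n_iters):
        K = K / K.sum(axis=1, keepdims=True)     # normalise rows
        K = K / K.sum(axis=0, keepdims=True)     # normalise columns
    return K

# Hypothetical similarities between 3 blocks of document A and 4 blocks of document B
rng = np.random.default_rng(0)
similarity = rng.normal(size=(3, 4))
assignment = sinkhorn(similarity)
print(assignment.round(2))                       # soft matching weights per block pair
```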

Keywords: document comparison, graph matching, graph neural network, modification similarity, multi-modal

Procedia PDF Downloads 172
677 The Research on Diesel Bus Emissions in Ulaanbaatar City: Mongolia

Authors: Tsetsegmaa A., Bayarsuren B., Altantsetseg Ts.

Abstract:

To make the best decisions on reducing harmful emissions from buses, we need a clear understanding of the current state of their actual emissions. The emissions from city buses running on high-sulfur fuel, particularly particulate matter (PM) and nitrogen oxides (NOx) from the exhaust gases of conventional diesel engines, have been studied and measured with and without a diesel particulate filter (DPF) in Ulaanbaatar city. The study was conducted using PEMS (Portable Emissions Measurement System) and the gravimetric method in real traffic conditions. The obtained data were used to determine the actual emission rates and to evaluate the effectiveness of the selected particulate filters. Actual road and daily PM emissions from city buses were determined during the warm and cold seasons. A bus with an average daily mileage of 242 km was found to emit 166.155 g of PM into the city's atmosphere on average per day, with 141.3 g in summer and 175.8 g in winter. The actual PM emission of the city bus is 0.6866 g/km. The concentration of NOx in the exhaust gas averages 1410.94 ppm. The use of DPFs reduced the exhaust gas opacity of 24 buses by an average of 97% and filtered a total of 340.4 kg of soot from these buses over a period of six months. Retrofitting an old conventional diesel engine with a cassette-type silicon carbide (SiC) DPF, despite the laboriousness of cleaning, can significantly reduce particulate matter emissions. Innovation: the first comprehensive road PM and NOx emission dataset and actual road emissions from public buses have been established. Mathematical model equations for PM and NOx have been estimated as functions of the bus technical speed and engine revolutions, with and without DPF.
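
As a quick arithmetic check of the figures reported above (a sketch only; the input values are taken directly from the abstract), the average daily PM load follows from the per-kilometre emission factor and the average daily mileage:

```python
pm_per_km = 0.6866        # g/km, actual PM emission factor reported for the city bus
daily_mileage = 242       # km/day, average daily mileage reported
daily_pm = pm_per_km * daily_mileage
print(f"{daily_pm:.1f} g/day")   # ~166.2 g/day, consistent with the reported 166.155 g
```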

Keywords: conventional diesel, silicon carbide, real-time onboard measurements, particulate matter, diesel retrofit, fuel sulphur

Procedia PDF Downloads 157
676 Rural Water Supply Services in India: Developing a Composite Summary Score

Authors: Mimi Roy, Sriroop Chaudhuri

Abstract:

Sustainable water supply is among the basic needs for human development, especially in the rural areas of developing nations where safe water supply and basic sanitation infrastructure are direly needed. In light of the above, we propose a simple methodology to develop a composite water sustainability index (WSI) to assess the collective performance of the existing rural water supply services (RWSS) in India over time. The WSI will be computed by summarizing the details of all the different varieties of water supply schemes presently available in India, comprising 40 liters per capita per day (lpcd), 55 lpcd, and piped water supply (PWS) per household. The WSI will be computed annually, between 2010 and 2016, to elucidate changes in holistic RWSS performance. Results will be integrated within a robust geospatial framework to identify the ‘hotspots’ (states/districts) which have persistent issues over adequate RWSS coverage and warrant spatially optimized policy reforms in the future to address sustainable human development. The dataset will be obtained from the National Rural Drinking Water Program (NRDWP), operating under the aegis of the Ministry of Drinking Water and Sanitation (MoDWS), at state/district/block levels to offer the authorities a cross-sectional view of RWSS at different levels of the administrative hierarchy. Due to its simple design, complemented by spatio-temporal cartograms, a similar approach can also be adopted in other parts of the world where RWSS need a thorough appraisal.
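
A minimal sketch of how such a composite index can be assembled from scheme-level coverage figures is shown below. The district names, coverage values, and equal weighting are illustrative assumptions; the NRDWP data and the authors' exact aggregation rule are not reproduced here.

```python
# Hypothetical district-level coverage (%) under the three scheme types mentioned:
# 40 lpcd, 55 lpcd, and piped water supply (PWS) per household.
coverage = {
    "district_A": {"40lpcd": 85.0, "55lpcd": 60.0, "pws": 35.0},
    "district_B": {"40lpcd": 70.0, "55lpcd": 40.0, "pws": 20.0},
}
weights = {"40lpcd": 1 / 3, "55lpcd": 1 / 3, "pws": 1 / 3}   # assumed equal weights

def wsi(cov, w):
    """Weighted composite of coverage shares normalised to the 0-1 range."""
    return sum(w[k] * cov[k] / 100.0 for k in w)

for district, cov in coverage.items():
    print(district, round(wsi(cov, weights), 3))
```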

Keywords: rural water supply services, piped water supply, sustainability, composite index, spatial, drinking water

Procedia PDF Downloads 293
675 The Evaluation of Antioxidant Activity of Aloe Vera (Aloe barbadensis miller)

Authors: R. A. Akande, M. L. Mnisi

Abstract:

Introduction: Aloe vera (Aloe barbadensis Miller) flowers are carried in a large candelabra-like flower-head. Aloe barbadensis Miller has long been used as a traditional herbal medicine for the treatment of many diseases and ailments, mainly skin conditions such as sunburn, cold sores, and frostbite. It is also used as a fresh food preservative. The main objective of this study is to determine the antioxidant activity of Aloe barbadensis Miller. Methodology: The plant material (3 g) was separately extracted with 30 mL of solvents of varying polarity (methanol and ethyl acetate) (technical grade, Merck) in 50 mL polyester centrifuge tubes. The tubes were shaken for 30 minutes on a linear shaker and left overnight. The supernatant was filtered using Whatman No. 1 filter paper before being transferred into pre-weighed glass containers. The solvent was allowed to evaporate under a fan at room temperature to quantify extraction efficiency. Thin layer chromatography (TLC) plates were then prepared; a Pasteur pipette was used to spot each extract (methanol and ethyl acetate) on the TLC plates, and the plates were developed in a saturated TLC tank, dipped in a vanillin-sulphuric acid mixture and heated at 110 °C to detect the separated compounds, and dipped in DPPH in methanol to detect antioxidants. Expected contribution to knowledge: Different compounds, which interact differently with solvents of different polarities such as methanol and ethyl acetate, were observed. The yellow spots observed on the plate dipped in DPPH indicate that Aloe barbadensis Miller has antioxidant activity.

Keywords: antioxidant activity, Aloe barbadensis Miller, thin layer chromatography, DPPH

Procedia PDF Downloads 445
674 Suspended Sediment Sources Fingerprinting in Ashebeka River Catchment, Assela, Central Ethiopia

Authors: Getachew Mekaa, Bezatu Mengisteb, Tena Alamirewc

Abstract:

Ashebeka River is the main source of drinking water supply for Assela City and its surrounding inhabitants. Apart from seasonal disruptions in water reliability, the cost of treating water downstream of the river has been increasing over time due to increased pollutants and suspended sediments. Therefore, this research aimed to identify the locations of, and prioritize, suspended sediment sources in the Ashebeka River catchment using sediment fingerprinting. We collected 58 composite soil samples and river water samples for suspended sediment at the outlet, which were then filtered using Whatman filter paper. Geochemical tracers were quantified in the samples using multi-element inductively coupled plasma-optical emission spectrometry (ICP-OES). Tracers with significant p-values that passed the Kruskal-Wallis (KW) test were carried forward to stepwise discriminant function analysis (DFA). Tracers that showed good discrimination in the DFA were subsequently used in the mixing model analysis. The relative sediment source contributions from sub-catchments 3, 4, 1, and 2 (with areas of 8, 5, 5.6, and 28.4 km², respectively) were estimated as 49.31%, 26.71%, 23.65%, and 0.33%. The findings of this study will help the water utilities to prioritize areas of intervention, and the approach used could be followed for catchment prioritization in water safety plan development. Moreover, the findings of this research shed light on the integration of sediment fingerprinting into water safety plans to ensure the reliability of drinking water supplies.
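
The mixing-model step can be illustrated with a small constrained least-squares sketch: source contributions are sought that best reproduce the tracer signature of the suspended sediment, subject to non-negativity and summing to one. The tracer values and source signatures below are invented for illustration and are not the study's data.

```python
import numpy as np
from scipy.optimize import minimize

# Rows: tracers, columns: sub-catchment sources (hypothetical mean tracer concentrations)
sources = np.array([[12.0, 30.0, 22.0, 18.0],
                    [ 5.0,  9.0,  7.5,  6.0],
                    [40.0, 55.0, 48.0, 44.0]])
mixture = np.array([24.0, 7.8, 49.0])   # tracer signature of the suspended sediment sample

def objective(p):
    return np.sum((sources @ p - mixture) ** 2)

n = sources.shape[1]
res = minimize(objective, x0=np.full(n, 1 / n),
               bounds=[(0, 1)] * n,
               constraints=[{"type": "eq", "fun": lambda p: p.sum() - 1}])
print(res.x.round(3))   # estimated relative contribution of each source
```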

Keywords: disruption of drinking water reliability, ashebeka river catchment, sediment fingerprinting, sediment source contribution, mixed model

Procedia PDF Downloads 20
673 Dissection of the Impact of Diabetes Type on Heart Failure across Age Groups: A Systematic Review of Publication Patterns on PubMed

Authors: Nazanin Ahmadi Daryakenari

Abstract:

Background: Diabetes significantly influences the risk of heart failure. The interplay between distinct types of diabetes, heart failure, and their distribution across various age groups remains an area of active exploration. This study endeavors to scrutinize the age group distribution in publications addressing Type 1 and Type 2 diabetes and heart failure on PubMed, while also examining the evolving publication trends. Methods: We leveraged E-utilities and RegEx to search and extract publication data from PubMed using various MeSH terms. Subsequently, we conducted descriptive statistics and t-tests to discern the differences between the two diabetes types and the distribution across age groups. Finally, we analyzed the temporal trends of publications concerning both types of diabetes and heart failure. Results: Our findings revealed a divergence in the age group distribution between Type 1 and Type 2 diabetes within heart failure publications. Publications discussing Type 2 diabetes and heart failure were predominant among older age groups, whereas those addressing Type 1 diabetes and heart failure displayed a more balanced distribution across all age groups. The t-test revealed no significant difference in the means between the two diabetes types. However, the number of publications exploring the relationship between Type 2 diabetes and heart failure has seen a steady increase over time, suggesting escalating interest in this area. Conclusion: The dissection of publication patterns on PubMed uncovers a pronounced association between Type 2 diabetes and heart failure within older age groups. This highlights the critical need to comprehend the distinct age group differences when examining diabetes and heart failure to inform and refine targeted prevention and treatment strategies.
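
A minimal sketch of the kind of publication counting described above, using NCBI's public E-utilities esearch endpoint, is given below. The MeSH query strings and date range are illustrative assumptions, not the exact queries used in the study.

```python
import requests

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_count(term):
    """Return the number of PubMed records matching a query term."""
    params = {"db": "pubmed", "term": term, "retmode": "json"}
    r = requests.get(ESEARCH, params=params, timeout=30)
    r.raise_for_status()
    return int(r.json()["esearchresult"]["count"])

# Hypothetical queries combining diabetes type, heart failure, and a publication window
q1 = '"Diabetes Mellitus, Type 1"[Mesh] AND "Heart Failure"[Mesh] AND 2010:2023[dp]'
q2 = '"Diabetes Mellitus, Type 2"[Mesh] AND "Heart Failure"[Mesh] AND 2010:2023[dp]'
print("Type 1 + HF:", pubmed_count(q1))
print("Type 2 + HF:", pubmed_count(q2))
```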

Keywords: Type 1 diabetes, Type 2 diabetes, heart failure, age groups, publication patterns, PubMed

Procedia PDF Downloads 91
672 Multiparametric Optimization of Water Treatment Process for Thermal Power Plants

Authors: Balgaisha Mukanova, Natalya Glazyrina, Sergey Glazyrin

Abstract:

This article considers the problem of optimizing the technological process of water treatment for thermal power plants. The problem is of a multiparametric nature. To optimize the process, namely, to reduce the amount of wastewater, a new technology was developed to reuse such water. A mathematical model of the wastewater reuse technology was developed, and the optimization parameters were determined. The model consists of a material balance equation, an equation describing the kinetics of ion exchange for the non-equilibrium case, and an equation for the ion exchange isotherm. The material balance equation includes a nonlinear term that depends on the kinetics of ion exchange. A direct problem of calculating the impurity concentration at the outlet of the water treatment plant was solved numerically. The direct problem was approximated by an implicit point-to-point finite difference scheme. The inverse problem was formulated as the determination of the parameters of the mathematical model of the water treatment plant operating in non-equilibrium conditions. The formulated inverse problem was solved. Based on the calculation results, the start time of the filter regeneration process was determined, as well as the duration of the regeneration process and the amount of regeneration and wash water. Multi-parameter optimization of the water treatment process for thermal power plants allowed decreasing the amount of wastewater by 15%.
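
For orientation, a generic form of the three model components described (material balance, non-equilibrium exchange kinetics, and an isotherm) is sketched below. These are standard textbook forms written here as an assumption; the article's exact equations and notation may differ.

```latex
% Material balance along the filter bed (C: solution concentration, q: resin loading)
\frac{\partial C}{\partial t} + u\,\frac{\partial C}{\partial x}
  + \frac{1-\varepsilon}{\varepsilon}\,\frac{\partial q}{\partial t} = 0
% Non-equilibrium (linear-driving-force) kinetics of ion exchange
\frac{\partial q}{\partial t} = k\,\bigl(q^{*}(C) - q\bigr)
% Ion-exchange isotherm linking the equilibrium loading q* to C (Langmuir form)
q^{*}(C) = \frac{q_{\max}\, b\, C}{1 + b\, C}
```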

Keywords: direct problem, multiparametric optimization, optimization parameters, water treatment

Procedia PDF Downloads 385
671 Rainstorm Characteristics over the Northeastern Region of Thailand: Weather Radar Analysis

Authors: P. Intaracharoen, P. Chantraket, C. Detyothin, S. Kirtsaeng

Abstract:

Radar reflectivity data from the Phimai weather radar station of DRRAA (Department of Royal Rainmaking and Agricultural Aviation) were used to analyze rainstorm characteristics via the Thunderstorm Identification Tracking Analysis and Nowcasting (TITAN) algorithm. The Phimai weather radar station is situated in Nakhon Ratchasima province, northeastern Thailand. Data from 277 days of rainstorm events occurring from May 2016 to May 2017 were used to investigate the temporal distribution characteristics of individual convective rainclouds. The important storm properties, structures, and behaviors were analyzed using 9 variables: storm number, storm duration, storm volume, storm area, storm top, storm base, storm speed, storm orientation, and maximum storm reflectivity. The rainstorm characteristics were also examined by separating the data into two periods, the wet and dry seasons as announced by the TMD (Thai Meteorological Department), corresponding to the influence of the southwest monsoon (SWM) and the northeast monsoon (NEM). According to the results, rainstorms under the SWM influence were found to have the greatest potential over the northeastern region of Thailand. The SWM period showed a larger number of storms (404 vs. 140 per day), storm area (34.09 vs. 26.79 km²), and storm volume (95.43 vs. 66.97 km³) than the NEM period. For the storm duration, the average individual storm duration differed only slightly between the two periods (47.6 and 48.38 min), and almost all storm durations in both periods were less than 3 hours. The storm velocity did not exceed 15 km/hr (13.34 km/hr for SWM and 10.67 km/hr for NEM). For the rainstorm reflectivity, little difference was found between the wet and dry seasons (43.08 dBz for SWM and 43.72 dBz for NEM). It is assumed that rainstorms in both seasons have the same raindrop size.

Keywords: rainstorm characteristics, weather radar, TITAN, Northeastern Thailand

Procedia PDF Downloads 187
670 A Robust Spatial Feature Extraction Method for Facial Expression Recognition

Authors: H. G. C. P. Dinesh, G. Tharshini, M. P. B. Ekanayake, G. M. R. I. Godaliyadda

Abstract:

This paper presents a new spatial feature extraction method based on principal component analysis (PCA) and Fisher Discriminant Analysis (FDA) for facial expression recognition. It not only extracts reliable features for classification, but also reduces the feature space dimensions of pattern samples. In this method, first, each gray scale image is considered in its entirety as the measurement matrix. Then, principal components (PCs) of the row vectors of this matrix and the variance of these row vectors along the PCs are estimated. Therefore, this method ensures the preservation of spatial information of the facial image. Afterwards, by incorporating the spectral information of the eigen-filters derived from the PCs, a feature vector is constructed for a given image. Finally, FDA is used to define a set of basis vectors in a reduced-dimension subspace such that optimal clustering is achieved. FDA defines an inter-class scatter matrix and an intra-class scatter matrix to enhance the compactness of each cluster while maximizing the distance between cluster marginal points. In order to match the test image with the training set, cosine similarity based Bayesian classification was used. The proposed method was tested on the Cohn-Kanade database and the JAFFE database. It was observed that the proposed method, which incorporates spatial information to construct an optimal feature space, outperforms the standard PCA and FDA based methods.
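
A compact sketch of the overall pipeline, PCA-based feature extraction followed by Fisher discriminant projection and a cosine-similarity decision, is given below using scikit-learn. The random data, image size, and nearest-mean decision rule stand in for the paper's eigen-filter construction and Bayesian classifier; they are assumptions for illustration only.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 48 * 48))          # 120 hypothetical face images, flattened
y = rng.integers(0, 6, size=120)             # 6 expression classes (random labels here)

# 1) PCA keeps the principal spatial variation and reduces dimensionality
pca = PCA(n_components=40).fit(X)
X_pca = pca.transform(X)

# 2) Fisher discriminant analysis finds a subspace maximising class separation
lda = LinearDiscriminantAnalysis(n_components=5).fit(X_pca, y)
X_lda = lda.transform(X_pca)

# 3) Cosine-similarity matching against class mean vectors
def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

class_means = {c: X_lda[y == c].mean(axis=0) for c in np.unique(y)}
test = X_lda[0]
pred = max(class_means, key=lambda c: cosine(test, class_means[c]))
print("predicted class:", pred)
```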

Keywords: facial expression recognition, principal component analysis (PCA), Fisher discriminant analysis (FDA), eigen-filter, cosine similarity, Bayesian classifier, F-measure

Procedia PDF Downloads 423
669 Solar-Powered Water Purification Using Ozone and Sand Filtration

Authors: Kayla Youhanaie, Kenneth Dott, Greg Gillis-Smith

Abstract:

Access to clean water is a global challenge that affects nearly one-third of the world’s population. A lack of safe drinking water negatively affects a person’s health, safety, and economic status. However, many regions of the world that face this clean water challenge also have high solar energy potential. To address this worldwide issue and utilize available resources, a solar-powered water purification device was developed that could be implemented in communities around the world that lack access to potable water. The device uses ozone to destroy water-borne pathogens and sand filtration to filter out particulates from the water. To select the best method for this application, a quantitative energy efficiency comparison of three water purification methods was conducted: heat, UV light, and ozone. After constructing an initial prototype, the efficacy of the device was tested using agar petri dishes to test for bacteria growth in treated water samples at various time intervals after applying the device to contaminated water. The results demonstrated that the water purification device successfully removed all bacteria and particulates from the water within three minutes, making it safe for human consumption. These results, as well as the proposed design that utilizes widely available resources in target communities, suggest that the device is a sustainable solution to address the global water crisis and could improve the quality of life for millions of people worldwide.

Keywords: clean water, solar powered water purification, ozonation, sand filtration, global water crisis

Procedia PDF Downloads 70
668 Enhancing Quality Management Systems through Automated Controls and Neural Networks

Authors: Shara Toibayeva, Irbulat Utepbergenov, Lyazzat Issabekova, Aidana Bodesova

Abstract:

The article discusses the importance of quality assessment as a strategic tool in business and emphasizes the significance of the effectiveness of quality management systems (QMS) for enterprises. The evaluation of these systems takes into account the specificity of quality indicators, the multilevel nature of the system, and the need for optimal selection of the number of indicators and evaluation of the system state, which is critical for making rational management decisions. Methods and models of automated enterprise quality management are proposed, including an intelligent automated quality management system integrated with the Management Information and Control System. These systems make it possible to automate the implementation and support of QMS, increasing the validity, efficiency, and effectiveness of management decisions by automating the functions performed by decision makers and personnel. The paper also emphasizes the use of recurrent neural networks to improve automated quality management. Recurrent neural networks (RNNs) are used to analyze and process sequences of data, which is particularly useful in the context of document quality assessment and non-conformance detection in quality management systems. These networks are able to account for temporal dependencies and complex relationships between different data elements, which improves the accuracy and efficiency of automated decisions. The project was supported by a grant from the Ministry of Education and Science of the Republic of Kazakhstan under the Zhas Galym project No. AR 13268939, dedicated to research and development of digital technologies to ensure consistency of QMS regulatory documents.

Keywords: automated control system, quality management, document structure, formal language

Procedia PDF Downloads 31
667 Sterilization Incident Analysis by the Association of Litigation and Risk Management Method

Authors: Souhir Chelly, Asma Ben Cheikh, Hela Ghali, Salwa Khefacha, Lamine Dhidah, Mohamed Ben Rejeb, Houyem Said Latiri

Abstract:

The hospital risk management department is primarily involved in the methodological analysis of grade zero sterilization incidents. The system is based on a subsequent analysis process, in compliance with the ongoing requirements of the Haute Autorité de santé (HAS) for a reactive approach to risk, allowing failures to be identified and appropriate preventive and corrective measures to be initiated. The use of the association of litigation and risk management (ALARM) method makes the grade zero analysis easier and brings to light the team, institutional, organizational, temporal, and individual factors underlying undesirable events. Two main factors emerged from this analysis: the pre-disinfection step in the emergency block was poorly performed by an unsupervised instrumentalist intern, who did not remove the battery from the micro air motor; and at the sterilization unit, the worker, who was not supervised by the nurse, conditioned the motor without checking whether it still contained the battery. The main cause is inadequate management of human resources at both levels: the instrumentalist trainee in the block was not supervised by her supervisor, and the worker in the sterilization unit was not supervised by the responsible nurse. There is a lack of research support, advice, and collaboration. The difficulties encountered during this type of analysis are multiple. The first is its necessary acceptance by the various care actors involved, who should not perceive it as a tool leading to individual punishment, but rather as a means to improve their practices.

Keywords: ALARM (Association of Litigation and Risk Management Method), incident, risk management, sterilization

Procedia PDF Downloads 212
666 Impacts of Aquaculture Farms on the Mangroves Forests of Sundarbans, India (2010-2018): Temporal Changes of NDVI

Authors: Sandeep Thakur, Ismail Mondal, Phani Bhusan Ghosh, Papita Das, Tarun Kumar De

Abstract:

Sundarbans Reserve Forest of India has been undergoing major transformations in the recent past owing to population pressure and related changes. This has brought about major changes in the spatial landscape of the region, especially in the western parts. This study attempts to assess the impacts of land cover changes on the mangrove habitats. Time series imagery from Landsat was used to analyze the Normalized Difference Vegetation Index (NDVI) patterns over the western parts of the Indian Sundarbans forest in order to assess the health of the mangroves in the region. The images were subjected to land use/land cover (LULC) classification using sub-pixel classification techniques in ERDAS Imagine software, and the changes were mapped. The spatial proliferation of aquaculture farms during the study period was also mapped. A multivariate regression analysis was carried out between the obtained NDVI values and the LULC classes. Similarly, the observed meteorological data sets (time series rainfall and minimum and maximum temperature) were also statistically correlated for regression. The study demonstrated the application of NDVI in assessing the environmental status of mangroves, as the relationship between changes in the environmental variables and remote sensing based indices facilitates an efficient evaluation of environmental variables, which can be used in coastal zone monitoring and development processes.
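
As background for the index used here, NDVI is computed per pixel from the red and near-infrared reflectance bands. The small sketch below shows the calculation on synthetic arrays; band numbering and file handling for Landsat-7/8 are omitted, so treat it as an assumption-laden illustration rather than the study's processing chain.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index, computed pixel-wise."""
    nir = nir.astype("float64")
    red = red.astype("float64")
    return (nir - red) / (nir + red + 1e-10)   # small epsilon avoids division by zero

# Synthetic 3x3 reflectance patches standing in for Landsat NIR and red bands
nir = np.array([[0.45, 0.50, 0.30], [0.42, 0.48, 0.28], [0.20, 0.22, 0.25]])
red = np.array([[0.10, 0.08, 0.20], [0.12, 0.09, 0.22], [0.18, 0.19, 0.21]])
print(ndvi(nir, red).round(2))   # values near +1 indicate dense, healthy vegetation
```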

Keywords: aquaculture farms, LULC, Mangrove, NDVI

Procedia PDF Downloads 175
665 Revolutionizing Financial Forecasts: Enhancing Predictions with Graph Convolutional Networks (GCN) - Long Short-Term Memory (LSTM) Fusion

Authors: Ali Kazemi

Abstract:

In volatile and interconnected international financial markets, accurately predicting market trends holds substantial value for traders and financial institutions. Traditional machine learning strategies have made significant strides in forecasting market movements; however, the complex and networked nature of financial data calls for more sophisticated approaches. This study presents a method for financial market prediction that leverages the synergistic capability of Graph Convolutional Networks (GCNs) and Long Short-Term Memory (LSTM) networks. Our proposed algorithm is designed to forecast the trends of stock market indices and cryptocurrency prices, utilizing a comprehensive dataset spanning from January 1, 2015, to December 31, 2023. This era, marked by considerable volatility and transformation in financial markets, provides a solid basis for training and testing our predictive model. Our algorithm integrates diverse data to construct a dynamic financial graph that reflects market intricacies. We collect daily opening, closing, high, and low prices for key stock market indices (e.g., S&P 500, NASDAQ) and major cryptocurrencies (e.g., Bitcoin, Ethereum), ensuring a holistic view of market trends. Daily trading volumes are also incorporated to capture market activity and liquidity, providing critical insights into buying and selling dynamics. Furthermore, recognizing the profound influence of the macroeconomic environment on financial markets, we integrate key macroeconomic indicators, including interest rates, inflation rates, GDP growth, and unemployment rates, into our model. Our GCN algorithm learns the relational patterns among the financial instruments represented as nodes in a comprehensive market graph. Edges in this graph encode relationships based on co-movement patterns and sentiment correlations, enabling our model to capture the complex network of influences governing market movements. Complementing this, our LSTM algorithm is trained on sequences of the spatial-temporal representation produced by the GCN, enriched with historical price and volume data. This lets the LSTM capture and predict temporal market trends accurately. In the comprehensive evaluation of our GCN-LSTM algorithm across the stock market and cryptocurrency datasets, the model showed superior predictive accuracy and profitability compared to conventional and alternative machine learning benchmarks. Specifically, the model achieved a Mean Absolute Error (MAE) of 0.85%, indicating high precision in predicting day-by-day price movements. The RMSE was recorded at 1.2%, underscoring the model's effectiveness in minimizing large prediction errors, which is vital in volatile markets. Furthermore, when assessing the model's predictive performance on directional market movements, it achieved an accuracy rate of 78%, significantly outperforming the benchmark models, which averaged an accuracy of 65%. This high degree of accuracy is instrumental for strategies that predict the direction of price moves. This study showcases the efficacy of combining graph-based and sequential deep learning in financial market prediction and highlights the value of a comprehensive, data-driven evaluation framework. Our findings promise to improve investment strategies and risk management practices, offering investors and financial analysts a powerful tool to navigate the complexities of modern financial markets.
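
A minimal sketch of the GCN-plus-LSTM idea described above is given below in PyTorch. The graph size, feature dimensions, and the row-normalised adjacency are assumptions chosen for the example; the authors' actual architecture, data pipeline, and training procedure are not reproduced.

```python
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    """One graph-convolution step: normalised adjacency times node features."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        return torch.relu(self.lin((adj / deg) @ x))       # row-normalised propagation

class GCNLSTM(nn.Module):
    """Per-time-step GCN embedding followed by an LSTM over the sequence."""
    def __init__(self, n_feats, gcn_dim=16, lstm_dim=32):
        super().__init__()
        self.gcn = GCNLayer(n_feats, gcn_dim)
        self.lstm = nn.LSTM(gcn_dim, lstm_dim, batch_first=True)
        self.head = nn.Linear(lstm_dim, 1)                  # next-step return per node

    def forward(self, x_seq, adj):
        # x_seq: (time, nodes, features); adj: (nodes, nodes)
        emb = torch.stack([self.gcn(x, adj) for x in x_seq])   # (time, nodes, gcn_dim)
        out, _ = self.lstm(emb.transpose(0, 1))                # (nodes, time, lstm_dim)
        return self.head(out[:, -1])                           # prediction per node

# Hypothetical example: 5 instruments, 30 trading days, 4 features (OHLC-style)
adj = torch.eye(5) + 0.2 * torch.ones(5, 5)                    # assumed co-movement graph
x_seq = torch.randn(30, 5, 4)
model = GCNLSTM(n_feats=4)
print(model(x_seq, adj).shape)                                  # torch.Size([5, 1])
```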

Keywords: financial market prediction, graph convolutional networks (GCNs), long short-term memory (LSTM), cryptocurrency forecasting

Procedia PDF Downloads 57
664 The Use of Information and Communication Technologies in Electoral Procedures: Comments on Electronic Voting Security

Authors: Magdalena Musiał-Karg

Abstract:

The expansion of telecommunication and the progress of electronic media constitute important elements of our times. The recent worldwide convergence of information and communication technologies (ICT) and the dynamic development of the mass media are leading to noticeable changes in the functioning of contemporary states and societies. Currently, modern technologies play more and more important roles and filter down to almost every field of contemporary human life. This results in the growth of online interactions, observable in the remarkable increase in the number of people with home PCs and Internet access. Proof of this is undoubtedly the emergence and use of concepts such as e-society, e-banking, e-services, e-government, e-participation, and e-democracy. The newly coined word e-democracy shows that modern technologies have also been widely used in politics. Without any doubt, in most countries all actors of the political market (politicians, political parties, civil servants in the political/public sector, the media) use modern forms of communication with society. Most of these modern technologies improve the processes of receiving and sending information to citizens, communication with the electorate, and also (which seems to be the biggest advantage) electoral procedures. Thanks to the implementation of ICT, interactions between politicians and the electorate are improved. The main goal of this text is to analyze electronic voting (e-voting) as one of the important forms of electronic democracy in terms of security aspects. The author of this paper aims to answer questions about the security of electronic voting as an additional form of participation in elections and referenda.

Keywords: electronic democracy, electronic voting, security of e-voting, information and communication technology (ICT)

Procedia PDF Downloads 236
663 Unsupervised Segmentation Technique for Acute Leukemia Cells Using Clustering Algorithms

Authors: N. H. Harun, A. S. Abdul Nasir, M. Y. Mashor, R. Hassan

Abstract:

Leukaemia is a blood cancer that contributes to the increasing mortality rate in Malaysia each year. There are two main categories of leukaemia, acute and chronic leukaemia. The production and development of acute leukaemia cells occur rapidly and uncontrollably. Therefore, if the identification of acute leukaemia cells could be done quickly and effectively, proper treatment and medicine could be delivered. Due to the requirement of prompt and accurate diagnosis of leukaemia, the current study has proposed unsupervised pixel segmentation based on clustering algorithms in order to obtain a fully segmented abnormal white blood cell (blast) in acute leukaemia images. In order to obtain the segmented blast, three clustering algorithms, k-means, fuzzy c-means, and moving k-means, were applied to the saturation component image. Then, a median filter and a seeded region growing area extraction algorithm were applied to smooth the region of the segmented blast and to remove large unwanted regions from the image, respectively. Comparisons among the three clustering algorithms are made in order to measure the performance of each clustering algorithm on segmenting the blast area. Based on the good sensitivity values obtained, the results indicate that the moving k-means clustering algorithm successfully produced the fully segmented blast region in acute leukaemia images. This indicates that the resultant images could be helpful to haematologists for further analysis of acute leukaemia.
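
A minimal sketch of the core segmentation step, clustering the saturation channel and smoothing the result with a median filter, is shown below using scikit-learn, scikit-image, and SciPy. The synthetic image, the number of clusters, and the filter size are assumptions; moving k-means, fuzzy c-means, and the seeded region growing step are not reproduced here.

```python
import numpy as np
from skimage.color import rgb2hsv
from sklearn.cluster import KMeans
from scipy.ndimage import median_filter

rng = np.random.default_rng(0)
image = rng.random((64, 64, 3))                 # synthetic RGB stand-in for a blood-smear image

saturation = rgb2hsv(image)[..., 1]             # saturation component, as used in the study
pixels = saturation.reshape(-1, 1)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(pixels)
labels = kmeans.labels_.reshape(saturation.shape)

# Assume the cluster with the highest mean saturation corresponds to the blast region
blast_cluster = np.argmax([saturation[labels == k].mean() for k in range(3)])
blast_mask = (labels == blast_cluster)

smoothed = median_filter(blast_mask.astype(np.uint8), size=5)   # remove salt-and-pepper noise
print("blast pixels:", int(smoothed.sum()))
```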

Keywords: acute leukaemia images, clustering algorithms, image segmentation, moving k-means

Procedia PDF Downloads 286
662 A Neurofeedback Learning Model Using Time-Frequency Analysis for Volleyball Performance Enhancement

Authors: Hamed Yousefi, Farnaz Mohammadi, Niloufar Mirian, Navid Amini

Abstract:

Investigating the capacities of visual functions where adapted mechanisms can enhance the capability of sports trainees is a promising area of research, not only from the cognitive viewpoint but also in terms of unlimited applications in sports training. In this paper, the visual evoked potential (VEP) and event-related potential (ERP) signals of amateur and trained volleyball players in a pilot study were processed. Two groups of amateur and trained subjects were asked to imagine themselves in the state of receiving a ball while they were shown a simulated volleyball field. The proposed method is based on a set of time-frequency features extracted from VEP signals using algorithms such as the Gabor filter, the continuous wavelet transform, and a multi-stage wavelet decomposition, which can be indicative of amateur or trained status. The linear discriminant classifier achieves an accuracy, sensitivity, and specificity of 100% when the average of the repetitions of the signal corresponding to the task is used. The main purpose of this study is to investigate the feasibility of a fast, robust, and reliable feature/model determination as a neurofeedback parameter to be utilized for improving volleyball players' performance. The proposed measure has potential applications in brain-computer interface technology, where a real-time biomarker is needed.
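
A small sketch of this kind of time-frequency feature extraction plus linear discriminant classification is given below, using SciPy's short-time Fourier transform and scikit-learn's LDA. The synthetic signals, sampling rate, window length, and band-power features are assumptions for illustration; the Gabor and wavelet stages of the paper are not reproduced.

```python
import numpy as np
from scipy.signal import stft
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
fs = 250                                      # assumed EEG sampling rate (Hz)

def features(sig):
    """Mean log band power per frequency bin from a short-time Fourier transform."""
    _, _, Z = stft(sig, fs=fs, nperseg=64)
    power = np.abs(Z) ** 2
    return np.log(power.mean(axis=1) + 1e-12)  # average over time windows

def epoch(strong):
    """Synthetic VEP-like epoch: one class gets a stronger 10 Hz component."""
    t = np.arange(fs) / fs
    amp = 1.5 if strong else 1.0
    return amp * np.sin(2 * np.pi * 10 * t) + rng.normal(scale=1.0, size=fs)

X = np.array([features(epoch(i % 2 == 0)) for i in range(60)])
y = np.array([i % 2 for i in range(60)])      # 0 = "trained", 1 = "amateur" (labels assumed)

clf = LinearDiscriminantAnalysis()
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(2))
```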

Keywords: visual evoked potential, time-frequency feature extraction, short-time Fourier transform, event-related spectrum potential classification, linear discriminant analysis

Procedia PDF Downloads 132
661 FACTS Based Stabilization for Smart Grid Applications

Authors: Adel. M. Sharaf, Foad H. Gandoman

Abstract:

Nowadays, photovoltaic (PV) farms/parks and large PV-smart grid interface schemes are emerging and commonly utilized in renewable-energy distributed generation. However, PV hybrid DC-AC schemes using interface power electronic converters usually have a negative impact on power quality and on the stabilization of modern electrical networks under load excursions and network fault conditions in the smart grid. Consequently, robust FACTS-based interface schemes are required to ensure efficient energy utilization and stabilization of bus voltages, as well as to limit switching/fault inrush current conditions. FACTS devices are also used in smart grid battery interface and storage schemes with PV-battery storage hybrid systems, as an elegant alternative for renewable energy utilization with backup battery storage, supporting electric utility energy and demand-side management to provide the needed energy and power capacity under heavy load conditions. The paper presents a robust PV-Li-Ion battery storage interface scheme for low-voltage distribution/utilization using FACTS stabilization enhancement and dynamic maximum PV power tracking controllers. Digital simulation and validation of the proposed scheme are done using the MATLAB/Simulink software environment for a low-voltage distribution/utilization system feeding a hybrid linear, motorized-inrush, and nonlinear load from a DC-AC interface VSC 6-pulse inverter fed from the PV park/farm with a backup Li-Ion storage battery.

Keywords: AC FACTS, smart grid, stabilization, PV-battery storage, Switched Filter-Compensation (SFC)

Procedia PDF Downloads 407
660 Micro-Rest: Extremely Short Breaks in Post-Learning Interference Support Memory Retention over the Long Term

Authors: R. Marhenke, M. Martini

Abstract:

The distraction of attentional resources after learning hinders long-term memory consolidation compared to several minutes of post-encoding inactivity in the form of wakeful resting. We tested whether an 8-minute period of wakeful resting, compared to performing an adapted version of the d2 test of attention after learning, supports memory retention. Participants encoded and immediately recalled a word list, followed by either an 8-minute period of wakeful resting (eyes closed, relaxed) or by performing an adapted version of the d2 test of attention (scanning and selecting specific characters while ignoring others). At the end of the experimental session (after 12-24 min) and again after 7 days, participants were required to complete a surprise free recall test of both word lists. Our results showed no significant difference in memory retention between the experimental conditions. However, we found that participants who completed the first lines of the d2 test in less than the given time limit of 20 seconds, and thus had short unfilled intervals before switching to the next test line, remembered more words over the 12-24 minute and the 7-day retention intervals than participants who did not complete the first lines. This interaction occurred only for the first test lines, with the highest temporal proximity to the encoding task, and not for later test lines. Differences in retention scores between groups (completed first line vs. did not complete) seem to be widely independent of general performance in the d2 test. Implications and limitations of these exploratory findings are discussed.

Keywords: long-term memory, retroactive interference, attention, forgetting

Procedia PDF Downloads 128
659 Extracting Terrain Points from Airborne Laser Scanning Data in Densely Forested Areas

Authors: Ziad Abdeldayem, Jakub Markiewicz, Kunal Kansara, Laura Edwards

Abstract:

Airborne Laser Scanning (ALS) is one of the main technologies for generating high-resolution digital terrain models (DTMs). DTMs are crucial to several applications, such as topographic mapping, flood zone delineation, geographic information systems (GIS), hydrological modelling, spatial analysis, etc. The laser scanning system generates an irregularly spaced three-dimensional point cloud. Raw ALS data consist mainly of ground points (that represent the bare earth) and non-ground points (that represent buildings, trees, cars, etc.). Removing all the non-ground points from the raw data is referred to as filtering. Filtering heavily forested areas is considered a difficult and challenging task, as the canopy stops laser pulses from reaching the terrain surface. This research presents an approach for removing non-ground points from raw ALS data in densely forested areas. Smoothing splines are exploited to interpolate and fit the noisy ALS data. The presented filter utilizes a weight function to allocate weights to each point of the data. Furthermore, unlike most methods, the presented filtering algorithm is designed to be automatic. Three different forested areas in the United Kingdom are used to assess the performance of the algorithm. The results show that the DTMs generated from the filtered data are accurate (when compared against reference terrain data) and that the performance of the method is stable for all the heavily forested data samples. The average root mean square error (RMSE) value is 0.35 m.
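
To illustrate the spline idea in one dimension (the actual method works on the full 3-D point cloud), the sketch below fits a smoothing spline to a noisy elevation profile and flags points far above the fitted surface as non-ground. The synthetic profile, smoothing factor, and threshold are assumptions, not the authors' parameters.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(3)
x = np.linspace(0, 100, 400)                          # along-track distance (m)
ground = 0.05 * x + 2 * np.sin(x / 15)                # smooth terrain profile
z = ground + rng.normal(scale=0.1, size=x.size)       # ground returns with noise
canopy = rng.choice(x.size, size=80, replace=False)   # some returns hit the canopy
z[canopy] += rng.uniform(5, 20, size=80)

spline = UnivariateSpline(x, z, s=len(x) * 4.0)        # heavily smoothed fit
residual = z - spline(x)
ground_mask = residual < 1.0                           # points near/below the fit = ground

print("kept as ground:", int(ground_mask.sum()), "of", x.size)
```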

Keywords: airborne laser scanning, digital terrain models, filtering, forested areas

Procedia PDF Downloads 135
658 Large Eddy Simulation with Energy-Conserving Schemes: Understanding Wind Farm Aerodynamics

Authors: Dhruv Mehta, Alexander van Zuijlen, Hester Bijl

Abstract:

Large Eddy Simulation (LES) numerically resolves the large energy-containing eddies of a turbulent flow, while modelling the small dissipative eddies. On a wind farm, these large scales carry the energy that wind turbines extract and are also responsible for transporting the turbines' wakes, which may interact with downstream turbines and certainly with the atmospheric boundary layer (ABL). In this situation, it is important to conserve the energy that these wakes carry, which could be altered artificially through the numerical dissipation brought about by the schemes used for spatial discretisation and temporal integration. Numerical dissipation has been reported to cause the premature recovery of turbine wakes, leading to an overprediction of the power produced by wind farms. An energy-conserving scheme is free from numerical dissipation and ensures that the energy of the wakes is increased or decreased only by the action of molecular viscosity or the action of wind turbines (body forces). The aim is to create an LES package with energy-conserving schemes to simulate wind turbine wakes correctly, to gain insight into power production, wake meandering, etc. Such knowledge will be useful in designing more efficient wind farms with minimal wake interaction, which, if unchecked, could lead to major losses in energy production per unit area of the wind farm. For their research, the authors intend to use the Energy-Conserving Navier-Stokes code developed by the Energy Research Centre of the Netherlands.

Keywords: energy-conserving schemes, modelling turbulence, Large Eddy Simulation, atmospheric boundary layer

Procedia PDF Downloads 463
657 Seawater Intrusion in the Coastal Aquifer of Wadi Nador (Algeria)

Authors: Abdelkader Hachemi & Boualem Remini

Abstract:

Seawater intrusion is a significant challenge faced by coastal aquifers in the Mediterranean basin. This study aims to determine the position of the sharp interface between seawater and freshwater in the aquifer of Wadi Nador, located in the Wilaya of Tipaza, Algeria. A numerical areal sharp interface model using the finite element method is developed to investigate the spatial and temporal behavior of seawater intrusion. The aquifer is assumed to be homogeneous and isotropic. The simulation results are compared with geophysical prospection data obtained through electrical methods in 2011 to validate the model. The simulation results demonstrate a good agreement with the geophysical prospection data, confirming the accuracy of the sharp interface model. The position of the sharp interface in the aquifer is found to be approximately 1617 meters from the sea. Two scenarios are proposed to predict the interface position for the year 2024: one without pumping and the other with pumping. The results indicate a noticeable retreat of the sharp interface position in the first scenario, while a slight decline is observed in the second scenario. The findings of this study provide valuable insights into the dynamics of seawater intrusion in the Wadi Nador aquifer. The predicted changes in the sharp interface position highlight the potential impact of pumping activities on the aquifer's vulnerability to seawater intrusion. This study emphasizes the importance of implementing measures to manage and mitigate seawater intrusion in coastal aquifers. The sharp interface model developed in this research can serve as a valuable tool for assessing and monitoring the vulnerability of aquifers to seawater intrusion.

Keywords: seawater, intrusion, sharp interface, Algeria

Procedia PDF Downloads 70
656 The BNCT Project Using the Cf-252 Source: Monte Carlo Simulations

Authors: Marta Błażkiewicz-Mazurek, Adam Konefał

Abstract:

The project can be divided into three main parts: i. modeling the Cf-252 neutron source and conducting an experiment to verify the correctness of the obtained results, ii. design of the BNCT system infrastructure, iii. analysis of the results from the logical detector. Modeling of the Cf-252 source included designing the shape and size of the source as well as the energy and spatial distribution of emitted neutrons. Two options were considered: a point source and a cylindrical spatial source. The energy distribution corresponded to various spectra taken from specialized literature. Directionally isotropic neutron emission was simulated. The simulation results were compared with experimental values determined using the activation detector method using indium foils and cadmium shields. The relative fluence rate of thermal and resonance neutrons was compared in the chosen places in the vicinity of the source. The second part of the project related to the modeling of the BNCT infrastructure consisted of developing a simulation program taking into account all the essential components of this system. Materials with moderating, absorbing, and backscattering properties of neutrons were adopted into the project. Additionally, a gamma radiation filter was introduced into the beam output system. The analysis of the simulation results obtained using a logical detector located at the beam exit from the BNCT infrastructure included neutron energy and their spatial distribution. Optimization of the system involved changing the size and materials of the system to obtain a suitable collimated beam of thermal neutrons.

Keywords: BNCT, Monte Carlo, neutrons, simulation, modeling

Procedia PDF Downloads 25
655 A Research on the Coordinated Development of Chengdu-Chongqing Economic Circle under the Background of New Urbanization

Authors: Deng Tingting

Abstract:

The coordinated and integrated development of regions is an inevitable requirement for China to move towards high-quality, sustainable development. The Chengdu-Chongqing economic circle, one of the regions with the best economic foundation and the strongest economic strength in western China, is a typical area of national importance with strong network connection characteristics, linking the inland hinterland and connecting the western and national urban networks. The integrated development of the Chengdu-Chongqing economic circle is of great strategic significance for the rapid and high-quality development of the western region. In the context of new urbanization, this paper takes 16 urban units within the economic circle as the research object and, based on 5-year panel data on population, regional economy, and spatial construction and development from 2016 to 2020, uses the entropy method and the Theil index to analyze the three target layers and the causes of the observed differences. The research shows that there are temporal and spatial differences within the Chengdu-Chongqing economic circle, and there are significant differences between the core cities and the surrounding cities. Therefore, reforming and innovating the regional coordinated development mechanism, breaking administrative barriers, and strengthening the "polar nucleus" radiation function to release the driving force for economic development, especially in the gully areas of the economic development belts, would not only promote the coordinated development of internal regions but also promote the coordinated and sustainable development of the western region and support a high-quality development path.
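
For reference, the Theil index used in the analysis measures inequality across units as the mean of (x_i/mu) ln(x_i/mu), where mu is the mean of the values. The sketch below computes it for a hypothetical set of per-capita GDP figures; the numbers are invented and do not come from the study.

```python
import numpy as np

def theil_index(x):
    """Theil T index: mean of (x_i / mean) * ln(x_i / mean); 0 means perfect equality."""
    x = np.asarray(x, dtype=float)
    ratio = x / x.mean()
    return float(np.mean(ratio * np.log(ratio)))

# Hypothetical per-capita GDP (thousand yuan) for a core city and surrounding cities
gdp_per_capita = [95, 40, 35, 33, 30, 28, 27, 25]
print(round(theil_index(gdp_per_capita), 4))
```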

Keywords: Chengdu-Chongqing economic circle, new urbanization, coordinated regional development, Theil Index

Procedia PDF Downloads 115
654 Infrastructure Change Monitoring Using Multitemporal Multispectral Satellite Images

Authors: U. Datta

Abstract:

The main objective of this study is to find a suitable approach to monitor land infrastructure growth over a period of time using multispectral satellite images. Bi-temporal change detection methods are unable to indicate the continuous change occurring over a long period of time. To achieve this objective, the approach used here estimates a statistical model from a series of multispectral image data over a long period of time, assuming there is no considerable change during that time period, and then compares it with the multispectral image data obtained at a later time. The change is estimated pixel-wise. A statistical composite hypothesis technique is used for pixel-based change detection in a defined region. The generalized likelihood ratio test (GLRT) is used to detect a changed pixel from the probabilistic estimated model of the corresponding pixel. The changed pixel is detected assuming that the images have been co-registered prior to estimation. To minimize error due to co-registration, the 8-neighborhood pixels around the pixel under test are also considered. Multispectral images from Sentinel-2 and Landsat-8 from 2015 to 2018 are used for this purpose. There are different challenges in this method. The first and foremost challenge is to obtain a sufficiently large number of datasets for multivariate distribution modelling. A large number of images are always discarded due to cloud coverage. Due to imperfect modelling, there will be a high probability of false alarms. The overall conclusion that can be drawn from this work is that the probabilistic method described in this paper has given some promising results, which need to be pursued further.
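
A simplified per-pixel sketch of the idea, estimating a multivariate Gaussian from the undisturbed time series and testing a later observation with a likelihood-ratio-type statistic, is given below. Under a Gaussian model with fixed covariance, the GLRT reduces to a Mahalanobis-distance test against a chi-squared threshold; the synthetic spectra, band count, and significance level are assumptions for illustration.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(4)
n_bands, n_dates = 4, 20

# Reflectance time series for one pixel, assumed unchanged over the reference period
reference = rng.multivariate_normal(mean=[0.2, 0.3, 0.25, 0.4],
                                    cov=0.0004 * np.eye(n_bands), size=n_dates)

mu = reference.mean(axis=0)
cov = np.cov(reference, rowvar=False)

def changed(pixel, alpha=0.01):
    """Flag change if the Mahalanobis distance exceeds the chi-squared threshold."""
    d = pixel - mu
    stat = float(d @ np.linalg.inv(cov) @ d)
    return stat > chi2.ppf(1 - alpha, df=n_bands)

later_same = rng.multivariate_normal(mu, cov)             # no change expected
later_built = mu + np.array([0.15, 0.10, 0.12, -0.05])    # hypothetical new construction
print(changed(later_same), changed(later_built))          # typically: False True
```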

Keywords: co-registration, GLRT, infrastructure growth, multispectral, multitemporal, pixel-based change detection

Procedia PDF Downloads 130
653 The Evaluation of Gravity Anomalies Based on Global Models by Land Gravity Data

Authors: M. Yilmaz, I. Yilmaz, M. Uysal

Abstract:

The Earth system generates different phenomena that are observable at the surface of the Earth, such as mass deformations and displacements leading to plate tectonics, earthquakes, and volcanism. The dynamic processes associated with the interior, surface, and atmosphere of the Earth affect the three pillars of geodesy: the shape of the Earth, its gravity field, and its rotation. Geodesy establishes a characteristic structure in order to define, monitor, and predict the behaviour of the whole Earth system. The traditional and new instruments, observables, and techniques in geodesy are related to the gravity field. Therefore, geodesy monitors the gravity field and its temporal variability in order to transform the geodetic observations made on the physical surface of the Earth onto the geometrical surface on which positions are mathematically defined. In this paper, the main components of gravity field modeling, the free-air and Bouguer gravity anomalies, are calculated via recent global models (EGM2008, EIGEN6C4, and GECO) over a selected study area. The model-based gravity anomalies are compared with the corresponding terrestrial gravity data in terms of standard deviation (SD) and root mean square error (RMSE) in order to determine the best-fitting global model for the study area at a regional scale in Turkey. The lowest SD (13.63 mGal) and RMSE (15.71 mGal) were obtained by EGM2008 for the free-air gravity anomaly residuals. For the Bouguer gravity anomaly residuals, EIGEN6C4 provides the lowest SD (8.05 mGal) and RMSE (8.12 mGal). The results indicated that EIGEN6C4 can be a useful tool for modeling the gravity field of the Earth over the study area.

Keywords: free-air gravity anomaly, Bouguer gravity anomaly, global model, land gravity

Procedia PDF Downloads 165
652 Establishment of a Nomogram Prediction Model for Postpartum Hemorrhage during Vaginal Delivery

Authors: Yinglisong, Jingge Chen, Jingxuan Chen, Yan Wang, Hui Huang, Jing Zhnag, Qianqian Zhang, Zhenzhen Zhang, Ji Zhang

Abstract:

Purpose: The study aims to establish a nomogram prediction model for postpartum hemorrhage (PPH) in vaginal delivery. Patients and Methods: Clinical data were retrospectively collected from vaginal delivery patients admitted to a hospital in Zhengzhou, China, from June 1, 2022 to October 31, 2022. Univariate and multivariate logistic regression were used to identify independent risk factors. A nomogram model for PPH in vaginal delivery was established based on the risk factor coefficients. Bootstrapping was used for internal validation. To assess discrimination and calibration, receiver operating characteristic (ROC) and calibration curves were generated in the derivation and validation groups. Results: A total of 1340 cases of vaginal delivery were enrolled, with 81 (6.04%) having PPH. Logistic regression indicated that history of uterine surgery, induction of labor, duration of the first stage of labor, neonatal weight, WBC value (during the first stage of labor), and cervical lacerations were all independent risk factors for hemorrhage (P < 0.05). The areas under the curve (AUC) of the ROC curves of the derivation group and the validation group were 0.817 and 0.821, respectively, indicating good discrimination. The two calibration curves showed that the nomogram predictions and observed results were highly consistent (P = 0.105, P = 0.113). Conclusion: The developed individualized risk prediction nomogram model can assist midwives in recognizing and diagnosing high-risk groups for PPH and initiating early warning to reduce PPH incidence.
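
A minimal sketch of the modelling workflow behind such a nomogram, multivariable logistic regression followed by ROC-based discrimination assessment, is shown below with scikit-learn. The simulated predictors and coefficients are invented for the example; the hospital data and the published nomogram itself are not reproduced.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
n = 1340
X = np.column_stack([
    rng.integers(0, 2, n),          # history of uterine surgery (0/1)
    rng.integers(0, 2, n),          # induction of labor (0/1)
    rng.normal(8, 3, n),            # duration of first stage of labor (h)
    rng.normal(3.2, 0.5, n),        # neonatal weight (kg)
    rng.normal(10, 2, n),           # WBC during first stage (10^9/L)
    rng.integers(0, 2, n),          # cervical laceration (0/1)
])
# Simulated outcome: risk increases with the assumed predictors (coefficients invented)
logit = (-6 + 0.8 * X[:, 0] + 0.5 * X[:, 1] + 0.15 * X[:, 2]
         + 0.6 * X[:, 3] + 0.1 * X[:, 4] + 1.0 * X[:, 5])
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print("coefficients:", model.coef_.round(2), "AUC:", round(auc, 3))
```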

Keywords: vaginal delivery, postpartum hemorrhage, risk factor, nomogram

Procedia PDF Downloads 71
651 Automatic Staging and Subtype Determination for Non-Small Cell Lung Carcinoma Using PET Image Texture Analysis

Authors: Seyhan Karaçavuş, Bülent Yılmaz, Ömer Kayaaltı, Semra İçer, Arzu Taşdemir, Oğuzhan Ayyıldız, Kübra Eset, Eser Kaya

Abstract:

In this study, our goal was to perform tumor staging and subtype determination automatically using different texture analysis approaches for a very common cancer type, i.e., non-small cell lung carcinoma (NSCLC). In particular, we introduced a texture analysis approach, Laws' texture filters, to be used in this context for the first time. The 18F-FDG PET images of 42 patients with NSCLC were evaluated. The number of patients for each tumor stage, i.e., I-II, III, or IV, was 14. The patients had ~45% adenocarcinoma (ADC) and ~55% squamous cell carcinoma (SqCC). The MATLAB technical computing language was employed in the extraction of 51 features using first order statistics (FOS), the gray-level co-occurrence matrix (GLCM), the gray-level run-length matrix (GLRLM), and Laws' texture filters. The feature selection method employed was sequential forward selection (SFS). Selected textural features were used in automatic classification by k-nearest neighbors (k-NN) and support vector machines (SVM). In the automatic classification of tumor stage, the accuracy was approximately 59.5% with the k-NN classifier (k = 3) and 69% with SVM (one-versus-one paradigm), using 5 features. In the automatic classification of tumor subtype, the accuracy was around 92.7% with the one-versus-one SVM. Texture analysis of FDG-PET images might be used, in addition to metabolic parameters, as an objective tool to assess tumor histopathological characteristics and for automatic classification of tumor stage and subtype.
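
A small sketch of one of the texture families mentioned (GLCM features) feeding an SVM classifier is given below, using scikit-image and scikit-learn. The synthetic patches, the quantisation to 32 grey levels, and the two-class setup (ADC vs. SqCC) are assumptions for illustration; Laws' filters, FOS, GLRLM, and the SFS feature selection are not reproduced. Note that older scikit-image versions name the functions greycomatrix/greycoprops.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)

def glcm_features(patch, levels=32):
    """Contrast, homogeneity, energy, and correlation from a grey-level co-occurrence matrix."""
    q = (patch * (levels - 1)).astype(np.uint8)           # quantise to 'levels' grey values
    glcm = graycomatrix(q, distances=[1], angles=[0, np.pi / 2],
                        levels=levels, symmetric=True, normed=True)
    return np.hstack([graycoprops(glcm, p).ravel()
                      for p in ("contrast", "homogeneity", "energy", "correlation")])

def patch(smooth):
    """Synthetic tumour patch: one class smoother, the other more heterogeneous."""
    base = rng.random((32, 32))
    return 0.5 * base + 0.5 * base.mean() if smooth else base

X = np.array([glcm_features(patch(i % 2 == 0)) for i in range(40)])
y = np.array([i % 2 for i in range(40)])                   # 0 = "ADC", 1 = "SqCC" (labels assumed)

svm = SVC(kernel="rbf", gamma="scale")
print("CV accuracy:", cross_val_score(svm, X, y, cv=5).mean().round(2))
```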

Keywords: cancer stage, cancer cell type, non-small cell lung carcinoma, PET, texture analysis

Procedia PDF Downloads 322
650 Non Linear Stability of Non Newtonian Thin Liquid Film Flowing down an Incline

Authors: Lamia Bourdache, Amar Djema

Abstract:

The effect of the non-Newtonian property (power-law index n) on traveling waves of a thin layer of power-law fluid flowing over an inclined plane is investigated. For this, a simplified second-order two-equation model (SM) is used; the complete model is a second-order four-equation model (CM). It is derived by combining the weighted residual integral method and lubrication theory. This is justified by the fact that, at the onset of the instability, only a very small number of waves is observed. Using a suitable set of test functions, second-order terms are eliminated from the calculation so that the model remains accurate to the second-order approximation. Linear, spatial, and temporal stabilities are studied. For travelling waves, a particular type of wave form is studied that is steady in a moving frame, i.e., that travels at a constant celerity without changing its shape. This type of solution, characterized by its celerity, exists under suitable conditions, when the widening due to dispersion is balanced exactly by the narrowing effect due to the nonlinearity. Varying the celerity parameter over some range allows exploring the entire spectrum of asymptotic behavior of these traveling waves. The SM model is converted into a three-dimensional dynamical system. The result is that the model exhibits bifurcation scenarios such as heteroclinic, homoclinic, Hopf, and period-doubling bifurcations for different values of the power-law index n. The influence of the non-Newtonian parameter on the nonlinear development of these travelling waves is discussed. Finally, it is found that the qualitative character of the bifurcation scenarios is insensitive to variation of the power-law index.
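
For readers unfamiliar with the parameter n, the constitutive relation of a power-law (Ostwald-de Waele) fluid, written below as a standard reference form rather than the paper's own equations, relates shear stress to shear rate as:

```latex
\tau = K\,\dot{\gamma}^{\,n},
\qquad
\mu_{\mathrm{eff}} = K\,\dot{\gamma}^{\,n-1}
```

where K is the consistency index; n < 1 gives shear-thinning behaviour, n = 1 recovers the Newtonian case, and n > 1 gives shear-thickening behaviour.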

Keywords: inclined plane, nonlinear stability, non-Newtonian, thin film

Procedia PDF Downloads 282
649 Spatio-Temporal Analysis of Land Use and Land Cover Change in the Cocoa Belt of Ondo State, southwestern Nigeria

Authors: Emmanuel Dada, Adebayo-Victoria Tobi Dada

Abstract:

The study evaluates land use and land cover changes in the cocoa belt of Ondo State to quantify their effect on the expanse of land occupied by cocoa plantations in the most suitable region for raising cocoa in Nigeria. Time series of satellite imagery from Landsat-7 ETM+ and Landsat-8 TIRS covering the years 2000 and 2015, respectively, were used. The study area was classified into six land use themes: cocoa plantation, settlement, water body, light forest and grassland, forest, and bare surface and rock outcrop. The analyses revealed that, out of the total land area of 997714 hectares of the study area, cocoa plantation land use increased by 10.3% in 2015 from 312260.6 ha in 2000. Forest land use also increased by 6.3% in 2015 from 152144.1 ha in 2000; water body decreased by 0.1% in 2015 from 2954.5 ha in 2000; settlement land use increased by 3% in 2015 from 15194.6 ha in 2000; light forest and grassland decreased by 10.4% between 2000 and 2015; and bare surface and rock outcrop land use decreased by 9.1% between 2000 and 2015. The reasons for the different ranges of change observed in land use and land cover in the study area could be the increase in incentives to cocoa farmers from both government and non-governmental organizations, the development of new cocoa breeds that thrive better in the light forest, the rapid increase in the population of cocoa farmers' settlements, and the government's promulgation of forest reserve laws.

Keywords: satellite imagery, land use and land cover change, area of land

Procedia PDF Downloads 227