Search results for: classification of matter
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3877

2857 Developing Serious Games to Increase Children’s Knowledge of Diet and Nutrition

Authors: N. Liu, N. Tuah, D. Ying

Abstract:

This research aims to identify and test whether serious games can help children learn and adopt healthy eating habits. The practical component digitalizes an existing educational board game on nutrition called “All You Can Eat” (AYCE), adding online playability to widen its availability and accessibility beyond the physical version. The digital game is deployed alongside theoretical research, with teachers guiding children through play. The methodology comprises two experiments, including surveys gathering feedback from both partners and students, carried out in several countries, namely Brunei, Malaysia, and Taiwan. The results are used to validate the concept of “serious games” tied to the health of the players, in this case children. The research outcomes can be applied to serious games on health topics more broadly, not only healthy eating habits, adopting a balanced combination of practical and theoretical considerations. The study will also help other researchers in the related fields of serious game development and pediatrics to better understand the needs of children. On the theoretical side, the findings can enable further technological advances, such as additional serious games, to provide appropriate social support on health-related issues. Not just individuals but whole communities could benefit from improved health and well-being as a result of the project, which, done right, can improve their quality of life while letting them have fun. This research case demonstrates that AYCE can support a wide range of health issues.

Keywords: cultural heritage, digital games, digitalization, traditional religious culture

Procedia PDF Downloads 75
2856 Multi-Temporal Urban Land Cover Mapping Using Spectral Indices

Authors: Mst Ilme Faridatul, Bo Wu

Abstract:

Multi-temporal urban land cover mapping is of paramount importance for monitoring urban sprawl and managing the ecological environment. Because urban activities are so diversified, mapping land covers in a complex urban environment is challenging. Spectral indices have proved effective for mapping urban land covers. To improve multi-temporal urban land cover classification and mapping, we evaluate the performance of three spectral indices: the modified normalized difference bare-land index (MNDBI), the tasseled cap water and vegetation index (TCWVI), and the shadow index (ShDI). The MNDBI is developed to enhance urban impervious areas by separating bare lands. The TCWVI, a tasseled cap index, is developed to detect vegetation and water simultaneously. The ShDI is developed to maximize the spectral difference between the shadows of skyscrapers and water, and thus to enhance water detection. First, this paper presents a comparative analysis of the three spectral indices using Landsat Enhanced Thematic Mapper (ETM), Thematic Mapper (TM), and Operational Land Imager (OLI) data. Second, optimized thresholds of the spectral indices are applied to classify land covers, and finally, their performance in enhancing multi-temporal urban land cover mapping is assessed. The results indicate that the spectral indices are competent to enhance multi-temporal urban land cover mapping, achieving an overall classification accuracy of 93-96%.
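The exact band formulations of MNDBI, TCWVI, and ShDI are defined in the full paper; purely as an illustration of how a normalized-difference spectral index is computed and thresholded for classification, here is a minimal sketch (the band choices, reflectance values, and the 0.2 threshold are hypothetical, not the paper's optimized values):

```python
import numpy as np

def normalized_difference(band_a, band_b):
    """Generic normalized-difference index: (A - B) / (A + B)."""
    a = band_a.astype(float)
    b = band_b.astype(float)
    return (a - b) / (a + b + 1e-10)  # epsilon guards against division by zero

# Hypothetical Landsat reflectance bands (SWIR1 and NIR) as tiny 2x2 rasters:
# bare land reflects more strongly in SWIR1, vegetation in NIR.
swir1 = np.array([[0.30, 0.05], [0.28, 0.07]])
nir   = np.array([[0.15, 0.40], [0.14, 0.35]])

index = normalized_difference(swir1, nir)

# An optimized threshold (illustrative value here) turns the continuous
# index into a classified map: pixels above it are labeled bare land.
bare_mask = index > 0.2
print(bare_mask)
```

In the paper's workflow this thresholding step is repeated per index and per acquisition date, which is what makes the approach multi-temporal.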

Keywords: land cover, mapping, multi-temporal, spectral indices

Procedia PDF Downloads 151
2855 Optimizing Agricultural Packaging in Fiji: Strategic Barrier Analysis Using Interpretive Structural Modeling and Cross-Impact Matrix Multiplication Applied to Classification

Authors: R. Ananthanarayanan, S. B. Nakula, D. R. Seenivasagam, J. Naua, B. Sharma

Abstract:

Product packaging is a critical component of production, trade, and marketing, playing numerous vital roles that often go unnoticed by consumers. Packaging is essential for maintaining the shelf life, quality assurance, and safety of both manufactured and agricultural products. For example, harvested produce or processed foods can quickly lose quality and freshness, making secure packaging crucial for preservation and safety throughout the food supply chain. In Fiji, agricultural packaging has primarily been managed by local companies for international trade, with gradual advancements in these practices. To further enhance the industry’s performance, this study examines the challenges and constraints hindering the optimization of agricultural packaging practices in Fiji. The study utilizes Multi-Criteria Decision Making (MCDM) tools, specifically Interpretive Structural Modeling (ISM) and Cross-Impact Matrix Multiplication Applied to Classification (MICMAC). ISM analyzes the hierarchical structure of barriers, categorizing them from the least to the most influential, while MICMAC classifies barriers based on their driving and dependence power. This approach helps identify the interrelationships between barriers, providing valuable insights for policymakers and decision-makers to propose innovative solutions for sustainable development in the agricultural packaging sector, ultimately shaping the future of packaging practices in Fiji.
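As context for the MICMAC step described above, a minimal sketch of how driving and dependence power are derived from a reachability matrix and used to cluster barriers (the barrier set, matrix entries, and midpoint split are hypothetical, not taken from the study):

```python
import numpy as np

# Hypothetical binary reachability matrix for four barriers:
# R[i][j] = 1 means barrier i influences (reaches) barrier j.
R = np.array([
    [1, 1, 1, 1],   # barrier A drives everything
    [0, 1, 0, 1],
    [0, 1, 1, 1],
    [0, 0, 0, 1],   # barrier D is driven by everything
])

driving = R.sum(axis=1)      # row sums: driving power
dependence = R.sum(axis=0)   # column sums: dependence power

# MICMAC clusters barriers into four quadrants, split here at n/2.
mid = R.shape[0] / 2
for name, drv, dep in zip("ABCD", driving, dependence):
    if drv > mid and dep > mid:
        cluster = "linkage"
    elif drv > mid:
        cluster = "driving (independent)"
    elif dep > mid:
        cluster = "dependent"
    else:
        cluster = "autonomous"
    print(name, int(drv), int(dep), cluster)
```

Barriers with high driving power and low dependence (like A here) sit at the base of the ISM hierarchy and are the ones policymakers should address first.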

Keywords: agricultural packaging, barriers, ISM, MICMAC

Procedia PDF Downloads 26
2854 Examining Microbial Decomposition, Carbon Cycling and Storage in Cefni Coastal Salt Marsh, Anglesey Island, Wales, United Kingdom

Authors: Dasat G. S., Christopher F. Tim, J. Dun C.

Abstract:

Salt marshes are known to sequester carbon dioxide from the atmosphere into the soil, but natural and anthropogenic activities could trigger the release of large quantities of carbon buried over centuries as carbon dioxide, methane and nitrous oxide (CO2, CH4 and N2O), the major greenhouse gases (GHGs) implicated in climate change. This study therefore investigated biogeochemical activity by collecting soil samples from the low, mid and high zones of the Cefni salt marsh, within the Malltraeth estuary on the island of Anglesey, north Wales, United Kingdom, for a set of laboratory-based experiments using standard operating protocols (SOPs) to quantify soil organic matter content and the rates of microbial decomposition and carbon storage at the Carbon Capture Laboratory of Bangor University, Wales. The results reveal that the mid zone had soil water and soil organic matter (SOM) contents of 56.23% and 9.98% respectively, higher than the low and high zones. Phenol oxidase activity (1193.53 µmol dicq g⁻¹ h⁻¹) was highest in the low zone compared with the high and mid zones (867.60 and 608.74 µmol dicq g⁻¹ h⁻¹ respectively). Soil phenolic concentration was highest in the mid zone (53.25 µg g⁻¹) compared with the high (15.66 µg g⁻¹) and low (4.18 µg g⁻¹) zones. Hydrolase enzyme activities showed a similar trend in the high and low zones, with much lower activity in the mid zone. The CO2 flux from the mid zone (6.79 µg g⁻¹ h⁻¹) was significantly greater than those from the high (-2.29 µg g⁻¹ h⁻¹) and low (1.30 µg g⁻¹ h⁻¹) zones. Since salt marshes provide essential ecosystem services, their degradation or alteration in any form could compromise those services and convert the marshes from net sinks into net sources, with consequential effects on the global environment.

Keywords: saltmarsh, decomposition, carbon cycling, enzymes

Procedia PDF Downloads 81
2853 Valence and Arousal-Based Sentiment Analysis: A Comparative Study

Authors: Usama Shahid, Muhammad Zunnurain Hussain

Abstract:

This research paper presents a comprehensive analysis of a sentiment analysis approach that employs valence and arousal as its foundational pillars, in comparison with traditional techniques. Sentiment analysis is an indispensable task in natural language processing that involves extracting opinions and emotions from textual data. The valence and arousal dimensions, representing the positivity/negativity and intensity of emotions respectively, divide the emotional space into four quadrants, each representing a specific emotional state. The study seeks to determine the impact of using these quadrants to identify distinct emotional states on the accuracy and efficiency of sentiment analysis, in comparison with traditional techniques. The results reveal that the valence- and arousal-based approach outperforms other approaches, particularly in identifying nuanced emotions that conventional methods may miss. The study's findings are crucial for applications such as social media monitoring and market research, where the accurate classification of emotions and opinions is paramount. Overall, this research highlights the potential of valence and arousal as a framework for sentiment analysis and offers valuable insights into the benefits of incorporating specific types of emotions into the analysis. These findings have significant implications for researchers and practitioners in natural language processing, as they provide a basis for the development of more accurate and effective sentiment analysis tools.
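The quadrant construction described above can be sketched in a few lines; the quadrant labels and the zero split points below are illustrative (following the common circumplex convention), not the paper's own definitions:

```python
def emotion_quadrant(valence, arousal):
    """Map a (valence, arousal) pair to one of four emotional states.

    valence: negative..positive in [-1, 1]
    arousal: calm..excited in [-1, 1]
    """
    if valence >= 0 and arousal >= 0:
        return "excited/happy"    # positive valence, high arousal
    if valence < 0 and arousal >= 0:
        return "angry/stressed"   # negative valence, high arousal
    if valence < 0:
        return "sad/depressed"    # negative valence, low arousal
    return "calm/content"         # positive valence, low arousal

print(emotion_quadrant(0.8, 0.6))   # → excited/happy
print(emotion_quadrant(-0.7, 0.5))  # → angry/stressed
```

A sentiment model trained against these four labels, rather than a binary positive/negative target, is what allows the nuanced distinctions the abstract refers to.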

Keywords: sentiment analysis, valence and arousal, emotional states, natural language processing, machine learning, text analysis, sentiment classification, opinion mining

Procedia PDF Downloads 98
2852 Comprehensive Feature Extraction for Optimized Condition Assessment of Fuel Pumps

Authors: Ugochukwu Ejike Akpudo, Jang-Wook Hur

Abstract:

The increasing demand for improved productivity, maintainability, and reliability has prompted rapidly growing research on the emerging condition-based maintenance concept: prognostics and health management (PHM). Various fuel pumps serve critical functions in hydraulic systems; hence, their failure can have severe effects on productivity, safety, etc. The need for condition monitoring and assessment of these pumps cannot be overemphasized, and this has led to a surge of research on standard feature extraction techniques for optimized condition assessment of fuel pumps. By extracting time-based, frequency-based, and the more robust time-frequency-based features from vibration signals, a more comprehensive feature assessment (and selection) can be achieved for a more accurate and reliable condition assessment of these pumps. With the aid of manifold learning algorithms such as locally linear embedding (LLE), we propose a method for comprehensive condition assessment of electromagnetic fuel pumps (EMFPs). Results show that LLE, as a comprehensive feature extraction technique, yields better feature fusion/dimensionality reduction results for condition assessment of EMFPs than the use of single features. Also, unlike other feature fusion techniques, its capabilities as a fault classification technique were explored, and the results show an acceptable accuracy level using standard performance metrics for evaluation.
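The fusion-then-classify pipeline described above can be sketched with scikit-learn's LLE implementation; the synthetic "feature matrix" standing in for the pumps' extracted vibration features, the two health states, and all parameter values are assumptions for illustration only:

```python
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical stand-in for a fused feature matrix: 60 vibration windows,
# each described by 12 time-, frequency-, and time-frequency-domain
# features, from two pump health states (0 = healthy, 1 = faulty).
healthy = rng.normal(0.0, 1.0, size=(30, 12))
faulty = rng.normal(3.0, 1.0, size=(30, 12))
X = np.vstack([healthy, faulty])
y = np.array([0] * 30 + [1] * 30)

# LLE reduces the 12-dimensional feature set to 2 dimensions while
# preserving local neighborhood structure (the feature-fusion step).
lle = LocallyLinearEmbedding(n_neighbors=10, n_components=2, random_state=0)
X_low = lle.fit_transform(X)
print(X_low.shape)  # (60, 2)

# A standard classifier on the embedded features then assesses condition.
clf = SVC().fit(X_low, y)
print(clf.score(X_low, y))
```

In practice the features would come from actual signal-processing steps (RMS, spectral moments, wavelet coefficients, etc.) rather than random draws, and the classifier would be evaluated on held-out data.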

Keywords: electromagnetic fuel pumps, comprehensive feature extraction, condition assessment, locally linear embedding, feature fusion

Procedia PDF Downloads 115
2851 Impact of Biological Treatment Effluent on the Physico-Chemical Quality of a Receiving Stream in Ile-Ife, Southwest Nigeria

Authors: Asibor Godwin, Adeniyi Funsho

Abstract:

This study was carried out to investigate the impact of biologically treated effluent on the physico-chemical properties of receiving waterbodies and to establish its suitability for other purposes. It focused on how selected physico-chemical variables change with distance from the point of discharge downstream. Water samples were collected from 14 sampling stations covering the untreated effluent, the treated effluent, and the receiving streams (before and after the treated effluent discharge) over a period of 6 months spanning the dry and rainy seasons. Analyses were carried out for temperature, turbidity, pH, conductivity, major anions and cations, dissolved oxygen, percentage oxygen saturation, biological oxygen demand (BOD), solids (total, suspended and dissolved), nitrates, phosphates, organic matter and flow discharge using standard analytical methods. The relationships between the investigated sites with regard to their physico-chemical properties were analyzed using Student's t statistics, and the changes in the receiving streams after the treated effluent outfall are discussed in full. The physico-chemical water quality of the receiving waterbodies meets most general water requirements for both domestic and industrial uses. The untreated effluent quality was shown to be of biological origin based on biological oxygen demand, chloride, dissolved oxygen, total solids, pH and organic matter. The treated effluent showed significant improvement over the raw untreated effluent on most parameters assessed. There was a significant difference (p<0.05) between the untreated and treated effluent for most of the investigated physico-chemical qualities, and the difference between the discharged treated effluent and the unimpacted section of the receiving waterbodies was also significant (p<0.05) for most of the physico-chemical parameters.

Keywords: effluent, Opa River, physico-chemical, waterbody

Procedia PDF Downloads 260
2850 Distributive Justice through Constitution

Authors: Rohtash

Abstract:

Academically, the literature on the concept of justice is vast: theories are voluminous and definitions numerous, yet justice remains very difficult to define. Through the ages, justice has evolved as a developing account of how individuals and communities do what is right, just and fair to all in a society. Justice is a relative and dynamic concept, not an absolute one. It differs between societies according to their morality and ethics, and the idea of justice arises not from a single morality but from the interaction of competing moralities and contending perspectives. Justice is conditional and circumstantial, and therefore takes different meanings in different contexts. Justice is the application of the laws: a values-based concept intended to protect the rights and liberties of the people. It is a socially created concept with no physical reality; it exists in society through a spirit of sharing among the communities and members of society. The conception of justice among communities and individuals rests on their social coordination, and it can be effective only when people's judgments are based on collective reasoning and their behavior is shaped by social values, norms and laws. People must accept, share and respect a common set of principles for delivering justice; justice can then be a reasonable solution to conflicts and a means of coordinating behavior in society. The subject matter of distributive justice is the public good and societal resources, which should be evenly distributed among the different sections of society on principles developed and established by the state through legislation, public policy and executive orders. The socioeconomic transformation of society is adopted by the constitution within the limits of its morality, giving a new dimension to transformative justice. Therefore, both procedural and transformative justice are part of distributive justice.
Distributive justice is essentially an economic phenomenon: it concerns the allocation of resources among communities and individuals, that is, the distribution of rights, responsibilities, burdens and benefits in society on the basis of the capacity and capability of individuals.

Keywords: distributive justice, constitutionalism, institutionalism, constitutional morality

Procedia PDF Downloads 82
2849 Hydrogeological Factors of the Ore Genesis in the Sedimentary Basins

Authors: O. Abramova, L. Abukova, A. Goreva, G. Isaeva

Abstract:

The present work evaluates the role of interstitial water in mobilizing the metal elements of clay deposits and occurrences in sedimentary formations in hydrogeological basins. The experiments were performed using a special facility that allows the pressure, temperature, and frequency of acoustic vibrations to be adjusted. The study materials were samples of oil shales (Baltic quarry, O2kk) and clay rocks, mainly of montmorillonite composition (Borehole SG-12000, sampling depth 1000-3600 m, Azov-Kuban trough, N1). After interstitial water was squeezed from the rock samples, a decrease in the original content of the rock-forming components, including the trace metals V, Cr, Co, Ni, Cu, Zn, Zr, Mo, Pb, W, Ti, and others, was recorded. The experiments made it possible to evaluate the output of ore elements and organic matter with the interstitial waters. Calculations show that, under standard conditions, each ton of oil shale can release 5-6 kg of ore elements and 9-10 kg of organic matter. The quantity of matter migrating from clays during solidification changes with the stage of lithogenesis: younger deposits lose more ore and organic material than clay rocks sampled from depths over 3000 m. Each ton of clay in the depth interval 1000-1500 m is able to generate 3-5 kg of ore elements and 6-8 kg of organic matter, with the interstitial waters acting as the carrier transferring these materials into the reservoir beds. It was concluded that the interstitial waters squeezed from the study samples are solutions with anomalously high concentrations of metals and organic matter. In the discharge zones of sedimentary basins, such fluids can create paragenetic associations of sedimentary-catagenetic ore and hydrocarbon mineral accumulations.

Keywords: hydrocarbons, ore genesis, paragenesis, pore water

Procedia PDF Downloads 257
2848 Rangeland Monitoring by Computerized Technologies

Authors: H. Arzani, Z. Arzani

Abstract:

Every piece of rangeland has a different set of physical and biological characteristics, which requires the manager to synthesize various information through regular monitoring in order to identify trends of change and make the right decisions for sustainable management. Range managers therefore need computerized technologies to monitor rangeland and select the best management practices. There are four examples of computerized technologies that can benefit sustainable management: (1) Photographic method for cover measurement: the method was tested in different vegetation communities in semi-humid and arid regions. Pictures of quadrats were interpreted using ArcView software, and the data were analyzed in SPSS using a paired t test. Based on the results, the photographic method can generally be used to measure ground cover in most vegetation communities. (2) GPS application for matching ground samples to satellite pixels: in the two provinces of Tehran and Markazi, six reference points were selected, and at each point eight GPS models were tested. A significant relation was found between GPS model, time, and location and the accuracy of the estimated coordinates. After selection of a suitable method, the coordinates of plots along four transects at each of 6 rangeland sites in Markazi province were recorded. The best time for GPS application was the morning hours, and the Etrex Vista had less error than the other models. (3) Application of satellite data for rangeland monitoring: focusing on the long-term variation of vegetation parameters such as vegetation cover and production is essential. Our study in grass and shrub lands showed significant correlations between quantitative vegetation characteristics and satellite data, so it is possible to monitor rangeland vegetation using digital data for sustainable utilization.
(4) Rangeland suitability classification with GIS: range suitability assessment can facilitate sustainable management planning. Three sub-models of sensitivity to erosion, water suitability, and forage production outputs were entered into the final range suitability classification model. GIS facilitated the classification of range suitability and produced suitability maps for sheep grazing. In general, digital computers assist range managers in interpreting, modifying, calibrating, and integrating information for correct management.

Keywords: computer, GPS, GIS, remote sensing, photographic method, monitoring, rangeland ecosystem, management, suitability, sheep grazing

Procedia PDF Downloads 365
2847 All-Optical Gamma-Rays and Positrons Source by Ultra-Intense Laser Irradiating an Al Cone

Authors: T. P. Yu, J. J. Liu, X. L. Zhu, Y. Yin, W. Q. Wang, J. M. Ouyang, F. Q. Shao

Abstract:

A strong electromagnetic field with E > 10¹⁵ V/m can be supplied by intense lasers such as ELI and HiPER in the near future. In such a strong laser field, laser-matter interaction enters the near quantum electrodynamics (QED) regime, and highly non-linear physics may occur. Recently, the multi-photon Breit-Wheeler (BW) process has attracted increasing attention because it can produce abundant positrons and it enhances the positron generation efficiency significantly. Here, we propose an all-optical scheme for generating bright gamma rays and dense positrons by irradiating a 10²² W/cm² laser pulse onto an Al cone filled with near-critical-density plasma. Two-dimensional (2D) QED particle-in-cell (PIC) simulations show that the radiation damping force becomes large enough to compensate for the Lorentz force in the cone, causing radiation-reaction trapping of a dense electron bunch in the laser field. The trapped electrons oscillate in the laser electric field and emit high-energy gamma photons in two ways: (1) nonlinear Compton scattering due to the oscillation of the electrons in the laser fields, and (2) Compton backscattering resulting from the bunch colliding with the laser reflected by the cone tip. The multi-photon Breit-Wheeler process is thus initiated, and abundant electron-positron pairs are generated with a positron density of ~10²⁷ m⁻³. The scheme is finally demonstrated by full 3D PIC simulations, which indicate a positron flux of up to 10⁹. This compact gamma-ray and positron source may have promising applications in the future.

Keywords: BW process, electron-positron pairs, gamma rays emission, ultra-intense laser

Procedia PDF Downloads 258
2846 Stabilization of Spent Engine Oil Contaminated Lateritic Soil Admixed with Cement Kiln Dust for Use as Road Construction Materials

Authors: Johnson Rotimi Oluremi, A. Adedayo Adegbola, A. Samson Adediran, O. Solomon Oladapo

Abstract:

Spent engine oil contains heavy metals and polycyclic aromatic hydrocarbons, which contribute to chronic health hazards, poor soil aeration, immobilization of nutrients, and lowering of soil pH. It affects the geotechnical properties of lateritic soil, thereby causing geotechnical and foundation problems. This study therefore addresses the stabilization of spent engine oil (SEO) contaminated lateritic soil using cement kiln dust (CKD) as a means of restoring it to its pristine state. Geotechnical tests, including sieve analysis, Atterberg limits, compaction, California bearing ratio, and unconfined compressive strength tests, were carried out on the natural, SEO-contaminated, and CKD-stabilized SEO-contaminated lateritic soil samples. The natural soil, classified as A-2-7 (2) by the AASHTO classification and GC according to the Unified Soil Classification System, changed to an A-4 non-plastic soil due to SEO contamination, and it remained so even under the influence of CKD. However, the maximum dry density (MDD) of the SEO-contaminated soil increased, while the optimum moisture content (OMC) decreased, with increasing percentages of CKD. Similarly, the bearing strength of the stabilized SEO-contaminated soil, measured by the California Bearing Ratio (CBR), increased with increasing CKD percentage. In conclusion, spent engine oil has a detrimental effect on the geotechnical properties of the lateritic soil sample, but this can be remediated using 10% CKD as a standalone admixture for stabilizing spent engine oil contaminated soil.

Keywords: spent engine oil, lateritic soil, cement kiln dust, stabilization, compaction, unconfined compressive strength

Procedia PDF Downloads 387
2845 [Keynote Talk]: sEMG Interface Design for Locomotion Identification

Authors: Rohit Gupta, Ravinder Agarwal

Abstract:

The surface electromyographic (sEMG) signal has the potential to identify human activities and intention, a potential further exploited to control artificial limbs using sEMG signals from the residual limbs of amputees. This paper deals with the development of a multichannel, cost-efficient sEMG signal interface for research applications, along with the evaluation of a proposed class-dependent statistical approach to feature selection. The sEMG signal acquisition interface was developed using the ADS1298 from Texas Instruments, a front-end interface integrated circuit for ECG applications. sEMG signals were recorded from two lower-limb muscles for three locomotion modes: plane walk (PW), stair ascending (SA), and stair descending (SD). A class-dependent statistical approach is proposed for feature selection, and its performance is compared with 12 pre-existing feature vectors. To make the study more extensive, the performance of five different types of classifiers is compared. The outcome of the current work proves the suitability of the proposed feature selection algorithm for locomotion recognition compared with the other existing feature vectors. The SVM classifier outperformed the compared classifiers, with an average recognition accuracy of 97.40%. Feature vector selection emerges as the most dominant factor affecting classification performance, as it accounts for 51.51% of the total variance in classification accuracy. The results demonstrate the potential of the developed sEMG signal acquisition interface together with the proposed feature selection algorithm.
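The paper's class-dependent feature selection method is its own contribution; as a generic illustration of statistical, class-separability-based feature ranking of the kind the abstract describes, here is a minimal sketch (the synthetic "sEMG feature matrix", the three class labels, and the separability statistic are all assumptions, not the authors' algorithm):

```python
import numpy as np

def separability_score(X, y):
    """Rank features by a simple class-separability statistic:
    between-class variance over within-class variance, per feature."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    between = np.zeros(X.shape[1])
    within = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        between += len(Xc) * (Xc.mean(axis=0) - overall_mean) ** 2
        within += len(Xc) * Xc.var(axis=0)
    return between / (within + 1e-12)

rng = np.random.default_rng(1)
# Hypothetical sEMG feature matrix: 90 signal windows x 4 features for
# three locomotion classes (0 = plane walk, 1 = stair ascent,
# 2 = stair descent). Only feature 0 carries class information here.
y = np.repeat([0, 1, 2], 30)
X = rng.normal(0, 1, size=(90, 4))
X[:, 0] += y * 2.0  # inject class-dependent shift into feature 0

scores = separability_score(X, y)
print(scores.argmax())  # → 0: the informative feature ranks first
```

The selected top-ranked features would then be fed to the classifiers (SVM and others) being compared.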

Keywords: classifiers, feature selection, locomotion, sEMG

Procedia PDF Downloads 291
2844 Breaking the Barrier of Service Hostility: A Lean Approach to Achieve Operational Excellence

Authors: Mofizul Islam Awwal

Abstract:

Due to globalization, industries are growing rapidly throughout the world, giving rise to many manufacturing organizations. Recently, however, service industries have begun to emerge in large numbers in almost all parts of the world, including some developing countries. In this context, organizations need a strong competitive advantage over their rivals to achieve their strategic business goals, and manufacturing industries adopt many methods and techniques to achieve such an edge. Over recent decades, manufacturing industries have successfully practiced the lean concept to optimize their production lines, and owing to its huge success in the manufacturing context, lean has made its way into the service industry. Very little attention has been given to services in the area of operations management, and service industries lag far behind manufacturing industries in terms of operations improvement. Transferring the lean concept from the production floor to the service back/front office is a demanding task that will nevertheless yield real improvement: service processes are not as visible as production processes and can be very complex, and the lack of research in this area has made matters difficult for service industries, as there are no standardized frameworks for successfully implementing the lean concept in a service organization. The purpose of this research paper is to capture the present scenario of the service industry in terms of lean implementation. A thorough analysis of the past literature on the applicability and understanding of lean in service structures will be carried out, research papers will be classified, and the critical factors for implementing lean in the service industry to achieve operational excellence will be unveiled.

Keywords: lean service, lean literature classification, lean implementation, service industry, service excellence

Procedia PDF Downloads 374
2843 Experimental Investigation for Reducing Emissions in Maritime Industry

Authors: Mahmoud Ashraf Farouk

Abstract:

Shipping is the most important mode of transportation in global logistics: at present, more than two-thirds of total worldwide trade volume is carried by sea. Ships used for marine transportation are fitted with high-power diesel engines whose exhaust contains nitrogen oxides (NOx), sulfur oxides (SOx), carbon dioxide (CO₂), particulate matter (PM10), hydrocarbons (HC), and carbon monoxide (CO), the most dangerous contaminants found in ship exhaust gas. Ships emitting large amounts of exhaust gases have become a significant cause of air pollution in coastal areas, harbors, and oceans; therefore, the International Maritime Organization (IMO) has established rules to reduce these emissions. This experiment measures the exhaust gases emitted from the main engine of the ship Aida IV running on marine diesel oil (MDO). The measurement is taken with a Sensonic2000 device at 85% load, which is the main sailing load. Moreover, the paper studies different emission reduction technologies: an alternative fuel, liquefied natural gas (LNG), applied to the system, and a reduction technology, selective catalytic reduction (SCR), added to the marine diesel oil system (MDO+SCR). The experiment quantifies NOx, SOx, CO₂, PM10, HC, and CO because they have the greatest effect on the environment. The reduction technologies are applied to the same ship engine at the same load. Finally, the study found MDO+SCR to be the more efficient technology for Aida IV as a training and supply ship, due to low consumption and no need to modify the engine: the SCR system is simply added to the exhaust line, which is easy and the cheapest option, while the differences in emissions between the two options are not large.

Keywords: marine, emissions, reduction, shipping

Procedia PDF Downloads 74
2842 Impacts of Aquaculture Farms on the Mangroves Forests of Sundarbans, India (2010-2018): Temporal Changes of NDVI

Authors: Sandeep Thakur, Ismail Mondal, Phani Bhusan Ghosh, Papita Das, Tarun Kumar De

Abstract:

The Sundarbans Reserve Forest of India has been undergoing major transformations in the recent past owing to population pressure and related changes, which have substantially altered the spatial landscape of the region, especially in its western parts. This study attempts to assess the impacts of land cover changes on the mangrove habitats. Time-series Landsat imagery was used to analyze Normalized Difference Vegetation Index (NDVI) patterns over the western parts of the Indian Sundarbans forest in order to assess the health of the mangroves in the region. The images were subjected to land use/land cover (LULC) classification using sub-pixel classification techniques in ERDAS Imagine software, and the changes were mapped, along with the spatial proliferation of aquaculture farms during the study period. A multivariate regression analysis was carried out between the obtained NDVI values and the LULC classes, and the observed meteorological data sets (time-series rainfall and minimum and maximum temperature) were likewise statistically regressed. The study demonstrated the application of NDVI in assessing the environmental status of mangroves, as the relationship between changes in the environmental variables and remote sensing based indices facilitates an efficient evaluation of environmental variables that can be used in coastal zone monitoring and development processes.
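NDVI itself has a standard definition, (NIR - Red) / (NIR + Red); as a minimal sketch of the computation (the reflectance values below are hypothetical, chosen so that healthy mangrove pixels score high and a converted aquaculture-pond pixel scores near zero):

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red), bounded in [-1, 1]."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + 1e-10)  # epsilon avoids zero division

# Hypothetical Landsat reflectance values for three pixels: two healthy
# mangrove pixels (strong NIR reflectance) and one degraded/converted pixel.
nir = np.array([0.50, 0.45, 0.10])
red = np.array([0.08, 0.10, 0.09])

values = ndvi(nir, red)
print(values.round(2))  # → [0.72 0.64 0.05]
```

Computing this per pixel and per acquisition date yields the multi-temporal NDVI series that the study regresses against LULC classes and meteorological variables.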

Keywords: aquaculture farms, LULC, mangrove, NDVI

Procedia PDF Downloads 180
2841 Integrating Machine Learning and Rule-Based Decision Models for Enhanced B2B Sales Forecasting and Customer Prioritization

Authors: Wenqi Liu, Reginald Bailey

Abstract:

This study explores an advanced approach to enhancing B2B sales forecasting by integrating machine learning models with a rule-based decision framework. The methodology begins with the development of a machine learning classification model to predict conversion likelihood, aiming to improve accuracy over traditional methods like logistic regression. The classification model's effectiveness is measured using metrics such as accuracy, precision, recall, and F1 score, alongside a feature importance analysis to identify key predictors. Following this, a machine learning regression model is used to forecast sales value, with the objective of reducing mean absolute error (MAE) compared to linear regression techniques. The regression model's performance is assessed using MAE, root mean square error (RMSE), and R-squared metrics, emphasizing feature contribution to the prediction. To bridge the gap between predictive analytics and decision-making, a rule-based decision model is introduced that prioritizes customers based on predefined thresholds for conversion probability and predicted sales value. This approach significantly enhances customer prioritization and improves overall sales performance by increasing conversion rates and optimizing revenue generation. The findings suggest that this combined framework offers a practical, data-driven solution for sales teams, facilitating more strategic decision-making in B2B environments.
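
The rule-based layer can be made concrete as a small prioritization function over the two model outputs; the threshold values below are hypothetical placeholders for the predefined thresholds mentioned above:

```python
def prioritize(conversion_prob, predicted_value,
               p_threshold=0.6, v_threshold=50_000):
    """Assign a priority tier from the classifier's conversion probability
    and the regressor's predicted sales value (thresholds are illustrative)."""
    if conversion_prob >= p_threshold and predicted_value >= v_threshold:
        return "high"
    if conversion_prob >= p_threshold or predicted_value >= v_threshold:
        return "medium"
    return "low"
```

Keeping the rules in a separate, explicit layer rather than inside the learned models keeps the prioritization auditable and easy for a sales team to tune.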

Keywords: sales forecasting, machine learning, rule-based decision model, customer prioritization, predictive analytics

Procedia PDF Downloads 14
2840 Soil Degradation Mapping Using Geographic Information System, Remote Sensing and Laboratory Analysis in the Oum Er Rbia High Basin, Middle Atlas, Morocco

Authors: Aafaf El Jazouli, Ahmed Barakat, Rida Khellouk

Abstract:

Mapping of soil degradation draws on field observations, laboratory measurements, and remote sensing data, integrated through quantitative methods to map the spatial characteristics of soil properties at different spatial and temporal scales and to provide up-to-date information on the field. Since soil salinity, texture and organic matter play a vital role in assessing topsoil characteristics and soil quality, remote sensing can be considered an effective method for studying these properties. The main objective of this research is to assess soil degradation by combining remote sensing data and laboratory analysis. To achieve this goal, soil samples were taken at 50 locations in the upper basin of Oum Er Rbia in the Middle Atlas in Morocco. These samples were dried, sieved to 2 mm and analyzed in the laboratory. Landsat 8 OLI imagery was analyzed using physical or empirical methods to derive soil properties; in addition, remote sensing can serve as a supporting data source. Deterministic (spline and inverse distance weighting) and probabilistic (ordinary kriging and universal kriging) interpolation methods were used to produce maps of each grain size class and soil property using GIS software. As a result, a correlation was found between soil texture and soil organic matter content. This approach, developed in ongoing research, will improve the prospects for the use of remote sensing data for mapping soil degradation in arid and semi-arid environments.
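
Of the interpolation methods named, inverse distance weighting is the simplest to illustrate: the estimate at an unsampled point is a weighted average of the sampled values, with weights decaying with distance. A minimal sketch (the coordinates and values are hypothetical):

```python
import numpy as np

def idw(sample_xy, sample_values, query_xy, power=2.0):
    """Inverse distance weighted estimate of a soil property at one unsampled point."""
    xy = np.asarray(sample_xy, dtype=float)
    v = np.asarray(sample_values, dtype=float)
    d = np.linalg.norm(xy - np.asarray(query_xy, dtype=float), axis=1)
    if np.any(d == 0):                 # query coincides with a sample point
        return float(v[d == 0][0])
    w = d ** -power                    # closer samples get larger weights
    return float(np.sum(w * v) / np.sum(w))
```

Unlike kriging, IDW makes no assumption about the spatial covariance structure, which is why the two families are compared in the study.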

Keywords: soil degradation, GIS, interpolation methods (spline, IDW, kriging), Landsat 8 OLI, Oum Er Rbia high basin

Procedia PDF Downloads 162
2839 A Robust Spatial Feature Extraction Method for Facial Expression Recognition

Authors: H. G. C. P. Dinesh, G. Tharshini, M. P. B. Ekanayake, G. M. R. I. Godaliyadda

Abstract:

This paper presents a new spatial feature extraction method based on principal component analysis (PCA) and Fisher discriminant analysis (FDA) for facial expression recognition. It not only extracts reliable features for classification but also reduces the feature space dimensions of pattern samples. In this method, each grayscale image is first considered in its entirety as the measurement matrix. Then, the principal components (PCs) of the row vectors of this matrix and the variance of these row vectors along the PCs are estimated; this ensures the preservation of the spatial information of the facial image. Afterwards, by incorporating the spectral information of the eigen-filters derived from the PCs, a feature vector is constructed for a given image. Finally, FDA is used to define a set of basis vectors in a reduced-dimension subspace such that optimal clustering is achieved. FDA defines an inter-class scatter matrix and an intra-class scatter matrix to enhance the compactness of each cluster while maximizing the distance between cluster marginal points. To match the test image against the training set, a cosine-similarity-based Bayesian classifier was used. The proposed method was tested on the Cohn-Kanade database and the JAFFE database. It was observed that the proposed method, which incorporates spatial information to construct an optimal feature space, outperforms the standard PCA- and FDA-based methods.
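
As an illustration of the final matching step, a nearest-class-mean rule under cosine similarity can be sketched as follows (the class means stand in for the training-set representatives in the reduced FDA subspace; the labels and vectors are hypothetical):

```python
import numpy as np

def cosine_classify(x, class_means):
    """Assign feature vector x to the class whose mean has the highest cosine similarity."""
    x = np.asarray(x, dtype=float)
    best_label, best_sim = None, -2.0
    for label, mean in class_means.items():
        m = np.asarray(mean, dtype=float)
        sim = float(np.dot(x, m) / (np.linalg.norm(x) * np.linalg.norm(m)))
        if sim > best_sim:
            best_label, best_sim = label, sim
    return best_label
```

Cosine similarity compares only the direction of the feature vectors, making the match insensitive to overall magnitude differences between images.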

Keywords: facial expression recognition, principal component analysis (PCA), Fisher discriminant analysis (FDA), eigen-filter, cosine similarity, Bayesian classifier, F-measure

Procedia PDF Downloads 423
2838 The Outcome of Using Machine Learning in Medical Imaging

Authors: Adel Edwar Waheeb Louka

Abstract:

Purpose: AI-driven solutions are at the forefront of many pathology and medical imaging methods. Using algorithms designed to better the experience of medical professionals within their respective fields, the efficiency and accuracy of diagnosis can improve. In particular, X-rays are a fast and relatively inexpensive test that can diagnose diseases. In recent years, however, X-rays have not been widely used to detect and diagnose COVID-19. This underuse is mainly due to low diagnostic accuracy and confounding with pneumonia, another respiratory disease. However, research in this field has suggested that artificial neural networks can successfully diagnose COVID-19 with high accuracy. Models and Data: The dataset used is the COVID-19 Radiography Database. This dataset includes images and masks of chest X-rays under the labels of COVID-19, normal, and pneumonia. The classification model developed uses an autoencoder and a pre-trained convolutional neural network (DenseNet201) to provide transfer learning to the model. The model then uses a deep neural network to finalize the feature extraction and predict the diagnosis for the input image. This model was trained on 4035 images and validated on 807 images separate from those used for training. The images used to train the classification model include an important feature: the pictures are cropped beforehand to eliminate distractions when training the model. The image segmentation model uses an improved U-Net architecture and is used to extract the lung mask from the chest X-ray image. It is trained on 8577 images and validated on a validation split of 20%. Both models are validated on an external dataset, and their accuracy, precision, recall, F1 score, IoU, and loss are calculated. Results: The classification model achieved an accuracy of 97.65% and a loss of 0.1234 when differentiating COVID-19-infected, pneumonia-infected, and normal lung X-rays. The segmentation model achieved an accuracy of 97.31% and an IoU of 0.928. Conclusion: The proposed models can detect COVID-19, pneumonia, and normal lungs with high accuracy and derive the lung mask from a chest X-ray with similarly high accuracy. The hope is for these models to elevate the experience of medical professionals and provide insight into the future of the methods used.
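
Of the metrics reported, intersection-over-union (IoU) for the segmentation masks is the least standard and is easy to state precisely. A minimal sketch for binary lung masks:

```python
import numpy as np

def iou(pred_mask, true_mask):
    """Intersection-over-union between two binary segmentation masks."""
    pred = np.asarray(pred_mask, dtype=bool)
    true = np.asarray(true_mask, dtype=bool)
    union = np.logical_or(pred, true).sum()
    if union == 0:                      # both masks empty: treat as perfect agreement
        return 1.0
    return float(np.logical_and(pred, true).sum() / union)
```

Unlike pixel accuracy, IoU is insensitive to the large background region of a chest X-ray, which is why it is the more informative of the two segmentation scores quoted.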

Keywords: artificial intelligence, convolutional neural networks, deep learning, image processing, machine learning

Procedia PDF Downloads 72
2837 An EEG-Based Scale for Comatose Patients' Vigilance State

Authors: Bechir Hbibi, Lamine Mili

Abstract:

Understanding the condition of comatose patients can be difficult, but it is crucial to their optimal treatment. Consequently, numerous scoring systems have been developed around the world to categorize patient states based on physiological assessments. Although validated and widely adopted by medical communities, these scores still present numerous limitations and obstacles. Even with the addition of further tests and extensions, these scoring systems have not been able to overcome certain limitations, and it appears unlikely that they will be able to do so in the future. On the other hand, physiological tests are not the only way to assess comatose patients. EEG signal analysis has contributed extensively to the understanding of the human brain and human consciousness and has been used by researchers in the classification of different levels of disease. The use of EEG in the ICU has become an urgent matter in several cases and has been recommended by medical organizations. In this field, the EEG is used to investigate epilepsy, dementia, brain injuries, and many other neurological disorders. It has recently also been used to detect pain activity in some regions of the brain, to detect stress levels, and to evaluate sleep quality. In this work, our aim was to use multifractal analysis, a very successful method for handling multifractal signals and extracting features, to establish a state-of-awareness scale for comatose patients based on their electrical brain activity. The results show that this score could be instantaneous and could overcome many of the limitations from which physiological scales suffer. Indeed, multifractal analysis stands out as a highly effective tool for characterizing non-stationary and self-similar signals, demonstrating strong performance in extracting the properties of fractal and multifractal data, including signals and images.
As such, we leverage this method, along with other features derived from EEG signal recordings from comatose patients, to develop a scale. This scale aims to accurately depict the vigilance state of patients in intensive care units and to address many of the limitations inherent in physiological scales such as the Glasgow Coma Scale (GCS) and the FOUR score. The results of applying version V0 of this approach to 30 patients with known GCS showed that the EEG-based score similarly describes the states of vigilance but distinguishes between the states of 8 sedated patients where the GCS could not be applied. Therefore, our approach could show promising results with patients with disabilities, injected with painkillers, and other categories where physiological scores could not be applied.
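
The monofractal special case of this analysis, detrended fluctuation analysis (DFA), conveys the core idea: the scaling of detrended fluctuations with window size yields an exponent characterizing the signal's self-similarity. A simplified sketch (the paper uses the full multifractal generalization; the window scales below are illustrative):

```python
import numpy as np

def dfa_exponent(signal, scales=(16, 32, 64, 128)):
    """Estimate a signal's monofractal scaling exponent via detrended fluctuation analysis."""
    x = np.asarray(signal, dtype=float)
    profile = np.cumsum(x - x.mean())            # integrated, mean-removed signal
    flucts = []
    for s in scales:
        n_seg = len(profile) // s
        segs = profile[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        rms = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)         # linear detrend within each window
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        flucts.append(np.mean(rms))
    # slope of log F(s) vs log s is the scaling exponent (about 0.5 for white noise)
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return float(slope)
```

In the multifractal extension, a whole spectrum of such exponents is estimated, and it is features of that spectrum, rather than a single slope, that would feed the vigilance scale.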

Keywords: coma, vigilance state, EEG, multifractal analysis, feature extraction

Procedia PDF Downloads 65
2836 Open Source Knowledge Management Approach to Manage and Disseminate Distributed Content in a Global Enterprise

Authors: Rahul Thakur, Onkar Chandel

Abstract:

Red Hat is the world leader in providing open source software and solutions. A global enterprise like Red Hat has unique issues of connecting employees with content because of distributed offices, multiple teams spread across geographies, multiple languages, and different cultures. Employees of a global company create content that is distributed across departments, teams, regions, and countries. This makes finding the best content difficult, since owners keep iterating on the existing content. When employees are unable to find content, they end up creating it once again, duplicating existing material and effort in the process. Also, employees may not find the relevant content and may spend time reviewing obsolete, duplicate, or irrelevant content. On average, a person spends 15 minutes per day in failed searches, which can result in missed business opportunities, employee frustration, and substandard deliverables. The Red Hat Knowledge Management Office (KMO) applied an 'open source strategy' to solve the above problems. Under the open source strategy, decisions are taken collectively; the strategy aims at accomplishing common goals with the help of communities. The objectives of this initiative were to save employees' time, get them authentic content, improve their content search experience, avoid duplicate content creation, provide context-based search, improve analytics, improve content management workflows, automate content classification, and automate content upload. This session will describe the open source strategy, its applicability in content management, challenges, recommended solutions, and outcomes.

Keywords: content classification, content management, knowledge management, open source

Procedia PDF Downloads 210
2835 Deep Learning for Qualitative and Quantitative Grain Quality Analysis Using Hyperspectral Imaging

Authors: Ole-Christian Galbo Engstrøm, Erik Schou Dreier, Birthe Møller Jespersen, Kim Steenstrup Pedersen

Abstract:

Grain quality analysis is a multi-parameterized problem that includes a variety of qualitative and quantitative parameters such as grain type classification, damage type classification, and nutrient regression. Currently, these parameters require human inspection, a multitude of instruments employing a variety of sensor technologies and predictive model types, or destructive and slow chemical analysis. This paper investigates the feasibility of applying near-infrared hyperspectral imaging (NIR-HSI) to grain quality analysis. For this study, two datasets of NIR hyperspectral images in the wavelength range of 900 nm - 1700 nm have been used. Both datasets contain images of sparsely and densely packed grain kernels. The first dataset contains ~87,000 image crops of bulk wheat samples from 63 harvests where the protein value has been determined by the FOSS Infratec NOVA, which is the industry gold standard for protein content estimation in bulk samples of cereal grain. The second dataset consists of ~28,000 image crops of bulk grain kernels from seven different wheat varieties and a single rye variety. In the first dataset, the problem to solve is protein regression; in the second, it is variety classification. Deep convolutional neural networks (CNNs) have the potential to utilize spatio-spectral correlations within a hyperspectral image to simultaneously estimate the qualitative and quantitative parameters. CNNs can autonomously derive meaningful representations of the input data, reducing the need for the advanced preprocessing techniques required for classical chemometric model types such as artificial neural networks (ANNs) and partial least-squares regression (PLS-R). A comparison between different CNN architectures utilizing 2D and 3D convolution is conducted. These results are compared to the performance of ANNs and PLS-R. Additionally, a variety of preprocessing techniques from image analysis and chemometrics are tested.
These include centering, scaling, standard normal variate (SNV), Savitzky-Golay (SG) filtering, and detrending. The results indicate that the combination of NIR-HSI and CNNs has the potential to be the foundation for an automatic system unifying qualitative and quantitative grain quality analysis within a single sensor technology and predictive model type.
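
Of the chemometric preprocessing steps listed, standard normal variate is the most self-contained: each spectrum is centered and scaled by its own mean and standard deviation, removing multiplicative scatter effects between kernels. A minimal sketch on a matrix whose rows are spectra:

```python
import numpy as np

def snv(spectra):
    """Standard normal variate: center and scale each spectrum (row) individually."""
    X = np.asarray(spectra, dtype=float)
    return (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)
```

Because SNV normalizes each spectrum independently, it needs no training statistics, unlike dataset-level centering and scaling.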

Keywords: deep learning, grain analysis, hyperspectral imaging, preprocessing techniques

Procedia PDF Downloads 98
2834 User-Awareness from Eye Line Tracing During Specification Writing to Improve Specification Quality

Authors: Yoshinori Wakatake

Abstract:

Many defects found after the release of software packages are caused by omissions of necessary test items in test specifications. Poor test specifications are detected by manual review, which imposes a high human load. The prevention of omissions depends on the end-user awareness of test specification writers: if test specifications were written while envisioning the behavior of end-users, the number of omitted test items would be greatly reduced. This paper observes that writers who can achieve this differ from those who cannot, not only in the richness of their descriptions but also in their gaze behavior. It proposes a method to estimate the degree of user-awareness of writers through the analysis of their gaze information when writing test specifications. We conduct an experiment to obtain the gaze information of writers of test specifications, and the specifications are then automatically classified using this gaze information. In this method, a random forest model is constructed for the classification, and the classification is highly accurate. By inspecting the explanatory variables that turn out to be important, we identify the behavioral features that distinguish test specifications of high quality from others; they are confirmed to be pupil diameter and the number and duration of blinks. The paper also investigates the test specifications automatically classified with gaze information to discuss features of the writing styles at each quality level. The proposed method enables us to automatically classify test specifications. It also helps prevent test item omissions, because it reveals writing features that high-quality test specifications should satisfy.
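
The classification step can be sketched with a random forest over the gaze features named above. The feature values, cluster centers, and labels below are entirely synthetic stand-ins for the experimental data, not the paper's measurements:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical per-session gaze features: [pupil diameter (mm), blink count, mean blink duration (s)]
rng = np.random.default_rng(42)
high = rng.normal([4.5, 10, 0.20], 0.3, size=(40, 3))   # synthetic "high quality" sessions
low = rng.normal([3.0, 25, 0.35], 0.3, size=(40, 3))    # synthetic "low quality" sessions
X = np.vstack([high, low])
y = np.array([1] * 40 + [0] * 40)                       # 1 = high quality, 0 = low

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.predict([[4.4, 11, 0.21]]))       # a session resembling the "high quality" cluster
print(clf.feature_importances_)             # per-feature importances, as analyzed in the paper
```

Inspecting `feature_importances_` is the mechanism by which the important explanatory variables (pupil diameter, blink count and duration) would be identified.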

Keywords: blink, eye tracking, gaze information, pupil diameter, quality improvement, specification document, user-awareness

Procedia PDF Downloads 64
2833 Statistical Feature Extraction Method for Wood Species Recognition System

Authors: Mohd Iz'aan Paiz Bin Zamri, Anis Salwa Mohd Khairuddin, Norrima Mokhtar, Rubiyah Yusof

Abstract:

Effective statistical feature extraction and classification are important in image-based automatic inspection and analysis. An automatic wood species recognition system is designed to perform wood inspection at customs checkpoints to avoid mislabeling of timber, which results in loss of income to the timber industry. The system focuses on analyzing the statistical pore properties of the wood images. This paper proposes a fuzzy-based feature extractor which mimics the experts' knowledge of wood texture to extract the properties of pore distribution from the wood surface texture. The proposed feature extractor consists of two steps, namely pore extraction and fuzzy pore management. The total number of statistical features extracted from each wood image is 38. Then, a backpropagation neural network is used to classify the wood species based on the statistical features. A comprehensive set of experiments on a database composed of 5200 macroscopic images from 52 tropical wood species was used to evaluate the performance of the proposed feature extractor. The advantage of the proposed feature extraction technique is that it mimics the experts' interpretation of wood texture, which allows human involvement when analyzing the wood texture. Experimental results show the efficiency of the proposed method.
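
The fuzzy pore-management step presumably grades crisp pore measurements into linguistic categories (e.g. small/medium/large pores) via membership functions. A minimal sketch of a triangular membership function; the breakpoints are hypothetical, not taken from the paper:

```python
def triangular_membership(x, a, b, c):
    """Degree (0..1) to which x belongs to a triangular fuzzy set
    with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    if x == b:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)
```

Because a measurement can have partial membership in neighboring categories, such functions capture the gradual judgments a human wood-inspection expert makes about pore size and spacing.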

Keywords: classification, feature extraction, fuzzy, inspection system, image analysis, macroscopic images

Procedia PDF Downloads 424
2832 Towards the Production of Least Contaminant Grade Biosolids and Biochar via Mild Acid Pre-treatment

Authors: Ibrahim Hakeem

Abstract:

Biosolids are stabilised sewage sludge produced from wastewater treatment processes. Biosolids contain valuable plant nutrients, which facilitates their beneficial reuse on agricultural land. However, the increasing levels of legacy and emerging contaminants such as heavy metals (HMs), PFAS, microplastics, pharmaceuticals, microbial pathogens, etc., are restraining the direct land application of biosolids. Pyrolysis of biosolids can effectively degrade microbial and organic contaminants; however, HMs remain a persistent problem with biosolids and their pyrolysis-derived biochar. In this work, we demonstrated the integrated processing of biosolids involving acid pre-treatment for HMs removal and selective reduction of ash-forming elements, followed by bench-scale pyrolysis of the treated biosolids to produce quality biochar and bio-oil enriched with valuable platform chemicals. The pre-treatment of biosolids using 3% v/v H₂SO₄ at room conditions for 30 min reduced the ash content from 30 wt% in raw biosolids to 15 wt% in the treated sample while removing about 80% of limiting HMs without degrading the organic matter. The preservation of nutrients and reduction of HMs concentration and mobility via the developed hydrometallurgical process improved the grade of the treated biosolids for beneficial land reuse. The co-removal of ash-forming elements from biosolids positively enhanced the fluidised bed pyrolysis of the acid-treated biosolids at 700 °C. Organic matter devolatilisation was improved by 40%, and the produced biochar had a higher surface area (107 m²/g), heating value (15 MJ/kg), fixed carbon (35 wt%) and organic carbon retention (66% dry-ash-free) compared to the raw biosolids biochar with surface area (56 m²/g), heating value (9 MJ/kg), fixed carbon (20 wt%) and organic carbon retention (50%).
Pre-treatment also improved microporous structure development of the biochar and substantially decreased the HMs concentration and bioavailability by at least 50% relative to the raw biosolids biochar. The integrated process is a viable approach to enhancing value recovery from biosolids.

Keywords: biosolids, pyrolysis, biochar, heavy metals

Procedia PDF Downloads 74
2831 Classification for Obstructive Sleep Apnea Syndrome Based on Random Forest

Authors: Cheng-Yu Tsai, Wen-Te Liu, Shin-Mei Hsu, Yin-Tzu Lin, Chi Wu

Abstract:

Background: Obstructive sleep apnea syndrome (OSAS) is a common respiratory disorder during sleep, and body parameters have been identified as highly predictive of OSAS severity. However, the effects of body parameters on OSAS severity remain unclear. Objective: The objective of this study is to establish a prediction model for OSAS using body parameters and to investigate the effects of body parameters on OSAS. Methodology: Severity was quantified by polysomnography as the mean hourly number of dips in oxygen saturation greater than 3% during examination in a hospital in New Taipei City (Taiwan). Four levels of OSAS severity were classified by the apnea-hypopnea index (AHI) following the American Academy of Sleep Medicine (AASM) guideline. Body parameters, including neck circumference, waist size, and body mass index (BMI), were obtained from a questionnaire. The subjects were then divided into two groups: a training group, used to build the random forest (RF) predictor, and a testing group, used to evaluate the classification accuracy. Results: 3330 subjects who had undergone polysomnography for evaluation of OSAS severity were recruited in this study. An RF of 1000 trees correctly classified 79.94% of test cases. When further evaluated on the test cohort, the RF showed waist size and BMI to be the most important factors for OSAS. Conclusion: It is possible to pre-screen patients using body parameters, which can pre-evaluate their health risks.
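
The four AASM severity levels referenced above follow the standard AHI cutoffs (fewer than 5, 5-15, 15-30, and 30 or more events per hour), which can be encoded directly:

```python
def osas_severity(ahi):
    """Four-level OSAS severity from the apnea-hypopnea index,
    using the standard AASM cutoffs (events per hour)."""
    if ahi < 5:
        return "normal"
    if ahi < 15:
        return "mild"
    if ahi < 30:
        return "moderate"
    return "severe"
```

These categorical labels are what the random forest predicts from the body parameters, turning a continuous polysomnography measurement into a classification target.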

Keywords: apnea-hypopnea index, body parameters, obstructive sleep apnea syndrome, random forest

Procedia PDF Downloads 151
2830 Habitat Studies of Etheria elliptica in Some Water Bodies (River Ogbese and Owena Reservoir) in Ondo State, Nigeria

Authors: O. O. Olawusi-Peters, M. O. Adediran, O. A. Ajibare

Abstract:

The Etheria elliptica population is declining due to various human activities on its freshwater habitat. This necessitated a habitat study of the mussel in River Ogbese and Owena reservoir in Ondo State, Nigeria, in order to determine the status of the organism within the ecosystem. Thirty (30) specimens each from River Ogbese and Owena reservoir were sampled between May and August 2012. Morphometric variables such as length, breadth, shell thickness and weight of the mussels were measured. Also, some physico-chemical parameters, the flow rate and the soil profile of the two water bodies were studied. In River Ogbese, the mean weight, length, breadth and thickness obtained were 49.73 g, 8.42 cm, 3.78 cm and 0.53 cm respectively; in Owena reservoir, the values were 111.17 g, 8.80 cm, 6.64 cm and 0.22 cm respectively. The condition factor showed that the samples from Owena reservoir (K = 16.33) were healthier than those from River Ogbese (K = 8.34). Also, the length-weight relationship indicated isometric growth in both water bodies (Ogbese r² = 0.68; Owena r² = 0.66). In River Ogbese, the physico-chemical parameters obtained were: temperature (24.3 °C), pH (7.12), TDS (72 ppm), DO (3.2 mg/l), conductivity (145 µS/cm), BOD (0.7 mg/l). The mean temperature (24.1 °C), pH (7.69), TDS (102 ppm), DO (3.1 mg/l), conductivity (183 µS/cm) and BOD (0.8 mg/l) were obtained from Owena reservoir. The soil sample values obtained from both water bodies were: River Ogbese - phosphorus 78.78, calcium 3.60, magnesium 1.90 and organic matter 0.17; Owena reservoir - phosphorus 3.34, calcium 4.40, magnesium 1.20 and organic matter 0.66. The flow rate was 0.22 m/s for Owena reservoir and 0.26 m/s for River Ogbese. The study revealed that Etheria elliptica in both Owena reservoir and River Ogbese were in good and healthy condition despite the various human activities on the water bodies. The water quality parameters obtained were within the preferred requirements of the mussels.
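
The condition factors quoted are consistent with Fulton's formula K = 100·W/L³ applied to the mean weight and length reported above (assuming W in grams and L in centimeters):

```python
def fulton_condition_factor(weight_g, length_cm):
    """Fulton's condition factor: K = 100 * W / L^3 (W in g, L in cm)."""
    return 100.0 * weight_g / length_cm ** 3

# Mean values reported for each water body
k_ogbese = fulton_condition_factor(49.73, 8.42)    # close to the reported K = 8.34
k_owena = fulton_condition_factor(111.17, 8.80)    # close to the reported K = 16.33
```

Values of K well above those expected for the species' length indicate heavier, better-conditioned mussels, which is the basis for the Owena-versus-Ogbese comparison.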

Keywords: Etheria elliptica, mussels, Owena reservoir, River Ogbese

Procedia PDF Downloads 507
2829 Semantic Indexing Improvement for Textual Documents: Contribution of Classification by Fuzzy Association Rules

Authors: Mohsen Maraoui

Abstract:

With the aim of improving natural language processing applications such as information retrieval, machine translation, and lexical disambiguation, we focus on a statistical approach to semantic indexing for multilingual text documents based on a conceptual network formalism. We propose to use this formalism as an indexing language to represent the descriptive concepts and their weighting; these concepts represent the content of the document. Our contribution is based on two steps. In the first step, we propose the extraction of index terms using the multilingual lexical resource EuroWordNet (EWN). In the second step, we pass from the representation of index terms to the representation of index concepts through the conceptual network formalism. This network is generated using the EWN resource and passes through a classification step based on an association rules model, in an attempt to discover the non-taxonomic or contextual relations between the concepts of a document. These relations are latent relations buried in the text and carried by the semantic context of the co-occurrence of concepts in the document. Our proposed indexing approach can be applied to text documents in various languages because it is based on a linguistic method adapted to the language through a multilingual thesaurus. We then apply the same statistical process regardless of the language in order to extract the significant concepts and their associated weights. We show that the proposed indexing approach provides encouraging results.
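
Association rule mining, the basis of the classification step, scores candidate relations between co-occurring concepts by support and confidence; the fuzzy variant used here generalizes these crisp measures to graded concept memberships. A minimal crisp sketch, treating each document as a set of concepts:

```python
def support(transactions, itemset):
    """Fraction of transactions (documents) containing every item (concept) in itemset."""
    itemset = set(itemset)
    return sum(itemset <= set(t) for t in transactions) / len(transactions)

def confidence(transactions, antecedent, consequent):
    """Confidence of the rule antecedent -> consequent."""
    both = set(antecedent) | set(consequent)
    return support(transactions, both) / support(transactions, antecedent)
```

Rules whose support and confidence exceed chosen thresholds become candidate non-taxonomic relations between concepts in the conceptual network.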

Keywords: concept extraction, conceptual network formalism, fuzzy association rules, multilingual thesaurus, semantic indexing

Procedia PDF Downloads 138
2828 The Negative Use of the Concept of Agape Love in the New Testament

Authors: Marny S. Menkes Lemmel

Abstract:

Upon hearing or reading the term agape love in a Christian context, one typically thinks of God's love for people and the type of love people should have for God and others. While C.S. Lewis, a significant propagator of this view, and others with a similar opinion are correct in their knowledge of agape in the New Testament in most occurrences, nonetheless, examples of this term appear in the New Testament having quite a different sense. The New World Encyclopedia, regarding the verb form of agape, 'agapao,' comments that it is occasionally used also in a negative sense, but here and elsewhere, there is no elaboration on the significance of these negative instances. If intensity and sacrifice are the crucial constituents of God's agape love and that of his followers, who are commanded to love as God does, the negative instances of this term in the New Testament conceivably indicate that a person's love for improper recipients is likewise intense and sacrificial. This is significant because one who has chosen to direct such love neither to God nor his "neighbors," but to inanimate things or status, clearly shows his priorities, having decided to put all his energy and resources into them while demeaning those for whom God has required such love, including God himself. It is not merely a matter of a person dividing his agape love among several proper objects of that love, but of directing it toward improper targets. Not to heed God's commands regarding whom to love is to break God's entire law, and not to love whom one should, but to love what one should not, is not merely a matter of indifference, but is disloyalty and loathing. An example of such use of the term agape occurs in Luke 11:43 where the Pharisees do not and cannot love God at the same time as loving a place of honor in the synagogues and greetings in the public arena. 
The exclamation of their dire peril because of their love for the latter reveals that the previously mentioned love objects are not in God's gamut of proper recipients. Furthermore, it appears to be a logical conclusion that since the Pharisees love the latter, they likewise despise God and those whom God requires his people to love. Conversely, the objects of the Pharisees' love in this verse should be what followers of God ought to despise and avoid. In short, appearances of the use of the verb agapao in a negative context are blatant antitheses to what God expects and should alert the reader or listener to take notice. These negative uses are worthy of further discussion than a brief aside by scholars of their existence without additional comment.

Keywords: agape love, divine commands, focus, new testament context, sacrificial

Procedia PDF Downloads 231