Search results for: Sentinel-1A data
23243 The Evaluation and Performance of SSRU Employee’s that Influence the Attitude towards Work, Job Satisfaction and Organization Commitment
Authors: Bella Llego
Abstract:
The purpose of this study was to explain and empirically test the influence of attitude towards work, job satisfaction, and organizational commitment on the evaluation and performance of SSRU employees. Primary data were collected through an Organizational Commitment Questionnaire with a 1-5 Likert scale. The respondents were 200 managerial and non-managerial staff of SSRU. The data were summarized with descriptive statistics (mean and standard deviation), and the hypotheses were tested by multiple regression. The results showed that attitude towards work has a positive but not significant effect on job satisfaction and on employee evaluation and performance. In contrast, organizational commitment has a positive and significant influence on job satisfaction and employee performance at SSRU: every improvement in organizational commitment has a positive effect on job satisfaction and on employee evaluation and performance at SSRU.
Keywords: attitude towards work, employee's evaluation and performance, job satisfaction, organization commitment
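As a hedged illustration of the analysis pipeline the abstract describes (descriptive statistics followed by multiple regression), the sketch below runs the same steps on synthetic 1-5 Likert data; the variable names, generated values, and coefficients are invented, not the study's data.

```python
import numpy as np

# Synthetic stand-ins for 1-5 Likert responses from n = 200 respondents.
rng = np.random.default_rng(0)
n = 200
attitude = rng.integers(1, 6, n).astype(float)
commitment = rng.integers(1, 6, n).astype(float)
# Hypothetical outcome: commitment matters more than attitude, plus noise.
satisfaction = 0.1 * attitude + 0.6 * commitment + rng.normal(0, 0.5, n)

# Descriptive statistics: mean and (sample) standard deviation.
desc = {"attitude": (attitude.mean(), attitude.std(ddof=1)),
        "commitment": (commitment.mean(), commitment.std(ddof=1))}

# Multiple regression: satisfaction ~ intercept + attitude + commitment,
# fitted by ordinary least squares.
X = np.column_stack([np.ones(n), attitude, commitment])
coef, *_ = np.linalg.lstsq(X, satisfaction, rcond=None)
intercept, b_attitude, b_commitment = coef
```

With data generated this way, the fitted coefficient on commitment dominates the one on attitude, mirroring the pattern of results the abstract reports.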
Procedia PDF Downloads 454
23242 Data Rate Based Grouping Scheme for Cooperative Communications in Wireless LANs
Authors: Sunmyeng Kim
Abstract:
IEEE 802.11a/b/g standards provide multiple transmission rates, which can be changed dynamically according to the channel condition. Cooperative communications were introduced to improve the overall performance of wireless LANs with the help of relay nodes that have higher transmission rates. They are based on the fact that transmission is much faster when data packets are sent to a destination node through a relay node with a higher transmission rate than when they are sent directly to the destination node at a low transmission rate. Several MAC protocols have been proposed to apply cooperative communications in wireless LANs, but some of them can cause collisions among relay nodes in a dense network. To solve this problem, we propose a new protocol in which relay nodes are grouped by their transmission rates, and only relay nodes in the highest-rate group try to get channel access. Performance evaluation, conducted by simulation, shows that the proposed protocol significantly outperforms the previous protocol in terms of throughput and collision probability.
Keywords: cooperative communications, MAC protocol, relay node, WLAN
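The grouping step described above can be sketched as follows; the node names and rates are hypothetical, and the sketch only shows how relays would be partitioned by transmission rate and the highest-rate group selected for channel access, not the MAC-layer contention itself.

```python
# Hypothetical relay nodes and their transmission rates in Mbps.
relay_rates = {"n1": 54, "n2": 11, "n3": 54, "n4": 24, "n5": 11}

# Group relay nodes by transmission rate.
groups = {}
for node, rate in relay_rates.items():
    groups.setdefault(rate, []).append(node)

# Only relays in the highest-rate group contend for channel access.
highest_rate = max(groups)
contending = sorted(groups[highest_rate])
```

Restricting contention to `contending` is what reduces collisions among relays in a dense network, since lower-rate relays never compete for the channel.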
Procedia PDF Downloads 466
23241 Investigating the Relationship Between Corporate Governance and Financial Performance Considering the Moderating Role of Opinion and Internal Control Weakness
Authors: Fatemeh Norouzi
Abstract:
Financial performance has become one of the important issues in accounting and auditing, and companies and their managers therefore pay attention to the variables that influence it. One such variable is corporate governance, which is examined in this research, although auditing-related factors can moderate this relationship. This research was therefore conducted with the aim of investigating the relationship between corporate governance and financial performance with regard to the moderating roles of audit opinion and internal control weakness. The research is practical in terms of purpose and, in terms of method, is a post-event descriptive study in which the data were analyzed using stock market data extracted from the website of the Iraqi Stock Exchange. The statistical population is all companies admitted to the Iraqi Stock Exchange; the statistical sample covers 2014 to 2021 and includes 34 companies. Four different models were considered for the eight research hypotheses, and the analysis was done using Excel and Stata 15 software, applying a collinearity test, an integration test, determination of fixed effects, and a correlation matrix. The results showed that the first four hypotheses were rejected and the second four hypotheses were confirmed.
Keywords: size of the board of directors, duality of the CEO, financial performance, internal control weakness
Procedia PDF Downloads 88
23240 Problems of Boolean Reasoning Based Biclustering Parallelization
Authors: Marcin Michalak
Abstract:
Biclustering is a form of two-dimensional data analysis. For several years it has been possible to express this task in terms of Boolean reasoning, for processing continuous, discrete, and binary data. The mathematical background of this approach (the proved ability to induce exact and inclusion-maximal biclusters fulfilling assumed criteria) is a strong advantage of the method. Unfortunately, the core of the method has quite high computational complexity. The paper presents the basics of the Boolean reasoning approach to biclustering, and in this context the problems of parallelizing the computation are raised.
Keywords: Boolean reasoning, biclustering, parallelization, prime implicant
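As a toy illustration of what exact, inclusion-maximal bicluster induction on binary data means (not the paper's Boolean-reasoning algorithm itself, whose prime-implicant machinery is not shown), the brute-force sketch below enumerates all-ones submatrices and keeps the inclusion-maximal ones; its exhaustive search also hints at why the core computation is expensive.

```python
from itertools import combinations

# Toy binary matrix; a bicluster here is a (rows, cols) pair whose
# submatrix contains only ones.
M = [[1, 1, 0],
     [1, 1, 1],
     [0, 1, 1]]

def is_all_ones(rows, cols):
    return all(M[r][c] == 1 for r in rows for c in cols)

# Enumerate every all-ones submatrix (exponential in matrix size).
n_rows, n_cols = len(M), len(M[0])
candidates = []
for k in range(1, n_rows + 1):
    for rows in combinations(range(n_rows), k):
        for m in range(1, n_cols + 1):
            for cols in combinations(range(n_cols), m):
                if is_all_ones(rows, cols):
                    candidates.append((set(rows), set(cols)))

# Keep only inclusion-maximal biclusters: those not contained in another.
maximal = [b for b in candidates
           if not any(b != o and b[0] <= o[0] and b[1] <= o[1]
                      for o in candidates)]
```

For this matrix the maximal biclusters are the two overlapping 2x2 blocks of ones, the all-ones middle column, and the all-ones middle row.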
Procedia PDF Downloads 125
23239 Assessment of Quality of Drinking Water in Residential Houses of Kuwait by Using GIS Method
Authors: Huda Aljabi
Abstract:
The presence of heavy metals such as cadmium, arsenic, lead, and mercury in drinking water can be a threat to public health, so the concentrations of these metals in drinking water are of evident importance. The National Primary Drinking Water Regulations have set limits for the concentrations of these elements in drinking water because of their toxicity. Furthermore, bromate formed during the disinfection of drinking water by ozonation can also be a health hazard. The paper proposed here will concentrate on the compilation of all available data and information on the presence of trace metals and bromate in the drinking water of residential houses distributed over different areas of Kuwait. New data will also be collected by sampling drinking water at residential houses in different areas of Kuwait and analyzing it for trace metals and bromate. The collected data will be presented on maps showing the distribution of these metals and bromate in the drinking water of Kuwait. Correlations among different chemical parameters will also be investigated using the GRAPHER software. This will help both the Ministry of Electricity and Water (MEW) and the Ministry of Health (MOH) in taking corrective measures and in planning infrastructure activities for the future.
Keywords: bromate, ozonation, GIS, heavy metals
Procedia PDF Downloads 177
23238 Comparison of Slope Data between Google Earth and the Digital Terrain Model, for Registration in CAR
Authors: André Felipe Gimenez, Flávia Alessandra Ribeiro da Silva, Roberto Saverio Souza Costa
Abstract:
Currently, rural producers have been facing problems with environmental regularization, which is precisely why the CAR (Rural Environmental Registry) was created. The CAR is an electronic registry for rural properties whose purpose is to record legal reserve areas, permanent preservation areas, areas of limited use, stable areas, forests, and remnants of native vegetation for all rural properties in Brazil. The objective of this work was to evaluate and compare altimetry and slope data from Google Earth with a digital terrain model (DTM) generated by aerophotogrammetry, in three plots on a steep slope, for the purpose of declaration in the CAR. This work is justified because rural landowners in these areas have doubts about the reliability of the free software Google Earth for diagnosing inclinations greater than 25 degrees, as regulated by federal law 12651/2012, and because the literature lacks this type of study for CAR declaration purposes. The results showed that, in areas of high slope (above 40% slope), Google Earth underestimated the real terrain slope when compared with the drone altimetry data. It is concluded that Google Earth is not reliable for diagnosing areas with an inclination greater than 25 degrees (46% slope) for the purpose of declaration in the CAR, and a local topographic survey remains essential.
Keywords: DTM, drone, RPA, SiCar, photogrammetry
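The 25-degree legal threshold corresponds to the roughly 46% slope quoted above; a minimal sketch of the degrees-to-percent conversion used in such comparisons:

```python
import math

def degrees_to_percent(deg):
    """Slope in percent is 100 * tan(angle): rise over run times 100."""
    return 100.0 * math.tan(math.radians(deg))

def percent_to_degrees(pct):
    """Inverse conversion: angle whose tangent is pct/100."""
    return math.degrees(math.atan(pct / 100.0))

# The law 12651/2012 threshold of 25 degrees is about a 46.6% slope.
threshold_pct = degrees_to_percent(25.0)
```

A 100% slope, by the same formula, is exactly 45 degrees, which is a handy sanity check on the conversion.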
Procedia PDF Downloads 131
23237 Understanding and Political Participation in Constitutional Monarchy of Dusit District Residents
Authors: Sudaporn Arundee
Abstract:
The purposes of this research were twofold: (1) to study political understanding of and participation in the constitutional monarchy, and (2) to study the level of participation. The paper drew upon data collected from 395 Dusit residents using a questionnaire, with simple random sampling used to select respondents. The findings revealed that 94 percent of respondents had a very good understanding of the constitutional monarchy, with a mean of 4.8. However, the respondents overall had a very low level of participation, with a mean score of 1.69 and a standard deviation of 0.719.
Keywords: political participation, constitutional monarchy, management and social sciences
Procedia PDF Downloads 251
23236 Low Cost LiDAR-GNSS-UAV Technology Development for PT Garam’s Three Dimensional Stockpile Modeling Needs
Authors: Mohkammad Nur Cahyadi, Imam Wahyu Farid, Ronny Mardianto, Agung Budi Cahyono, Eko Yuli Handoko, Daud Wahyu Imani, Arizal Bawazir, Luki Adi Triawan
Abstract:
Unmanned aerial vehicle (UAV) technology has advantages in cost efficiency and data retrieval time. Technologies such as UAV, GNSS, and LiDAR can be combined so that each covers the others' deficiencies; this integrated system aims to increase the accuracy of calculating the volume of the land stockpiles of PT Garam (a salt company). The UAV is used to obtain geometric data and capture the textures that characterize the structure of objects. This study uses the Tarot 650 Iron Man drone with four propellers, which can fly for 15 minutes. Image acquisitions are processed in software using photogrammetry and Structure from Motion point cloud principles, enabling the creation of point clouds, three-dimensional models, digital surface models, contours, and orthomosaics with high accuracy. LiDAR has a drawback in that its coordinate positions have local references; therefore, the researchers use GNSS, LiDAR, and drone multi-sensor technology to map the stockpiles of salt on open land and in warehouses, a survey PT Garam carries out twice a year and which previously used terrestrial methods and manual calculation with sacks. LiDAR alone has limited acquisition coverage because it only passes along the right and left sides of the object, mainly when applied to a salt stockpile, so it needs to be combined with a UAV: the UAV is flown to extend coverage, with the 200-gram LiDAR system integrated so that the viewing angle can be optimal during the flight. Using LiDAR for low-cost mapping surveys makes it easier for surveyors and academics to obtain fairly accurate data at an economical price: as a survey tool, LiDAR is available at a low price, around 999 USD, and can produce detailed data. To minimize operational costs further, surveyors can use the low-cost LiDAR, GNSS, and UAV combination at a price of around 638 USD. The data generated by this sensor is a visualization of an object's shape in three dimensions. This study combines low-cost GPS measurements with low-cost LiDAR, processed using free software. The low-cost GPS produces latitude and longitude coordinates, yielding X, Y, and Z values that help georeference the detected object, while the LiDAR detects objects, including the height of the entire environment at the location. The resulting data are calibrated with pitch, roll, and yaw to obtain the vertical height of the existing contours. An experiment was conducted on the roof of a building with a radius of approximately 30 meters.
Keywords: LiDAR, unmanned aerial vehicle, low-cost GNSS, contour
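A hedged sketch of the volume computation that motivates the survey: once the LiDAR/UAV point cloud is gridded into a digital surface model, the stockpile volume can be approximated as the sum of surface-minus-base heights times the grid cell area. The grid values, cell size, and base level below are invented for illustration, not PT Garam data.

```python
import numpy as np

cell_size = 0.5    # metres per grid cell side (hypothetical)
base_level = 10.0  # elevation of the warehouse floor in metres (hypothetical)

# Toy digital surface model: a small mound on a flat base.
dsm = np.array([[10.0, 10.5, 10.5, 10.0],
                [10.5, 12.0, 12.0, 10.5],
                [10.5, 12.0, 12.0, 10.5],
                [10.0, 10.5, 10.5, 10.0]])

# Height above the base in each cell; cells below the base contribute zero.
heights = np.clip(dsm - base_level, 0.0, None)

# Volume = sum of cell heights times the area of one cell, in cubic metres.
volume = float(heights.sum() * cell_size ** 2)
```

Real pipelines grid the point cloud much more finely and estimate the base surface from ground points, but the cell-wise summation is the same idea.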
Procedia PDF Downloads 95
23235 A Proposal for a Secure and Interoperable Data Framework for Energy Digitalization
Authors: Hebberly Ahatlan
Abstract:
The process of digitizing energy systems involves transforming traditional energy infrastructure into interconnected, data-driven systems that enhance efficiency, sustainability, and responsiveness. As smart grids become increasingly integral to the efficient distribution and management of electricity from both fossil and renewable energy sources, the energy industry faces strategic challenges associated with digitalization and interoperability, particularly in the context of modern energy business models such as virtual power plants (VPPs). The critical challenge in modern smart grids is to seamlessly integrate diverse technologies and systems, including virtualization, grid computing, and service-oriented architecture (SOA), across the entire energy ecosystem. Achieving this requires addressing issues like semantic interoperability, IT/OT convergence, and digital asset scalability, all while ensuring security and risk management. This paper proposes a four-layer digitalization framework to tackle these challenges, encompassing persistent data protection, trusted key management, secure messaging, and authentication of IoT resources. Data assets generated through this framework enable AI systems to derive insights for improving smart grid operations, security, and revenue generation. Furthermore, this paper proposes a Trusted Energy Interoperability Alliance as a universal guiding standard in the development of this digitalization framework to support more dynamic and interoperable energy markets.
Keywords: digitalization, IT/OT convergence, semantic interoperability, VPP, energy blockchain
Procedia PDF Downloads 184
23234 Dynamic Compensation for Environmental Temperature Variation in the Coolant Refrigeration Cycle as a Means of Increasing Machine-Tool Precision
Authors: Robbie C. Murchison, Ibrahim Küçükdemiral, Andrew Cowell
Abstract:
Thermal effects are the largest source of dimensional error in precision machining, and a major proportion is caused by ambient temperature variation. The use of coolant is a primary means of mitigating these effects, but there has been limited work on coolant temperature control. This research critically explored whether CNC-machine coolant refrigeration systems adapted to actively compensate for ambient temperature variation could increase machining accuracy. Accuracy data were collected from operators’ checklists for a CNC 5-axis mill and statistically reduced to bias and precision metrics for observations of one day over a sample period of 27 days. Temperature data were collected using three USB dataloggers in ambient air, the chiller inflow, and the chiller outflow. The accuracy and temperature data were analysed using Pearson correlation, then the thermodynamics of the system were described using system identification with MATLAB. It was found that 75% of thermal error is reflected in the hot coolant temperature but that this is negligibly dependent on ambient temperature. The effect of the coolant refrigeration process on hot coolant outflow temperature was also found to be negligible. Therefore, the evidence indicated that it would not be beneficial to adapt coolant chillers to compensate for ambient temperature variation. However, it is concluded that hot coolant outflow temperature is a robust and accessible source of thermal error data which could be used for prevention strategy evaluation or as the basis of other thermal error strategies.
Keywords: CNC manufacturing, machine-tool, precision machining, thermal error
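The Pearson-correlation step between temperature and accuracy metrics can be sketched as below; the temperature and error values are illustrative stand-ins for the datalogger and checklist series, not the study's data.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient: covariance over product of norms."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical daily ambient temperatures (C) and a bias-error metric (mm).
ambient_temp = [18.2, 19.5, 21.0, 22.3, 20.1, 23.4, 24.0]
bias_error = [0.010, 0.012, 0.015, 0.017, 0.013, 0.019, 0.020]

r = pearson_r(ambient_temp, bias_error)
```

An `r` near 1 would indicate that the error metric tracks ambient temperature closely; the study found the hot-coolant temperature, not the ambient, to be the better error proxy.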
Procedia PDF Downloads 89
23233 The First Transcriptome Assembly of Marama Bean: An African Orphan Crop
Authors: Ethel E. Phiri, Lionel Hartzenberg, Percy Chimwamuromba, Emmanuel Nepolo, Jens Kossmann, James R. Lloyd
Abstract:
Orphan crops are under-researched and underutilized food plant species that have not been categorized as major food crops but have the potential to be economically and agronomically significant. They have been documented to tolerate extreme environmental conditions; however, limited research has been conducted to uncover their potential as food crop species. The New Partnership for Africa’s Development (NEPAD) has classified Marama bean, Tylosema esculentum, as an orphan crop. The plant is one of the 101 African orphan crops that are to have their genomes sequenced, assembled, and annotated in the foreseeable future. Marama bean is a perennial leguminous plant that primarily grows in poor, arid soils in southern Africa. The plants produce large tubers that can weigh as much as 200 kg. While the foliage provides fodder, the tuber is carbohydrate-rich and a staple food source for rural communities in Namibia, and the edible seeds are protein- and oil-rich. Marama bean plants respond rapidly to increased temperatures and severe water scarcity without extreme consequences. Advances in molecular biology and biotechnology have made it possible to effectively transfer technologies from model and major crops to orphan crops. In this research, the aim was to assemble the first transcriptome of Marama bean from RNA-sequence data. Many model plant species have had their genomes sequenced and their transcriptomes assembled, and the availability of transcriptome data for a non-model crop plant species will allow for gene identification and comparisons between various species. The data have been sequenced using the Illumina HiSeq 2500 sequencing platform, and data analysis is underway. In essence, this research will eventually evaluate the potential use of Marama bean as a crop species to improve its value in agronomy.
Keywords: 101 African orphan crops, RNA-Seq, Tylosema esculentum, underutilised crop plants
Procedia PDF Downloads 360
23232 Sociocultural Foundations of Psychological Well-Being among Ethiopian Adults
Authors: Kassahun Tilahun
Abstract:
Most available studies on adult psychological well-being have centered on Western countries. However, psychological well-being does not have the same meaning across the world: the Euro-American and African conceptions and experiences of psychological well-being differ systematically. As a result, questions such as how people living in developing African countries like Ethiopia report their psychological well-being, and what the context-specific prominent determinants of their psychological well-being would be, need definitive answers. This study was therefore aimed at developing a new theory that addresses these sociocultural aspects of psychological well-being. Data were obtained through interviews and an open-ended questionnaire from a total of 438 adults working in governmental and non-governmental organizations situated in Addis Ababa. An appropriate qualitative method of data analysis, thematic content analysis, was employed; the analysis involved a type of abductive reasoning, driven both by theoretical interest and by the nature of the data. Reliability and credibility issues were addressed appropriately. The findings identified five major categories of themes viewed as essential in determining the conceptions and experiences of psychological well-being of Ethiopian adults: sociocultural harmony, social cohesion, security, competence and accomplishment, and the self. The rationale for including these themes is discussed in detail, and appropriate positive psychology interventions are proposed. Researchers are also encouraged to expand this qualitative research and, in turn, develop a suitable instrument tapping the psychological well-being of adults with different sociocultural orientations.
Keywords: sociocultural, psychological well-being, Ethiopia, adults
Procedia PDF Downloads 546
23231 Thick Data Techniques for Identifying Abnormality in Video Frames for Wireless Capsule Endoscopy
Authors: Jinan Fiaidhi, Sabah Mohammed, Petros Zezos
Abstract:
Capsule endoscopy (CE) is an established noninvasive diagnostic modality for investigating small bowel disease. CE has a pivotal role in assessing patients with suspected bleeding or identifying evidence of active Crohn's disease in the small bowel. However, CE produces lengthy videos of at least eighty thousand frames, at a rate of 2 frames per second, and gastroenterologists cannot dedicate 8 to 15 hours to reading the CE video frames to arrive at a diagnosis. This is why analyzing CE videos with modern artificial intelligence techniques becomes a necessity. However, machine learning, including deep learning, has failed to report robust results because of the lack of large samples to train its neural nets. In this paper, we describe a thick data approach that learns from a few anchor images. We use sound datasets like KVASIR and CrohnIPI to filter candidate frames that include interesting anomalies in a CE video. We identify candidate frames based on feature extraction that provides representative measures of the anomaly, like the size of the anomaly and its color contrast compared to the image background, and later feed these features to a decision tree that can classify the candidate frames as showing a condition like Crohn's disease. Our thick data approach reported an accuracy of detecting Crohn's disease, based on the presence of ulcer areas in the candidate frames, of 89.9% for KVASIR and 83.3% for CrohnIPI. We are continuing our research to fine-tune the approach by adding more thick data methods for enhancing diagnosis accuracy.
Keywords: thick data analytics, capsule endoscopy, Crohn’s disease, siamese neural network, decision tree
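A simplified sketch of the candidate-frame filtering described above: each frame is reduced to hand-crafted features (anomaly size and color contrast against the background), and a small threshold rule, standing in for the paper's decision tree, flags suspicious frames. All feature values, thresholds, and field names are invented for illustration.

```python
# Hypothetical per-frame features extracted upstream from the CE video:
# anomaly_area is the fraction of the frame covered by the detected region,
# contrast is its color contrast against the image background (0..1).
frames = [
    {"id": 1, "anomaly_area": 0.002, "contrast": 0.05},
    {"id": 2, "anomaly_area": 0.030, "contrast": 0.40},
    {"id": 3, "anomaly_area": 0.015, "contrast": 0.35},
    {"id": 4, "anomaly_area": 0.001, "contrast": 0.10},
]

def classify(frame, area_thr=0.01, contrast_thr=0.3):
    """Decision-tree-like rule: large, high-contrast regions are candidates."""
    if frame["anomaly_area"] >= area_thr and frame["contrast"] >= contrast_thr:
        return "candidate"
    return "normal"

candidate_ids = [f["id"] for f in frames if classify(f) == "candidate"]
```

In the actual pipeline the thresholds would be learned by the decision tree from the anchor images rather than fixed by hand.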
Procedia PDF Downloads 156
23230 Landsat Data from Pre Crop Season to Estimate the Area to Be Planted with Summer Crops
Authors: Valdir Moura, Raniele dos Anjos de Souza, Fernando Gomes de Souza, Jose Vagner da Silva, Jerry Adriani Johann
Abstract:
The estimate of the area of land to be planted with annual crops, and its stratification by municipality, are important variables in crop forecasting. Nowadays in Brazil, this information is obtained by the Brazilian Institute of Geography and Statistics (IBGE) and published in the report Assessment of the Agricultural Production. Due to the high cloud cover in the main crop growing season (October to March), it is difficult to acquire good orbital images, so one alternative is to work with remote sensing data from dates before the crop growing season. This work presents the use of multitemporal Landsat data gathered in July and September (before the summer growing season) to estimate the area of land to be planted with summer crops in an area of São Paulo State, Brazil. Geographic information systems (GIS) and digital image processing techniques were applied to the available data. Supervised and unsupervised classifications were used for data in digital number and reflectance formats and for the multitemporal Normalized Difference Vegetation Index (NDVI) images, with the objective of discriminating the tracts most likely to be planted with summer crops. Classification accuracies were evaluated using a sampling system developed specifically for this study region, and the estimated areas were corrected using the error matrix derived from these evaluations. The classification techniques reached an excellent level according to the kappa index. The proportion of crops stratified by municipality was derived from field work during the crop growing season, and these proportion coefficients were applied to the area of land to be planted with summer crops derived from the Landsat data. Thus, it was possible to derive the area of each summer crop by municipality. The discrepancies between official statistics and our results were attributed to the sampling and stratification procedures. Nevertheless, this methodology can be improved in order to provide good crop area estimates using remote sensing data, despite the cloud cover during the growing season.
Keywords: area intended for summer culture, estimated area planted, agriculture, Landsat, planting schedule
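The NDVI used in the classification above is computed per pixel from the red and near-infrared bands as (NIR - Red) / (NIR + Red); a minimal sketch on toy reflectance values:

```python
import numpy as np

# Toy 2x2 reflectance rasters standing in for Landsat red and NIR bands.
red = np.array([[0.10, 0.30],
                [0.05, 0.25]])
nir = np.array([[0.50, 0.35],
                [0.45, 0.30]])

# NDVI per pixel: values near 1 indicate dense green vegetation,
# values near 0 bare soil, negative values water or clouds.
ndvi = (nir - red) / (nir + red)
```

Stacking NDVI rasters from July and September gives the multitemporal index images the abstract classifies.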
Procedia PDF Downloads 150
23229 Modified Naive Bayes-Based Prediction Modeling for Crop Yield Prediction
Authors: Kefaya Qaddoum
Abstract:
Most greenhouse growers desire a predictable amount of yield in order to accurately meet market requirements. The purpose of this paper is to model a simple but often satisfactory supervised classification method. The original naive Bayes has a serious weakness: it produces redundant predictors. In this paper, a regularization technique is used to obtain a computationally efficient classifier based on naive Bayes. The suggested construction, using an L1 penalty, is capable of removing redundant predictors, and a modification of the LARS algorithm is devised to solve this problem, making the method applicable to a wide range of data. In the experimental section, a study was conducted to examine the effect of redundant and irrelevant predictors and to test the method on a WSG data set of tomato yields, where there are many more predictors than data points and the urgent need to predict weekly yield is the goal of the approach. Finally, the modified approach is compared with several naive Bayes variants and other classification algorithms (SVM and kNN) and is shown to be fairly good.
Keywords: tomato yield prediction, naive Bayes, redundancy, WSG
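As a toy illustration of the weakness the paper starts from (not of its LARS-based L1 construction), the sketch below shows how naive Bayes double-counts a duplicated predictor, making the posterior overconfident; pruning exactly such redundant predictors is what the L1 penalty is for. The class-conditional distributions and observation are invented.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return (math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))
            / (sigma * math.sqrt(2 * math.pi)))

# One informative predictor: class 0 ~ N(0, 1), class 1 ~ N(1, 1),
# equal priors, observed value x.
x = 0.8
p0 = gaussian_pdf(x, 0.0, 1.0)
p1 = gaussian_pdf(x, 1.0, 1.0)

# Posterior using the predictor once, versus the same predictor duplicated:
# naive Bayes multiplies likelihoods, so the copy is counted twice.
posterior_one = p1 / (p0 + p1)
posterior_dup = p1 ** 2 / (p0 ** 2 + p1 ** 2)

overconfident = posterior_dup > posterior_one
```

The duplicated predictor adds no information yet pushes the posterior further from 1/2, which is why redundancy degrades calibrated naive Bayes predictions.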
Procedia PDF Downloads 237
23228 Effective Training System for Riding Posture Using Depth and Inertial Sensors
Authors: Sangseung Kang, Kyekyung Kim, Suyoung Chi
Abstract:
A good posture is the most important factor in riding. In this paper, we present an effective posture correction system for a riding simulator environment that provides position error detection and customized training functions. The proposed system detects and analyzes the rider's posture using depth data and inertial sensing data. Our experiments show that including these functions helps users improve their riding posture.
Keywords: posture correction, posture training, riding posture, riding simulator
Procedia PDF Downloads 476
23227 An Adaptive Oversampling Technique for Imbalanced Datasets
Authors: Shaukat Ali Shahee, Usha Ananthakumar
Abstract:
A data set exhibits the class imbalance problem when one class has very few examples compared to the other class; this is also referred to as between-class imbalance. Traditional classifiers fail to classify the minority class examples correctly due to their bias towards the majority class. Apart from between-class imbalance, imbalance within classes, where a class is composed of several sub-clusters containing different numbers of examples, also deteriorates the performance of a classifier. Many methods have previously been proposed for handling the imbalanced data set problem; they can be classified into four categories: data preprocessing, algorithmic methods, cost-based methods, and ensembles of classifiers. Data preprocessing techniques have shown great potential, as they attempt to improve the data distribution rather than the classifier. They handle class imbalance either by increasing the minority class examples or by decreasing the majority class examples; however, decreasing the majority class examples leads to loss of information, and when the minority class has an absolute rarity, removing majority class examples is generally not recommended. Existing methods for handling class imbalance do not address between-class imbalance and within-class imbalance simultaneously. In this paper, we propose a method that handles both simultaneously for binary classification problems; removing both kinds of imbalance at once eliminates the classifier's bias towards bigger sub-clusters by minimizing the domination of bigger sub-clusters in the total error. The proposed method uses model-based clustering to find the sub-clusters or sub-concepts in the dataset, and the number of examples oversampled in each sub-cluster is determined based on the complexity of the sub-cluster. The method also takes into consideration the scatter of the data in the feature space and adaptively copes with unseen test data using the Lowner-John ellipsoid, increasing the accuracy of the classifier. In this study, a neural network is used as the classifier, since it minimizes the total error, and removing between-class and within-class imbalance simultaneously helps it give equal weight to all sub-clusters irrespective of class. The proposed method is validated on 9 publicly available data sets and compared with three existing oversampling techniques that rely on the spatial location of minority class examples in the Euclidean feature space. The experimental results show the proposed method to be statistically significantly superior to the other methods in terms of various accuracy measures. Thus, the proposed method can serve as a good alternative for handling problem domains such as credit scoring, customer churn prediction, and financial distress that typically involve imbalanced data sets.
Keywords: classification, imbalanced dataset, Lowner-John ellipsoid, model based clustering, oversampling
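A minimal sketch of the within-class balancing idea: the minority class is split into sub-clusters (given here by hand, rather than found by model-based clustering as in the paper), and the smaller sub-cluster is oversampled by jittered duplication until the sub-clusters are balanced. The points are invented, and the paper's complexity-based allocation and Lowner-John ellipsoid refinement are not reproduced.

```python
import random

random.seed(42)

# Two hypothetical sub-clusters of the minority class in 2-D feature space;
# sub_b is the under-represented sub-concept.
sub_a = [(0.10, 0.20), (0.20, 0.10), (0.15, 0.25),
         (0.30, 0.20), (0.25, 0.15), (0.20, 0.30)]
sub_b = [(0.80, 0.90), (0.90, 0.85)]

def oversample(cluster, target):
    """Grow a sub-cluster to `target` points by duplicating with jitter."""
    out = list(cluster)
    while len(out) < target:
        x, y = random.choice(cluster)
        out.append((x + random.uniform(-0.02, 0.02),
                    y + random.uniform(-0.02, 0.02)))
    return out

# Balance the sub-clusters so neither dominates the classifier's error.
target = max(len(sub_a), len(sub_b))
balanced_b = oversample(sub_b, target)
```

After balancing, both sub-concepts contribute equally many examples, which is the effect that stops the bigger sub-cluster from dominating the total training error.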
Procedia PDF Downloads 418
23226 Assessment of Cardioprotective Effect of Deferiprone on Doxorubicin-Induced Cardiac Toxicity in a Rat Model
Authors: Sadaf Kalhori
Abstract:
Introduction: Doxorubicin (DOX)-induced cardiotoxicity is widely known as the most severe complication of anthracycline-based chemotherapy in patients with cancer. It is unknown whether Deferiprone (DFP), could reduce the severity of DOX-induced cardiotoxicity by inhibiting free radical reactions. Thus, this study was performed to assess the protective effect of Deferiprone on DOX-induced cardiotoxicity in a rat model. Methods: The rats were divided into five groups. Group one was a control group. Group 2 was DOX (2 mg/kg/day, every other day for 12 days), and Group three to five which receiving DOX as in group 2 and DFP 75,100 and 150 mg/kg/day, for 19 days, respectively. DFP was starting 5 days prior to the first DOX injection and two days after the last DOX injection throughout the study. Electrocardiographic and hemodynamic studies, along with histopathological examination, were conducted. In addition, serum sample was taken and total cholesterol, Malone dialdehyde, triglyceride, albumin, AST, ALT, total protein, lactate dehydrogenase, total anti-oxidant and creatine kinase were assessed. Result: Our results showed the normal structure of endocardial, myocardial and pericardial in the control group. Pathologic data such as edema, hyperemia, bleeding, endocarditis, myocarditis and pericarditis, hyaline degeneration, cardiomyocyte necrosis, myofilament degeneration and nuclear chromatin changes were assessed in all groups. In the DOX group, all pathologic data was seen with mean grade of 2±1.25. In the DFP group with a dose of 75 and 100 mg, the mean grade was 1.41± 0.31 and 1±.23, respectively. In DFP group with a dose of 150, the pathologic data showed a milder change in comparison with other groups with e mean grade of 0.45 ±0.19. Most pathologic data in DFP groups showed significant changes in comparison with the DOX group (p < 0.001). 
Discussion: The results also showed that DFP treatment significantly ameliorated DOX-induced heart damage and structural changes in the myocardium, and improved ventricular function. Our data confirm that DFP is protective against cardiovascular disorders induced by DOX. Clinical studies are needed to examine these findings in humans.Keywords: cardiomyopathy, deferiprone, doxorubicin, rat
Procedia PDF Downloads 14223225 Assessment of Rainfall Erosivity, Comparison among Methods: Case of Kakheti, Georgia
Authors: Mariam Tsitsagi, Ana Berdzenishvili
Abstract:
Rainfall intensity change is one of the main indicators of climate change. It has a great influence on agriculture as one of the main factors causing soil erosion. Splash and sheet erosion are among the most prevalent forms and the most harmful for agriculture: invisible to the eye at first, the process gradually progresses to stream-cutting erosion. Our study assesses rainfall erosivity potential in the Kakheti region using modern research methods. The region is the country's major producer of wheat and wine. Kakheti is located in the eastern part of Georgia and is characterized by quite a variety of natural conditions. The climate is dry subtropical. Assessing the exact rainfall erosion potential requires several years of rainfall data recorded at short intervals. Unfortunately, of the 250 meteorological stations that operated during the Soviet period, only 55 are active now, 5 of them in the Kakheti region. Data on rainfall intensity in this region exist since 1936, and rainfall erosive potential was assessed in some older papers, but since 1990 there are no data on this factor, which is a necessary parameter for determining rainfall erosivity potential. On the other hand, researchers and local communities suppose that rainfall intensity has been changing and the number of haily days has been increasing. Therefore, finding a method that allows us to determine rainfall erosivity potential as accurately as possible in the Kakheti region is very important. The study period was divided into three sections: 1936-1963, 1963-1990 and 1990-2015. Rainfall erosivity potential for the first two periods was determined from the scientific literature and old meteorological stations' data. It is known that in eastern Georgia, at the boundary between the steppe and forest zones, rainfall erosivity in 1963-1990 was 20-75% higher than in 1936-1963. 
For the third period (1990-2015), we have no rainfall intensity data. A variety of studies discuss alternative ways of calculating rainfall erosivity potential when such data are lacking, e.g. based on daily rainfall data, or on average annual rainfall data and the elevation of the area. It should be noted that these methods give totally different results under different climatic conditions, and in some cases huge errors. Three of the most common methods were selected for our research. Each was tested on the first two sections of the study period. Based on the outcomes, the method most suitable for the regional climatic conditions was selected, and we then used it to determine rainfall erosivity potential for the third section of the study period. Outcome data such as attribute tables and graphs were linked to the database of Kakheti, and appropriate thematic maps were created. The results allowed us to analyze changes in rainfall erosivity potential from 1936 to the present and to project future trends. We have successfully implemented a method that can also be used for other regions of Georgia.Keywords: erosivity potential, Georgia, GIS, Kakheti, rainfall
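One widely used proxy of the kind alluded to above, which needs only monthly precipitation totals rather than intensity records, is the Modified Fournier Index (MFI). A minimal sketch follows; the choice of this particular index is illustrative, not necessarily the method the authors selected:

```python
def modified_fournier_index(monthly_mm):
    # MFI = sum(p_i^2) / P over the 12 monthly totals p_i,
    # where P is the annual total precipitation (Arnoldus-type
    # proxy for rainfall erosivity when intensity data are missing).
    assert len(monthly_mm) == 12, "expects 12 monthly totals in mm"
    annual = sum(monthly_mm)
    return sum(p * p for p in monthly_mm) / annual
```

With perfectly even rainfall the MFI equals the mean monthly total; the more concentrated the rainfall, the higher the index, which is why it serves as an erosivity proxy.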
Procedia PDF Downloads 22423224 Pragmatic Development of Chinese Sentence Final Particles via Computer-Mediated Communication
Authors: Qiong Li
Abstract:
This study investigated under what conditions computer-mediated communication (CMC) could promote pragmatic development. The focal features were four Chinese sentence final particles (SFPs): a, ya, ba, and ne. They occur frequently in Chinese and function as mitigators to soften the tone of speech. However, L2 acquisition of SFPs is difficult, suggesting the necessity of additional exposure to, or explicit instruction on, Chinese SFPs. This study follows this line and explores two research questions: (1) Is CMC combined with data-driven instruction more effective than CMC alone in promoting L2 Chinese learners’ SFP use? (2) How does L2 Chinese learners’ SFP use change over time, compared to the production of native Chinese speakers? The study involved 19 intermediate-level learners of Chinese enrolled at a private American university. They were randomly assigned to two groups: (1) the control group (N = 10), which was exposed to SFPs through CMC alone, and (2) the treatment group (N = 9), which was exposed to SFPs via CMC and data-driven instruction. Learners interacted with native speakers on given topics through text-based CMC over Skype. Both groups went through six 30-minute CMC sessions on a weekly basis, with a one-week interval after the first two CMC sessions and a two-week interval after the second two CMC sessions (nine weeks in total). The treatment group additionally received data-driven instruction after the first two sessions. Data analysis focused on three indices: token frequency, type frequency, and acceptability of SFP use. Token frequency was operationalized as the raw occurrence of SFPs per clause. Type frequency was the range of SFPs used. Acceptability was rated by two native speakers using a rating rubric. The results showed that the treatment group made noticeable progress over time on all three indices. Its production of SFPs approximated the native-like level. In contrast, the control group only slightly improved on token frequency. 
Only certain SFPs (a and ya) reached the native-like use. Potential explanations for the group differences were discussed in two aspects: the property of Chinese SFPs and the role of CMC and data-driven instruction. Though CMC provided the learners with opportunities to notice and observe SFP use, as a feature with low saliency, SFPs were not easily noticed in input. Data-driven instruction in the treatment group directed the learners’ attention to these particles, which facilitated the development.Keywords: computer-mediated communication, data-driven instruction, pragmatic development, second language Chinese, sentence final particles
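The token- and type-frequency indices described above are simple counts over clauses; a minimal sketch of how they might be computed from tokenised data (the pinyin tokens and helper name are illustrative, not the study's actual coding scheme):

```python
SFPS = {"a", "ya", "ba", "ne"}  # the four sentence final particles studied

def sfp_indices(clauses):
    # clauses: list of tokenised clauses (lists of word strings).
    # Token frequency = raw SFP occurrences per clause;
    # type frequency = number of distinct SFPs used.
    tokens = [w for clause in clauses for w in clause if w in SFPS]
    return len(tokens) / len(clauses), len(set(tokens))
```

Acceptability, the third index, is a human-rated judgment and has no counting analogue.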
Procedia PDF Downloads 41823223 Forecasting Cancers Cases in Algeria Using Double Exponential Smoothing Method
Authors: Messis A., Adjebli A., Ayeche R., Talbi M., Tighilet K., Louardiane M.
Abstract:
Cancers are the second cause of death worldwide. The prevalence and incidence of cancers are increasing with population aging and growth. This study aims to predict and model the evolution of breast, colorectal, lung, bladder and prostate cancers over the period 2014-2019. Data were analyzed using time series analysis with the double exponential smoothing method to forecast the future pattern. Minitab statistical software version 17 was used to describe and fit the appropriate models. Between 2014 and 2019, the overall trend in the raw number of new cancer cases registered has been increasing over time. Our forecast model is validated by its good prediction for 2020; data were not available for 2021 and 2022. Time series analysis showed that double exponential smoothing is an efficient tool to model future data on the raw number of new cancer cases.Keywords: cancer, time series, prediction, double exponential smoothing
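Double exponential smoothing (Holt's linear method) maintains a level and a trend estimate and extrapolates both to forecast. A minimal sketch; the smoothing constants and series values here are illustrative, not the values fitted by Minitab:

```python
def holt_forecast(series, alpha, beta, horizon):
    # Double exponential smoothing (Holt's linear trend method).
    # alpha smooths the level, beta smooths the trend.
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        prev_level = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    # h-step-ahead forecast: last level plus h times the last trend
    return [level + (h + 1) * trend for h in range(horizon)]
```

On a perfectly linear series the method reproduces the line exactly, e.g. `holt_forecast([10, 12, 14, 16, 18], 0.5, 0.5, 2)` extrapolates to 20 and 22.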
Procedia PDF Downloads 8923222 Mutual Information Based Image Registration of Satellite Images Using PSO-GA Hybrid Algorithm
Authors: Dipti Patra, Guguloth Uma, Smita Pradhan
Abstract:
Registration is a fundamental task in image processing. It is used to transform different sets of data into one coordinate system, where the data are acquired at different times, from different viewing angles, and/or by different sensors. Registration geometrically aligns two images (the reference and target images). Registration techniques are applied to satellite images, where they are important for comparing or integrating data obtained from these different measurements. In this work, mutual information is considered as the similarity metric for registration of satellite images. The transformation is assumed to be rigid. An attempt has been made here to optimize the transformation function. The proposed hybrid PSO-GA image registration technique incorporates the notions of Particle Swarm Optimization and the Genetic Algorithm and is used to find optimal values of the transformation parameters. Performance comparison in experiments on satellite images shows that the proposed hybrid PSO-GA algorithm outperforms the other algorithms in terms of mutual information and registration accuracy.Keywords: image registration, genetic algorithm, particle swarm optimization, hybrid PSO-GA algorithm and mutual information
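Mutual information between two images can be estimated from their joint intensity histogram; the registration search then maximises this quantity over the rigid-transformation parameters. A minimal sketch of the metric itself (flattened intensity lists stand in for images; the optimizer layer is omitted):

```python
import math
from collections import Counter

def mutual_information(img_a, img_b):
    # MI = sum over intensity pairs (a, b) of
    #   p(a, b) * log2( p(a, b) / (p(a) * p(b)) ),
    # estimated from the joint histogram of two equally sized,
    # pixel-aligned images given as flat intensity lists.
    n = len(img_a)
    joint = Counter(zip(img_a, img_b))
    pa, pb = Counter(img_a), Counter(img_b)
    mi = 0.0
    for (a, b), count in joint.items():
        p_ab = count / n
        # p_ab / (p_a * p_b) simplifies to count * n / (pa[a] * pb[b])
        mi += p_ab * math.log2(count * n / (pa[a] * pb[b]))
    return mi
```

Identical images give MI equal to their entropy, and independent images give 0, which is why MI peaks when the target is correctly aligned with the reference.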
Procedia PDF Downloads 40823221 Social Movements of Central-Eastern Europe: Examining Trends of Cooperation and Antagonism by Using Big Data
Authors: Reka Zsuzsanna Mathe
Abstract:
Globalization and Europeanization have significantly contributed to a change in the role of nation-states. The global economic crisis, climate change, and the recent refugee crisis are just a few among many challenges that cannot be effectively addressed through the traditional role of the nation-state. One of the main roles of states is to solve collective action problems; however, due to their changing roles, this is apparently getting more and more difficult. Depending on political culture, collective action problems are solved either through cooperation or through conflict. The political culture of Central and Eastern European (CEE) countries is marked by low civic participation and a weak civil society. In this type of culture, collective action problems are likely to be resolved through conflict rather than the democratic process of dialogue, and any social change is probably introduced by social movements. Several studies have been conducted on the social movements of the CEE countries; yet it is still not clear whether the most significant social movements of the region tend to choose a cooperative or a conflictual action strategy. This study differentiates between a national and a European action field, which have different social orders. The actors of the two fields are civil society members, broadly understood, conceptualized as social movements. This research tries to answer the following questions: a) What are the norms that best characterize the CEE countries’ social order? b) What types of actors would prefer a change, and in which areas? c) Is there a significant difference between the main actors active in the national versus the European field? The main hypotheses are that conflicting norms define the national and the European action fields, and that there is a significant difference between the action strategies adopted by social movements acting in the two fields. 
In mapping the social order, the study uses data provided by the European Social Survey. Big data from the Global Database of Events, Language and Tone (GDELT) offer information regarding the main social movements and their preferred types of action. The units of analysis are the so-called ‘Visegrad 4’ countries: Poland, the Czech Republic, Slovakia and Hungary, and the research uses data from 2005 (after the European accession of these four countries) until May 2017. According to the data, the main hypotheses were confirmed.Keywords: big data, Central and Eastern Europe, civil society, GDELT, social movements
Procedia PDF Downloads 16123220 Vascular Foramina of the Capitate Bone of the Hand – an Anatomical Study
Authors: Latha V. Prabhu, B.V. Murlimanju, P.J. Jiji, Mangala M. Pai
Abstract:
Background: The capitate is the largest of the carpal bones. No literature exists on the vascular foramina of the capitate bone. The objective of the present study was to investigate the morphology and number of the nutrient foramina in dried cadaveric capitate bones of the Indian population. Methods: The present study included 59 capitate bones (25 right-sided and 34 left-sided) obtained from the gross anatomy laboratory of our institution. The bones were macroscopically observed for nutrient foramina and data were collected on their number. The data were tabulated and analyzed. Results: All of our specimens (100%) exhibited nutrient foramina over the non-articular and articular surfaces. The foramina were observed on the medial, lateral, palmar and dorsal surfaces of the capitate bones and ranged from 6 to 23 per bone. On the medial surface, the foramina ranged from 1 to 6; on the lateral surface, from 0 to 7; and on the palmar surface, from 0 to 5. Most of the foramina, however, were located on the dorsal surface, ranging from 3 to 11. Conclusion: We believe that the present study has provided additional data on the nutrient foramina of the capitate bones. These data are enlightening to the orthopedic surgeon and would help in hand surgeries. Knowledge of the foramina is also important to radiologists to prevent misinterpretation of findings on X-ray films and computed tomography scans, as the foramina may mimic erosions and ossicles. Morphological knowledge of the vasculature, the foramina of entry and their number is required to understand the concepts in avascular necrosis of the capitate.Keywords: avascular necrosis, capitate, morphology, nutrient foramen
Procedia PDF Downloads 34423219 Development and Validation of a Semi-Quantitative Food Frequency Questionnaire for Use in Urban and Rural Communities of Rwanda
Authors: Phenias Nsabimana, Jérôme W. Some, Hilda Vasanthakaalam, Stefaan De Henauw, Souheila Abbeddou
Abstract:
Tools for dietary assessment in adults are limited in low- and middle-income settings. The objective of this study was to develop a semi-quantitative food frequency questionnaire (FFQ) and validate it against the multiple-pass 24-h recall tool for use in urban and rural Rwanda. A total of 212 adults (154 females and 58 males), aged 18-49, including 105 urban and 107 rural residents from the four regions of Rwanda, were recruited for the present study. A multiple-pass 24-h recall technique was used to collect dietary data in both urban and rural areas in four rounds on different days (one weekday and one weekend day), separated by periods of three months, from November 2020 to October 2021. Details of all foods and beverages consumed over the 24-h period of the day prior to the interview were collected during face-to-face interviews. A list of foods, beverages and commonly consumed recipes was developed by the study researchers and ten research assistants from the different regions of Rwanda. Non-standard recipes were collected when the information was available. A single semi-quantitative FFQ was also developed by the same group prior to the beginning of data collection. The FFQ was administered at the beginning and the end of the data collection period. Data were collected digitally. The amount of energy and macronutrients contributed by each food, recipe and beverage will be computed based on the nutrient composition reported in food composition tables and the weight consumed. Median energy and nutrient intakes from the FFQ and the 24-hour recalls, and median differences (24-hour recall minus FFQ), will be calculated. Kappa, Spearman, Wilcoxon and Bland-Altman plot statistics will be used to evaluate the agreement between the estimated nutrient and energy intakes found by the two methods. Differences will be tested for significance and all analyses will be done with Stata 11. 
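Of the validation statistics listed, the Bland-Altman analysis reduces to a bias (mean difference) and 95% limits of agreement on the paired differences between the two methods. A minimal sketch; the function name and example values are illustrative:

```python
import math

def bland_altman(method_a, method_b):
    # Bias and 95% limits of agreement between two paired
    # measurement methods (e.g. FFQ vs. 24-h recall intakes).
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    # sample standard deviation of the differences
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

The Bland-Altman plot then charts each pair's difference against its mean, with the bias and the two limits drawn as horizontal lines.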
Data collection was completed in November 2021. Data cleaning is ongoing and data analysis is expected to be completed by July 2022. A developed and validated semi-quantitative FFQ will then be available for dietary assessment. It will help researchers collect reliable data to support policy makers in planning appropriate dietary change interventions in Rwanda.Keywords: food frequency questionnaire, reproducibility, 24-H recall questionnaire, validation
Procedia PDF Downloads 14123218 A Study of Variables Affecting on a Quality Assessment of Mathematics Subject in Thailand by Using Value Added Analysis on TIMSS 2011
Authors: Ruangdech Sirikit
Abstract:
The purpose of this research was to study the variables affecting the quality assessment of the mathematics subject in Thailand using value-added analysis on TIMSS 2011. The data used were secondary data from the 2011 Trends in International Mathematics and Science Study (TIMSS), collected from 6,124 students in 172 schools in Thailand, covering the mathematics subject only. The data were based on 14 assessment tests of mathematics knowledge. Data analysis proceeded in three steps: 1) descriptive statistics were analyzed; 2) student competency was estimated from the assessment of mathematics proficiency using the MULTILOG program; and 3) value added was analyzed in the quality assessment model using a Value-Added Model with Hierarchical Linear Modeling (HLM) at two levels of analysis. The research results were as follows: 1. Student-level variables that had significant effects on student competency at the .01 level were parental care, resources at home, enjoyment of learning mathematics and extrinsic motivation in learning mathematics. Variables that had significant effects on student competency at the .05 level were parents' education and self-confidence in learning mathematics. 2. The school-level variable that had a significant effect on student competency at the .01 level was extra-large school size. The variable that had a significant effect at the .05 level was medium school size.Keywords: quality assessment, value-added model, TIMSS, mathematics, Thailand
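A generic two-level specification of the kind HLM fits here, with a student-level predictor $X_{ij}$ and a school-level predictor $W_j$, can be written as follows (the symbols are textbook notation, not the study's actual variable names):

```latex
\begin{align*}
\text{Level 1 (student $i$ in school $j$):}\quad
  & Y_{ij} = \beta_{0j} + \beta_{1j} X_{ij} + e_{ij},
    \qquad e_{ij} \sim N(0, \sigma^2) \\
\text{Level 2 (school $j$):}\quad
  & \beta_{0j} = \gamma_{00} + \gamma_{01} W_j + u_{0j},
    \qquad u_{0j} \sim N(0, \tau_{00}) \\
  & \beta_{1j} = \gamma_{10} + u_{1j}
\end{align*}
```

The school random effect $u_{0j}$, net of the student- and school-level predictors, is what a value-added reading of the model interprets as the school's contribution.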
Procedia PDF Downloads 28323217 Modeling Average Paths Traveled by Ferry Vessels Using AIS Data
Authors: Devin Simmons
Abstract:
At the USDOT’s Bureau of Transportation Statistics, a biannual census of ferry operators in the U.S. is conducted, with results such as route mileage used to determine federal funding levels for operators. AIS data make it possible to use GIS software and geographical methods to confirm operator-reported mileage for individual ferry routes. As part of the USDOT’s work on the ferry census, an algorithm was developed that uses AIS data for ferry vessels in conjunction with known ferry terminal locations to model the average route travelled, for use both as a cartographic product and as confirmation of operator-reported mileage. AIS data from each vessel are first analyzed to determine individual journeys based on the vessel’s velocity and changes in velocity over time. These trips are then converted to geographic linestring objects. Using the terminal locations, the algorithm then determines whether each trip represents a known ferry route. Given a large enough dataset, routes will be represented by multiple trip linestrings, which are then filtered by DBSCAN spatial clustering to remove outliers. Finally, the remaining trips are averaged into one route. The algorithm interpolates the start point of each trip linestring; from these start points a centroid is calculated, which becomes the first point of the average route. Each trip is interpolated again to find the point that represents one percent of the journey’s completion, and the centroid of those points is used as the next point in the average route, and so on until 100 points have been calculated. Routes created using this algorithm have shown demonstrable improvement over previous methods, which included the implementation of a LOESS model. Additionally, the algorithm greatly reduces the amount of manual digitizing needed to visualize ferry activity.Keywords: ferry vessels, transportation, modeling, AIS data
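The percentile-centroid averaging step can be sketched in a few lines. Distances here are planar for simplicity, whereas a real pipeline would work on projected coordinates; the function names and the 3-point example are illustrative:

```python
import math

def point_at_fraction(line, frac):
    # Point at fraction `frac` (0..1) of a polyline's total length.
    seg_lens = [math.dist(p, q) for p, q in zip(line, line[1:])]
    target = frac * sum(seg_lens)
    for (p, q), seg in zip(zip(line, line[1:]), seg_lens):
        if target <= seg:
            t = target / seg if seg else 0.0
            return (p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1]))
        target -= seg
    return (float(line[-1][0]), float(line[-1][1]))

def average_route(trips, n_points=101):
    # At each fraction of journey completion, take the centroid of the
    # corresponding interpolated points across all trip polylines.
    route = []
    for i in range(n_points):
        frac = i / (n_points - 1)
        pts = [point_at_fraction(trip, frac) for trip in trips]
        route.append((sum(p[0] for p in pts) / len(pts),
                      sum(p[1] for p in pts) / len(pts)))
    return route
```

For two parallel straight trips, the averaged route runs midway between them, which matches the intuition behind the percentile-centroid construction.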
Procedia PDF Downloads 17623216 Power Transformer Risk-Based Maintenance by Optimization of Transformer Condition and Transformer Importance
Authors: Kitti Leangkrua
Abstract:
This paper presents a risk-based maintenance strategy for power transformers in order to optimize operating and maintenance costs. The methodology involves the preparation of a database collecting the technical data and test data of each power transformer. The overall condition of each transformer is evaluated by a program developed from the measured results, including the calculation of each main component's contribution to the overall condition of the transformer (%HI), along with criteria for evaluating the importance (%ImI) of each location where a transformer is installed. The condition assessment is performed by analyzing test data such as electrical tests, insulating oil tests and visual inspection. The condition of each power transformer is classified from very poor to very good. Importance is evaluated from load criticality, importance of load and failure consequence. A risk matrix is developed for evaluating the risk of each power transformer, and the highest-risk transformers are addressed first. A computerized program was developed for practical use, so that the maintenance strategy for a power transformer can be effectively managed.Keywords: asset management, risk-based maintenance, power transformer, health index
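A risk matrix of this kind boils down to binning the condition and importance scores and combining the bins. A minimal sketch; the thresholds, bin counts and category labels are illustrative assumptions, not the utility's actual criteria:

```python
def risk_category(health_index, importance_index):
    # Map condition (%HI, higher = better condition) and importance
    # (%ImI, higher = more critical) onto a simple 3x3 risk matrix.
    # Thresholds below are illustrative placeholders.
    condition_risk = 0 if health_index >= 70 else (1 if health_index >= 40 else 2)
    importance = 0 if importance_index < 40 else (1 if importance_index < 70 else 2)
    score = condition_risk + importance  # 0 (best) .. 4 (worst)
    return ("low", "medium", "medium", "high", "high")[score]
```

A healthy transformer at an unimportant site lands in the low-risk cell, while a degraded transformer at a critical site lands in the high-risk cell and is maintained first.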
Procedia PDF Downloads 30623215 Transition Pay vs. Liquidity Holdings: A Comparative Analysis on Consumption Smoothing using Bank Transaction Data
Authors: Nora Neuteboom
Abstract:
This study investigates household financial behavior during unemployment spells in the Netherlands using high-frequency transaction data, through an event study specification integrating propensity score matching. In our specification, treated individuals, who underwent job loss, are contrasted with non-treated individuals possessing comparable financial characteristics. The initial onset of unemployment triggers a substantial surge in income, primarily attributable to transition payments, but income drops swiftly post-unemployment, with unemployment benefits covering slightly over half of former salary earnings. Despite a re-employment rate of around one half within six months, the treatment group experiences a persistent average earnings reduction of approximately 600 EUR per month. Spending patterns fluctuate significantly, surging before unemployment due to transition payments and declining below those of non-treated individuals post-unemployment, indicating that households cannot fully smooth consumption after job loss. Furthermore, our study disentangles the effects of transition payments and liquidity holdings on spending, revealing that transition payments exert a more pronounced and prolonged impact on consumption smoothing than liquidity holdings. Transition payments significantly stimulate spending, particularly in the pin and iDEAL categories, in contrast to the much smaller relative spending impact of liquidity holdings.Keywords: household consumption, transaction data, big data, propensity score matching
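The matching step can be sketched as greedy 1:1 nearest-neighbour matching on previously estimated propensity scores (e.g. fitted by a logistic regression of treatment on the financial characteristics); the caliper value and function name here are illustrative:

```python
def nearest_neighbor_match(treated_ps, control_ps, caliper=0.05):
    # Greedy 1:1 nearest-neighbour matching on propensity scores,
    # without replacement, restricted to a caliper. Returns a dict
    # mapping treated index -> matched control index.
    matches = {}
    available = dict(enumerate(control_ps))
    for i, p in enumerate(treated_ps):
        if not available:
            break
        j, q = min(available.items(), key=lambda kv: abs(kv[1] - p))
        if abs(q - p) <= caliper:
            matches[i] = j
            del available[j]  # without replacement
    return matches
```

Treated units with no control inside the caliper simply remain unmatched, which trades sample size for comparability of the two groups.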
Procedia PDF Downloads 2123214 System Dietadhoc® - A Fusion of Human-Centred Design and Agile Development for the Explainability of AI Techniques Based on Nutritional and Clinical Data
Authors: Michelangelo Sofo, Giuseppe Labianca
Abstract:
In recent years, the scientific community's interest in exploratory analysis of biomedical data has increased exponentially. In the field of nutritional biology, the curative process, based on the analysis of clinical data, is a very delicate operation, because there are multiple solutions for managing diet-related pathologies (for example intolerances and allergies, cholesterol metabolism management, diabetic pathologies, arterial hypertension, and even obesity and breathing and sleep problems). In this research work, a system was therefore created that is capable of evaluating various dietary regimes for specific patient pathologies. The system is founded on a mathematical-numerical model and has been tailored to the real working needs of an expert in human nutrition using human-centered design (ISO 9241-210); it therefore keeps step with continuous scientific progress in the field and evolves through the experience of managed clinical cases (a machine learning process). DietAdhoc® is a decision support system for nutrition specialists treating patients of both sexes (from 18 years of age), developed with an agile methodology. Its task is to draw up the biomedical and clinical profile of the specific patient by applying two algorithmic optimization approaches to nutritional data, plus a symbolic solution obtained by transforming the relational database underlying the system into a deductive database. For all three solution approaches, particular emphasis has been given to the explainability of the suggested clinical decisions through flexible and customizable user interfaces. 
Furthermore, the system has multiple software modules based on time series and visual analytics techniques that allow evaluation of the complete picture of the situation and the evolution of the diet assigned for specific pathologies.Keywords: medical decision support, physiological data extraction, data driven diagnosis, human centered AI, symbiotic AI paradigm
Procedia PDF Downloads 24