Search results for: data center
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25104

23124 Educational Leadership and Artificial Intelligence

Authors: Sultan Ghaleb Aldaihani

Abstract:

- The environment in which educational leadership takes place is becoming increasingly complex due to factors like globalization and rapid technological change.
- This is creating a "leadership gap" where the complexity of the environment outpaces the ability of leaders to effectively respond.
- Educational leadership involves guiding teachers and the broader school system towards improved student learning and achievement.

2. Implications of Artificial Intelligence (AI) in Educational Leadership:
- AI has great potential to enhance education, such as through intelligent tutoring systems and automating routine tasks to free up teachers.
- AI can also have significant implications for educational leadership by providing better information and data-driven decision-making capabilities.
- Computer-adaptive testing can provide detailed, individualized data on student learning that leaders can use for instructional decisions and accountability.

3. Enhancing Decision-Making Processes:
- Statistical models and data mining techniques can help identify at-risk students earlier, allowing for targeted interventions.
- Probability-based models can diagnose students likely to drop out, enabling proactive support.
- These data-driven approaches can make resource allocation and decision-making more effective.

4. Improving Efficiency and Productivity:
- AI systems can automate tasks and change processes to improve the efficiency of educational leadership and administration.
- Integrating AI can free up leaders to focus more on their role's human, interactive elements.

Keywords: Education, Leadership, Technology, Artificial Intelligence

Procedia PDF Downloads 41
23123 Identification of CLV for Online Shoppers Using RFM Matrix: A Case Based on Features of B2C Architecture

Authors: Riktesh Srivastava

Abstract:

Online shopping has seen an astonishing evolution in the last few years, and it is now apparent that the B2C architecture is becoming a progressively important channel even for traditional brick-and-mortar traders. In this competition, knowing customers and predicting their behavior are extremely important. More importantly, when a customer logs onto the B2C architecture, traces of their buying patterns can be stored and used for future predictions. Such a prediction is called Customer Lifetime Value (CLV). Earlier, Net Present Value was used for this purpose; however, it ignores two important aspects of the B2C architecture: market risks and the large volume of customer data. Here, RFM (Recency, Frequency and Monetary Value) is used to estimate the CLV, and, as the term exemplifies, market risk is well covered. Big data analysis is also accommodated in RFM, which gives a real exploration of the data and leads to a better estimation of future cash flow from customers. In the present paper, 6 factors (collected from varied sources) are used to determine what attracts customers to the B2C architecture. For these 6 factors, RFM is computed for 3 years (2013, 2014 and 2015) respectively. CLV and Revenue are the two parameters defined using RFM analysis, which gives a clear picture of the future predictions.
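
For illustration, a minimal sketch of how RFM values and a rough CLV proxy might be derived from a transaction log is shown below; the column names, sample values and the CLV proxy formula are assumptions for this example, not the paper's method.

```python
import pandas as pd

# Hypothetical transaction log (names and values are illustrative only).
tx = pd.DataFrame({
    "customer_id": [1, 1, 2, 3, 3, 3],
    "date": pd.to_datetime(["2015-01-05", "2015-06-20", "2014-11-02",
                            "2015-03-14", "2015-07-01", "2015-12-24"]),
    "amount": [120.0, 80.0, 300.0, 45.0, 60.0, 150.0],
})

snapshot = tx["date"].max() + pd.Timedelta(days=1)

# Recency = days since last purchase, Frequency = number of purchases,
# Monetary = total spend, aggregated per customer.
rfm = tx.groupby("customer_id").agg(
    recency=("date", lambda d: (snapshot - d.max()).days),
    frequency=("date", "count"),
    monetary=("amount", "sum"),
)

# A very rough CLV proxy: total spend weighted by frequency and discounted by recency.
rfm["clv_proxy"] = rfm["monetary"] * rfm["frequency"] / (1 + rfm["recency"] / 365)
print(rfm)
```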

Keywords: CLV, RFM, revenue, recency, frequency, monetary value

Procedia PDF Downloads 219
23122 Towards a Quantification of the Wind Erosion of the Gharb Shoreline Soils in Morocco by the Application of a Mathematical Model

Authors: Mohammed Kachtali, Imad Fenjiro, Jamal Alkarkouri

Abstract:

Wind erosion is a serious environmental problem in arid and semi-arid regions. It easily removes the finest particles of the soil surface, which also contributes to the loss of soil fertility. The siltation of infrastructure and cultivated areas and the negative impact on health are additional consequences of wind erosion. In Morocco, wind erosion is the main factor of silting on the coast and in the Sahara. The aim of our study is to use a wind erosion equation to estimate soil losses by wind erosion on the Gharb coast (north of Morocco). The equation used in our model combines geographic data, 30 years of climatic data, and edaphic data collected from the study area, which comprised 11 crossings of 4 stations. Our results show that wind erosion values are high and differ significantly between some crossings (p < 0.001). This difference is explained by topography, soil texture, and climate. In conclusion, wind erosion is high on the Gharb coast and varies from one station to another; this problem requires several methods of control and mitigation.
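
The abstract does not state which wind erosion equation the model applies; one classical candidate for models of this kind is the Wind Erosion Equation (WEQ) of Woodruff and Siddoway (1965), whose functional form is sketched below purely for orientation.

```latex
% Functional form of the classical Wind Erosion Equation (WEQ),
% shown only as an example of the type of equation such models use.
E = f(I', K', C', L', V')
% where
% E  = potential annual soil loss,
% I' = soil erodibility index,
% K' = soil ridge roughness factor,
% C' = climatic factor (wind speed and surface soil moisture),
% L' = unsheltered field length along the prevailing wind direction,
% V' = equivalent vegetative cover.
```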

Keywords: Gharb coast, modeling, silting, wind erosion

Procedia PDF Downloads 134
23121 Oil Reservoir Asphaltene Precipitation Estimation during CO2 Injection

Authors: I. Alhajri, G. Zahedi, R. Alazmi, A. Akbari

Abstract:

In this paper, an Artificial Neural Network (ANN) was developed to predict Asphaltene Precipitation (AP) during the injection of carbon dioxide into crude oil reservoirs. Experimental data from six different oil fields were collected. Seventy percent of the data was used to develop the ANN model, and different ANN architectures were examined. A network trained with the Levenberg-Marquardt (trainlm) algorithm was found to be the best network to estimate the AP. To check the validity of the proposed model, it was used to predict the AP for the remaining thirty percent of the data, which had not been used in training. The Mean Square Error (MSE) of the prediction was 0.0018, which confirms the excellent prediction capability of the proposed model. In the second part of this study, the ANN model predictions were compared with modified Hirschberg model predictions. The ANN was found to provide more accurate estimates than the modified Hirschberg model. Finally, the proposed model was employed to examine the effect of different operating parameters during gas injection on the AP. It was found that the AP is most sensitive to the reservoir temperature. Furthermore, increasing the carbon dioxide concentration in the liquid phase increases the AP.
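
A minimal sketch of the 70/30 train/test workflow described above, using synthetic data and scikit-learn's MLPRegressor as a stand-in for the MATLAB trainlm network used in the paper; the input variables, value ranges and network size are assumptions.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Synthetic stand-in for the field data: temperature (K), pressure (MPa) and
# CO2 mole fraction as inputs, asphaltene precipitation (wt%) as target.
X = rng.uniform([320.0, 10.0, 0.0], [420.0, 60.0, 0.8], size=(200, 3))
y = 0.02 * X[:, 0] + 0.05 * X[:, 2] * X[:, 1] + rng.normal(0, 0.1, 200)

# 70/30 split mirrors the paper's training/validation partition.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
model.fit(X_tr, y_tr)

print("test MSE:", mean_squared_error(y_te, model.predict(X_te)))
```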

Keywords: artificial neural network, asphaltene, CO2 injection, Hirschberg model, oil reservoirs

Procedia PDF Downloads 363
23120 Unearthing Air Traffic Control Officers Decision Instructional Patterns From Simulator Data for Application in Human Machine Teams

Authors: Zainuddin Zakaria, Sun Woh Lye

Abstract:

Despite continuous advancements in automated conflict resolution tools, the rate of adoption of automation by Air Traffic Control Officers (ATCOs) remains low. Trust in or acceptance of these tools and conformance to individual ATCO preferences in strategy execution for conflict resolution are two key factors that impact their use. This paper proposes a methodology to unearth and classify ATCO conflict resolution strategies from simulator data of trained and qualified ATCOs. The methodology involves the extraction of ATCO executive control actions and the establishment of a strategy resolution classification system based on ATCO radar commands and prevailing flight parameters in deconflicting a pair of aircraft. Six main strategies used to handle various categories of conflict were identified and discussed. It was found that ATCOs were about twice as likely to choose only vertical maneuvers in conflict resolution as horizontal maneuvers or a combination of both vertical and horizontal maneuvers.
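
A toy sketch of the kind of rule that could map extracted radar commands to the vertical/horizontal/combined categories discussed above; the field names and logic are hypothetical and far simpler than the paper's classification system.

```python
from dataclasses import dataclass

@dataclass
class RadarCommand:
    """Hypothetical record of an ATCO instruction issued during deconfliction."""
    delta_altitude_ft: float   # cleared altitude change
    delta_heading_deg: float   # heading vector change
    delta_speed_kt: float      # assigned speed change

def classify_strategy(cmds: list[RadarCommand]) -> str:
    """Very coarse strategy label: vertical, horizontal, or combined manoeuvre."""
    vertical = any(abs(c.delta_altitude_ft) > 0 for c in cmds)
    horizontal = any(abs(c.delta_heading_deg) > 0 or abs(c.delta_speed_kt) > 0
                     for c in cmds)
    if vertical and horizontal:
        return "combined"
    if vertical:
        return "vertical"
    if horizontal:
        return "horizontal"
    return "no action"

print(classify_strategy([RadarCommand(2000, 0, 0)]))  # -> "vertical"
```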

Keywords: air traffic control strategies, conflict resolution, simulator data, strategy classification system

Procedia PDF Downloads 147
23119 Spectral Re-Evaluation of the Magnetic Basement Depth over Yola Arm of Upper Benue Trough Nigeria Using Aeromagnetic Data

Authors: Emberga Terhemb Opara Alexander, Selemo Alexader, Onyekwuru Samuel

Abstract:

Aeromagnetic data have been used to re-evaluate parts of the Upper Benue Trough, Nigeria, using the spectral analysis technique in order to appraise the mineral accumulation potential of the area. The regional field was separated with a first-order polynomial using the polyfit program. The residual data were subdivided into 24 spectral blocks using the OASIS MONTAJ software program. Two prominent magnetic depth source layers were identified. The deeper source depth values obtained range from 1.56 km to 2.92 km, with an average depth of 2.37 km, taken as the magnetic basement depth, while for the shallower sources the depth values range from -1.17 km to 0.98 km, with an average depth of 0.55 km. The shallow depth source is attributed to the volcanic rocks that intruded the sedimentary formation, and this could possibly be responsible for the mineralization found in parts of the study area.
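
For context, a minimal sketch of the standard spectral depth estimate used in such analyses: the average source depth follows from the slope of the log radially averaged power spectrum (Spector-Grant relation). The numbers below are synthetic, not the paper's spectra.

```python
import numpy as np

def basement_depth_from_spectrum(wavenumber_cycles_per_km, log_power):
    """Average source depth from the slope of the log radially averaged power
    spectrum: depth = -slope / (4*pi), with wavenumber in cycles/km.
    Only the linear low-wavenumber segment should be passed in."""
    slope, _ = np.polyfit(wavenumber_cycles_per_km, log_power, 1)
    return -slope / (4.0 * np.pi)

# Illustrative synthetic spectrum consistent with a ~2.4 km deep source.
k = np.linspace(0.02, 0.2, 10)           # cycles/km
logp = -4.0 * np.pi * 2.4 * k + 1.0
print(round(basement_depth_from_spectrum(k, logp), 2), "km")
```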

Keywords: spectral analysis, Upper Benue Trough, magnetic basement depth, aeromagnetic

Procedia PDF Downloads 448
23118 Detect Circles in Image: Using Statistical Image Analysis

Authors: Fathi M. O. Hamed, Salma F. Elkofhaifee

Abstract:

The aim of this work is to detect geometrical objects in an image. In this paper, the object is considered to be a circle. The identification requires finding three characteristics: the number, size, and location of the objects. To achieve this goal, the paper presents an algorithm that combines statistical approaches with image analysis techniques. The algorithm has been implemented to meet the major objectives of this paper, evaluated using simulated data, where it yields good results, and then applied to real data.
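
A minimal sketch of the median-filter / threshold / segmentation pipeline suggested by the keywords, reporting the number, size and location of detected objects; the synthetic image and parameter choices are assumptions, not the paper's algorithm.

```python
import numpy as np
from scipy import ndimage

# Synthetic binary image with two filled circles (a stand-in for real data).
yy, xx = np.mgrid[0:200, 0:200]
raw = (((xx - 60) ** 2 + (yy - 70) ** 2) < 20 ** 2) | \
      (((xx - 150) ** 2 + (yy - 120) ** 2) < 12 ** 2)

# Median filter to suppress noise, then threshold to a clean binary mask.
mask = ndimage.median_filter(raw.astype(float), size=3) > 0.5

# Segment the mask and report number, location and size of the objects.
labels, n = ndimage.label(mask)
idx = range(1, n + 1)
centers = ndimage.center_of_mass(mask.astype(float), labels, idx)
areas = ndimage.sum(mask.astype(float), labels, idx)

print("objects found:", n)
for c, a in zip(centers, areas):
    print("center (row, col):", tuple(round(v, 1) for v in c),
          "radius ~", round(float(np.sqrt(a / np.pi)), 1), "px")
```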

Keywords: image processing, median filter, projection, scale-space, segmentation, threshold

Procedia PDF Downloads 430
23117 Effect of Packaging Treatment and Storage Condition on Stability of Low Fat Chicken Burger

Authors: Mohamed Ahmed Kenawi Abdallah

Abstract:

Chemical composition, cooking loss, shrinkage value, texture coefficient indices, Feder value, microbial examination, and sensory evaluation were carried out in order to examine the effect of adding 15% germinated quinoa seed flour as an extender to chicken wing meat to produce a low fat chicken burger, packaged in two different packaging materials and stored frozen for nine months. The data indicated a reduction in the moisture content and crude ether extract, and an increase in the ash content, pH value, and total acidity for the samples extended with quinoa flour compared with the control. The data showed that the samples extended with quinoa flour had the lowest TBA, cooking loss, and shrinkage values compared with the control. The data also revealed that the samples containing quinoa flour had lower total bacterial and psychrophilic bacterial counts than the control sample. In addition, they received higher scores for overall acceptability than the control.

Keywords: chicken wings, low fat chicken burger, quinoa flour, vacuum packaging

Procedia PDF Downloads 101
23116 An Evaluation Method of Accelerated Storage Life Test for Typical Mechanical and Electronic Products

Authors: Jinyong Yao, Hongzhi Li, Chao Du, Jiao Li

Abstract:

The reliability of long-term storage products is related to the availability of the whole system, and the evaluation of storage life is of great necessity. These products are usually highly reliable, and little failure information can be collected. In this paper, an analytical method based on data from an accelerated storage life test is proposed to evaluate the reliability index of long-term storage products. Firstly, singularities are eliminated by data normalization and residual analysis. Secondly, with the pre-processed data, the degradation path model is built to obtain the pseudo life values. Then, by a life distribution hypothesis, we obtain the parameter estimates at the high stress levels and verify failure mechanism consistency. Finally, the life distribution under the normal stress level is extrapolated via the acceleration model, and an evaluation of the true average life becomes available. An application example with a camera stabilization device is provided to illustrate the proposed methodology.
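
A compact sketch of the degradation-path / pseudo-life / extrapolation steps described above, using made-up data, a linear degradation path and an Arrhenius-type acceleration model as illustrative assumptions.

```python
import numpy as np

# Hypothetical accelerated-storage degradation data: one unit per stress level,
# degradation measured at several inspection times (values are illustrative).
times = np.array([0, 6, 12, 18, 24])            # months in storage
stress_temps_k = np.array([333.0, 353.0])       # two elevated temperatures (K)
paths = np.array([[0.0, 0.8, 1.7, 2.4, 3.3],    # degradation at 60 C
                  [0.0, 1.6, 3.1, 4.9, 6.4]])   # degradation at 80 C
threshold = 10.0                                # failure when degradation >= 10

# Step 1: fit a linear degradation path per unit and extrapolate the
# "pseudo life" (time at which the fitted path crosses the threshold).
pseudo_life = []
for y in paths:
    slope, intercept = np.polyfit(times, y, 1)
    pseudo_life.append((threshold - intercept) / slope)
pseudo_life = np.array(pseudo_life)

# Step 2: Arrhenius-type extrapolation of log-life versus 1/T down to the
# normal storage temperature (25 C).
coef = np.polyfit(1.0 / stress_temps_k, np.log(pseudo_life), 1)
normal_temp_k = 298.0
life_at_normal = np.exp(np.polyval(coef, 1.0 / normal_temp_k))
print("extrapolated storage life at 25 C: %.0f months" % life_at_normal)
```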

Keywords: accelerated storage life test, failure mechanisms consistency, life distribution, reliability

Procedia PDF Downloads 388
23115 Solving Dimensionality Problem and Finding Statistical Constructs on Latent Regression Models: A Novel Methodology with Real Data Application

Authors: Sergio Paez Moncaleano, Alvaro Mauricio Montenegro

Abstract:

This paper presents a novel statistical methodology for measuring and finding constructs in latent regression analysis. The approach uses the qualities of factor analysis for binary data with interpretations from Item Response Theory (IRT). In addition, based on the fundamentals of submodel theory and a convergence of many IRT ideas, we propose an algorithm not just to solve the dimensionality problem (nowadays an open discussion) but to open a new research field that promises fairer and more realistic qualifications for examinees and a revolution in IRT and educational research. In the end, the methodology is applied to a real data set, presenting impressive results in terms of coherence, speed and precision. Acknowledgments: This research was financed by Colciencias through the project 'Multidimensional Item Response Theory Models for Practical Application in Large Test Designed to Measure Multiple Constructs', and both authors belong to the SICS Research Group at Universidad Nacional de Colombia.
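
A small illustration of the dimensionality question for binary IRT data: simulate two-trait responses and inspect the eigenvalues of the inter-item correlation matrix. This eigenvalue check is a generic diagnostic, not the algorithm proposed in the paper, and all parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate binary responses from a two-dimensional 2PL-type IRT model
# (all parameters are illustrative, not from the paper).
n_persons, n_items = 1000, 20
theta = rng.normal(size=(n_persons, 2))                 # two latent traits
loadings = np.zeros((n_items, 2))
loadings[:10, 0] = rng.uniform(0.8, 1.5, 10)            # items 1-10 load on trait 1
loadings[10:, 1] = rng.uniform(0.8, 1.5, 10)            # items 11-20 on trait 2
difficulty = rng.normal(size=n_items)
prob = 1.0 / (1.0 + np.exp(-(theta @ loadings.T - difficulty)))
responses = (rng.uniform(size=prob.shape) < prob).astype(int)

# A crude dimensionality check: eigenvalues of the inter-item correlation
# matrix (two dominant eigenvalues suggest two constructs).
eigvals = np.linalg.eigvalsh(np.corrcoef(responses, rowvar=False))[::-1]
print("largest eigenvalues:", np.round(eigvals[:4], 2))
```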

Keywords: item response theory, dimensionality, submodel theory, factorial analysis

Procedia PDF Downloads 371
23114 Storage System Validation Study for Raw Cocoa Beans Using Minitab® 17 and R (R-3.3.1)

Authors: Anthony Oppong Kyekyeku, Sussana Antwi-Boasiako, Emmanuel De-Graft Johnson Owusu Ansah

Abstract:

In this observational study, the performance of a known conventional storage system was tested and evaluated for fitness for its intended purpose. The system's scope was extended to the storage of dry cocoa beans; its sensitivity, reproducibility and uncertainties were not known in detail. This study discusses the system performance in the context of the existing literature on factors that influence the quality of cocoa beans during storage. Controlled conditions were defined precisely for the system to give a reliable baseline within specific established procedures. Minitab® 17 and R statistical software (R-3.3.1) were used for the statistical analyses. The approach to the storage system testing was to observe and compare, through laboratory test methods, the quality of the cocoa bean samples before and after storage. The samples were kept in Kilner jars, and the temperature of the storage environment was controlled and monitored over a period of 408 days. Standard test methods used in the international cocoa trade, such as the cut test analysis, moisture determination with an Aqua-Boy KAM III model, and bean count determination, were used for quality assessment. The data analysis treated the entire population as the sample in order to establish a reliable baseline for the data collected. The study found statistically significant mean values at the 95% Confidence Interval (CI) for the performance data analysed before and after storage for all variables observed. Correlation graphs showed a strong positive correlation for all variables investigated, with the exception of All Other Defects (AOD). The weak relationship between the before and after data for AOD had an explained variability of 51.8%, with the unexplained variability attributable to the uncontrolled condition of hidden infestation before storage. The study concluded with a high-performance criterion for the storage system.

Keywords: benchmarking performance data, cocoa beans, hidden infestation, storage system validation

Procedia PDF Downloads 173
23113 Disaggregation of Coarser Resolution Radiometer Derived Soil Moisture to Finer Scales

Authors: Gurjeet Singh, Rabindra K. Panda

Abstract:

Soil moisture is a key hydrologic state variable and is intrinsically linked to the Earth's water, climate and carbon cycles. From an ecological point of view, soil moisture is a fundamental natural resource providing the transpirable water for plants. Soil moisture varies both temporally and spatially due to spatio-temporal variation in rainfall, vegetation cover, soil properties and topography. Satellite-derived soil moisture provides spatially and temporally extensive data. However, the spatial resolution of a typical satellite (L-band radiometry) is of the order of tens of kilometers, which is not good enough for developing efficient agricultural water management schemes at the field scale. In the present study, soil moisture from radiometer data has been disaggregated using a blending approach to achieve higher-resolution soil moisture data. The radiometer estimates of soil moisture at a 40 km resolution have been disaggregated to 10 km, 5 km and 1 km resolutions. The disaggregated soil moisture was compared with observed data, consisting of continuous sensor-based soil moisture profile measurements at three monitoring sites and extensive spatial near-surface soil moisture measurements, concurrent with satellite monitoring, in the 500 km2 study watershed in Eastern India. The estimated soil moisture status at different spatial scales can help in developing efficient agricultural water management schemes to increase crop production and water use efficiency.

Keywords: disaggregation, eastern India, radiometers, soil moisture, water use efficiency

Procedia PDF Downloads 275
23112 Analyzing Current Transformer's Transient and Steady State Behavior for Different Burdens Using LabVIEW Data Acquisition Tool

Authors: D. Subedi, D. Sharma

Abstract:

Current transformers (CTs) are used to transform large primary currents into a small secondary current. Since most standard equipment is not designed to handle large primary currents, CTs play an important part in any electrical system for the purposes of metering and protection, both of which are integral to the power system. Nowadays, due to advancements in solid-state technology, the operating times of protective relays have come down from a few seconds to a few cycles. In such a scenario, it becomes important to study the transient response of current transformers, as it plays a vital role in the operation of protective devices. This paper shows the steady state and transient behavior of current transformers and how it changes with the connected burden. The transient and steady state responses are captured using the data acquisition software LabVIEW, and analysis is done on the real-time data gathered with it. The variation of current transformer characteristics with changes in burden is discussed.

Keywords: accuracy, accuracy limiting factor, burden, current transformer, instrument security factor

Procedia PDF Downloads 341
23111 Enframing the Smart City: Utilizing Heidegger's 'The Question Concerning Technology' as a Framework to Interpret Smart Urbanism

Authors: Will Brown

Abstract:

Martin Heidegger is considered one of the leading philosophical lights of the 20th century, and his lecture/essay 'The Question Concerning Technology' has proved an invaluable text in the study of technology and the understanding of how technology influences the world it is set upon. However, this text has not yet been applied to the rapid rise and proliferation of 'smart' cities. This article applies the aforementioned text to the smart city in order to provide a fresh, if not critical, analysis and interpretation of this phenomenon. The first section below provides a brief literature review of smart urbanism in order to lay the groundwork necessary to apply Heidegger's work to the smart city, from which a framework is developed to interpret the infusion of digital sensing technologies into the urban milieu. This framework is comprised of four concepts put forward in Heidegger's text: circumscribing, bringing-forth, challenging, and standing-reserve. A concluding chapter is based upon the notion of enframement, arguing that once the rubric of data collection is placed within the urban system, future systems will require the capability to harvest data, resulting in an ever-renewing smart city.

Keywords: air quality sensing, big data, Martin Heidegger, smart city

Procedia PDF Downloads 206
23110 Health Perceptions in Elderly Population, before and after COVID-19

Authors: María José López Rey, Mar Chaves Carrillo, Manuela Caballero Guisado

Abstract:

The data presented here are part of a broader investigation on active population aging. The work was carried out in November 2020 in Extremadura, a region of southern Spain. This R + D + I project, called "Active aging scenarios in Extremadura: intervention proposals," was carried out by a team of professors, researchers from the University of Extremadura. The project has been financed by the European Regional Development Funds and the Government of Extremadura. Here, we focus on aspects that have to do with the experience of health, especially during the COVID-19 pandemic, and how this has affected the population related to the main sociodemographic variables. In an exercise of methodological triangulation, thus providing robustness to the analysis, primary data, obtained from the survey designed ad hoc, are combined with other secondary data from various sources and studies carried out in Spain (Sociological Research Centre, and National Institute of Statistics). The survey was carried out on a representative sample of the population over 55 years old, coming from Extremadura. Among the findings, we must highlight the practical invariability of perceptions based on the main sociodemographic variables, as well as some differences indicated by the variables sex and age.

Keywords: aging, health, COVID-19, perceptions

Procedia PDF Downloads 187
23109 Document-level Sentiment Analysis: An Exploratory Case Study of Low-resource Language Urdu

Authors: Ammarah Irum, Muhammad Ali Tahir

Abstract:

Document-level sentiment analysis in Urdu is a challenging Natural Language Processing (NLP) task due to the difficulty of working with lengthy texts in a language with constrained resources. Deep learning models, which are complex neural network architectures, are well-suited to text-based applications in addition to data formats like audio, image, and video. To investigate the potential of deep learning for Urdu sentiment analysis, we implemented five different deep learning models, including Bidirectional Long Short Term Memory (BiLSTM), Convolutional Neural Network (CNN), Convolutional Neural Network with Bidirectional Long Short Term Memory (CNN-BiLSTM), and Bidirectional Encoder Representations from Transformers (BERT). In this study, we developed a hybrid deep learning model called BiLSTM-Single Layer Multi Filter Convolutional Neural Network (BiLSTM-SLMFCNN) by fusing the BiLSTM and CNN architectures. The proposed and baseline techniques were applied to the Urdu Customer Support data set and the IMDB Urdu movie review data set using pre-trained Urdu word embeddings suitable for sentiment analysis at the document level. The results of these techniques were evaluated, and our proposed model outperforms all other deep learning techniques for Urdu sentiment analysis. BiLSTM-SLMFCNN outperformed the baseline deep learning models and achieved 83%, 79%, 83% and 94% accuracy on the small, medium and large sized IMDB Urdu movie review data sets and the Urdu Customer Support data set, respectively.
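
A minimal Keras sketch of a BiLSTM feeding a single convolutional layer with multiple filter widths, loosely in the spirit of the BiLSTM-SLMFCNN described above; vocabulary size, sequence length, filter widths and other hyperparameters are assumptions, not the paper's configuration.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Illustrative sizes; a real run would load pre-trained Urdu embeddings here.
vocab_size, embed_dim, max_len = 20000, 300, 400

inputs = layers.Input(shape=(max_len,))
x = layers.Embedding(vocab_size, embed_dim)(inputs)
x = layers.Bidirectional(layers.LSTM(128, return_sequences=True))(x)

# One convolutional layer with several filter widths over the BiLSTM outputs.
branches = []
for width in (3, 4, 5):
    c = layers.Conv1D(64, width, activation="relu")(x)
    branches.append(layers.GlobalMaxPooling1D()(c))
x = layers.Concatenate()(branches)
x = layers.Dropout(0.5)(x)
outputs = layers.Dense(1, activation="sigmoid")(x)   # positive / negative review

model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```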

Keywords: urdu sentiment analysis, deep learning, natural language processing, opinion mining, low-resource language

Procedia PDF Downloads 70
23108 Interior Design: Changing Values

Authors: Kika Ioannou Kazamia

Abstract:

This paper examines the action research cycle of the second phase of longitudinal research on sustainable interior design practices, between two groups of stakeholders, designers and clients. During this phase of the action research, the second step - the change stage - of Lewin’s change management model has been utilized to change values, approaches, and attitudes toward sustainable design practices among the participants. Affective domain learning theory is utilized to attach new values. Learning with the use of information technology, collaborative learning, and problem-based learning are the learning methods implemented toward the acquisition of the objectives. Learning methods, and aims, require the design of interventions with participants' involvement in activities that would lead to the acknowledgment of the benefits of sustainable practices. Interventions are steered to measure participants’ decisions for the worth and relevance of ideas, and experiences; accept or commit to a particular stance or action. The data collection methods used in this action research are observers’ reports, participants' questionnaires, and interviews. The data analyses use both quantitative and qualitative methods. The main beneficial aspect of the quantitative method was to provide the means to separate many factors that obscured the main qualitative findings. The qualitative method allowed data to be categorized, to adapt the deductive approach, and then examine for commonalities that could reflect relevant categories or themes. The results from the data indicate that during the second phase, designers and clients' participants altered their behaviours.

Keywords: design, change, sustainability, learning, practices

Procedia PDF Downloads 76
23107 Understanding Tacit Knowledge and DIKW

Authors: Bahadir Aydin

Abstract:

Today it is difficult to reach accurate knowledge because of mass data. This huge volume of data makes the environment more and more chaotic. Data is a main pillar of intelligence, and there is a close tie between knowledge and intelligence. Information gathered from different sources can be modified, interpreted and classified by using a knowledge development process, which is applied in order to attain intelligence. Within this process, the effect of knowledge is crucial. Knowledge is classified as explicit and tacit knowledge. Tacit knowledge can be seen as "only the tip of the iceberg"; it accounts for much more than we guess throughout the intelligence cycle. If the concept of intelligence is scrutinized, it can be seen that it contains risks and threats as well as success. The main purpose of every organization is to be successful by eliminating risks and threats. Therefore, there is a need to connect or fuse existing information and the processes which can be used to develop it. With the help of this process, the decision-maker can be presented with a clear, holistic understanding as early as possible in the decision-making process. Planning, execution and assessment are the key functions that connect information to knowledge. Shifting from the current traditional reactive approach to a proactive knowledge development approach would reduce extensive duplication of work in the organization. With this new approach to the process, knowledge can be used more effectively.

Keywords: knowledge, intelligence cycle, tacit knowledge, DIKW

Procedia PDF Downloads 517
23106 Optimized Preprocessing for Accurate and Efficient Bioassay Prediction with Machine Learning Algorithms

Authors: Jeff Clarine, Chang-Shyh Peng, Daisy Sang

Abstract:

Bioassay is the measurement of the potency of a chemical substance by its effect on a living animal or plant tissue. Bioassay data and chemical structures from pharmacokinetic and drug metabolism screening are mined from and housed in multiple databases. Bioassay predictions are then calculated to determine further advancement. This paper proposes a four-step preprocessing of datasets for improving bioassay predictions. The first step is instance selection, in which the dataset is partitioned into training, testing, and validation sets. The second step is discretization, which partitions the data with a trade-off between accuracy and precision. The third step is normalization, where data are scaled between 0 and 1 for subsequent machine learning processing. The fourth step is feature selection, where key chemical properties and attributes are generated. The streamlined results are then analyzed for the prediction of effectiveness by various machine learning tools, including Pipeline Pilot, R, Weka, and Excel. Experiments and evaluations reveal the effectiveness of various combinations of preprocessing steps and machine learning algorithms in producing more consistent and accurate predictions.
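
A minimal scikit-learn sketch of the four preprocessing steps described above (instance selection, discretization, normalization, feature selection) feeding a simple classifier; the synthetic data and the particular estimators are assumptions standing in for the tools named in the paper.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import KBinsDiscretizer, MinMaxScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import Pipeline
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for a bioassay table (rows = compounds, columns = descriptors).
X, y = make_classification(n_samples=500, n_features=30, n_informative=8,
                           random_state=0)

# Step 1: instance selection (training / testing split; a validation split
# could be carved out of the training portion the same way).
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Steps 2-4: discretization, 0-1 normalization, and feature selection,
# followed by a simple classifier standing in for the screening model.
pipe = Pipeline([
    ("discretize", KBinsDiscretizer(n_bins=5, encode="ordinal", strategy="quantile")),
    ("normalize", MinMaxScaler()),
    ("select", SelectKBest(f_classif, k=10)),
    ("clf", LogisticRegression(max_iter=1000)),
])
pipe.fit(X_tr, y_tr)
print("held-out accuracy: %.3f" % pipe.score(X_te, y_te))
```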

Keywords: bioassay, machine learning, preprocessing, virtual screen

Procedia PDF Downloads 273
23105 Determinants of Foreign Direct Investment in Tourism: A Panel Data Analysis of Developing Countries

Authors: Malraj Bharatha Kiriella

Abstract:

The purpose of this paper is to investigate the determinants of tourism foreign direct investment (TFDI) to selected developing countries during 1978-2017. The study used pooled panel data to estimate an econometric model. The findings show that market size and institutional barriers are determining factors for TFDI in countries, while other variables of positive country conditions, FDI-related government policy, tourism-related infrastructure and labor conditions are insignificant. The result shows that institutional effects are positive, while market size negatively affects TFDI inflows. The research is limited to eight developing countries. The results can be used to support government policy on TFDI. The paper makes the following contributions: First, it provides important insight and understanding into the TFDI decision-making process in developing countries. Second, both TFDI theory and evidence are minimal, and an econometric model developed on the basis of available literature has been empirically tested.
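
A small sketch of the kind of panel regression such a study estimates: a pooled model with country fixed effects relating TFDI inflows to market size and institutional barriers. The data are simulated and the specification is an assumption, not the paper's econometric model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Synthetic panel: 8 developing countries observed over 1978-2017
# (variable names are illustrative, not the paper's data set).
rows = []
for c in [f"c{i}" for i in range(8)]:
    base = rng.normal()
    for year in range(1978, 2018):
        market_size = rng.normal(5 + base, 1)
        inst_barriers = rng.normal(0, 1)
        tfdi = 0.6 * market_size - 0.4 * inst_barriers + base + rng.normal()
        rows.append((c, year, tfdi, market_size, inst_barriers))
panel = pd.DataFrame(rows, columns=["country", "year", "tfdi",
                                    "market_size", "inst_barriers"])

# Pooled panel regression with country fixed effects via dummy variables.
fit = smf.ols("tfdi ~ market_size + inst_barriers + C(country)", data=panel).fit()
print(fit.params[["market_size", "inst_barriers"]])
```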

Keywords: determinants, developing countries, FDI in tourism, panel data

Procedia PDF Downloads 106
23104 Systematic NIR of Internal Disorder and Quality Detection of Apple Fruit

Authors: Eid Alharbi, Yaser Miaji, Saeed Alzahrani

Abstract:

The importance of fruit quality and freshness is substantial in today's life. In this recent study, an automatic online sorting system based on internal disorders of fresh apple fruit has been developed using near infrared (NIR) spectroscopic technology. An automatic conveyor belt system along with a sorting mechanism was constructed. To check the internal quality of the apple fruit, each apple was exposed to NIR radiation in the range 650-1300 nm, and the data were collected in the form of absorption spectra. The collected data were compared to reference data from known samples and analyzed, and an electronic signal was passed to the sorting system, which then separated the apple fruit samples according to that signal. It was found that the absorption of NIR radiation in the range 930-950 nm was higher in the internally defective samples than in healthy samples. On the basis of this high absorption of NIR radiation in the 930-950 nm region, the online sorting system was constructed.
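
A minimal sketch of the band-based accept/reject decision described above: compare mean absorption in the 930-950 nm window against a reference threshold. The spectrum and the threshold value are illustrative assumptions, not the system's calibration.

```python
import numpy as np

# Hypothetical absorption spectrum sampled from 650 to 1300 nm (illustrative
# values; a real system would read these from the spectrometer).
wavelengths = np.arange(650, 1301, 10)
spectrum = 0.2 + 0.05 * np.random.default_rng(0).random(wavelengths.size)
spectrum[(wavelengths >= 930) & (wavelengths <= 950)] += 0.15  # defect signature

def is_defective(wl, absorbance, threshold=0.3):
    """Flag a fruit when mean absorption in the 930-950 nm window exceeds a
    reference threshold (the threshold here is an assumption for illustration)."""
    band = absorbance[(wl >= 930) & (wl <= 950)]
    return float(band.mean()) > threshold

print("reject" if is_defective(wavelengths, spectrum) else "accept")
```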

Keywords: mechatronics design, NIR, fruit quality, spectroscopic technology

Procedia PDF Downloads 494
23103 The Accuracy of Parkinson's Disease Diagnosis Using [123I]-FP-CIT Brain SPECT Data with Machine Learning Techniques: A Survey

Authors: Lavanya Madhuri Bollipo, K. V. Kadambari

Abstract:

Objective: To discuss key issues in the diagnosis of Parkinson disease (PD), To discuss features influencing PD progression, To discuss importance of brain SPECT data in PD diagnosis, and To discuss the essentiality of machine learning techniques in early diagnosis of PD. An accurate and early diagnosis of PD is nowadays a challenge as clinical symptoms in PD arise only when there is more than 60% loss of dopaminergic neurons. So far there are no laboratory tests for the diagnosis of PD, causing a high rate of misdiagnosis especially when the disease is in the early stages. Recent neuroimaging studies with brain SPECT using 123I-Ioflupane (DaTSCAN) as radiotracer shown to be widely used to assist the diagnosis of PD even in its early stages. Machine learning techniques can be used in combination with image analysis procedures to develop computer-aided diagnosis (CAD) systems for PD. This paper addressed recent studies involving diagnosis of PD in its early stages using brain SPECT data with Machine Learning Techniques.

Keywords: Parkinson disease (PD), dopamine transporter, single-photon emission computed tomography (SPECT), support vector machine (SVM)

Procedia PDF Downloads 397
23102 Secure Network Coding against Content Pollution Attacks in Named Data Network

Authors: Tao Feng, Xiaomei Ma, Xian Guo, Jing Wang

Abstract:

Named Data Networking (NDN) is one of the future Internet architectures, in which all nodes (i.e., hosts, routers) are allowed to have a local cache used to satisfy incoming requests for content. However, this reliance on caching allows an adversary to perform attacks that are very effective and relatively easy to implement, such as the content pollution attack. In this paper, we use a method of secure network coding based on a homomorphic signature system to solve this problem. Firstly, we use a dynamic public key technique, so that our scheme authenticates each generation without updating the initial secret key used. Secondly, employing the homomorphism of the hash function, intermediate nodes and the destination node verify the signature of the received message. In addition, when the network topology of NDN is simple and fixed, the code coefficients in our scheme are generated by a pseudorandom number generator in each node, so distribution of the coefficients is also avoided. In short, our scheme not only can efficiently prevent Intra/Inter-GPAs, but can also counter the content poisoning attack in NDN.

Keywords: named data networking, content pollution attack, network coding signature, internet architecture

Procedia PDF Downloads 336
23101 Investigating Seasonal Changes of Urban Land Cover with High Spatio-Temporal Resolution Satellite Data via Image Fusion

Authors: Hantian Wu, Bo Huang, Yuan Zeng

Abstract:

Divisions between wealthy and poor, private and public landscapes are propagated by the increasing economic inequality of cities. While these are the spatial reflections of larger social issues and problems, urban design can at least employ spatial techniques that promote inclusive rather than exclusive, overlapping rather than segregated, interlinked rather than disconnected landscapes. Indeed, the type of edge or border between urban landscapes plays a critical role in the way the environment is perceived. China is experiencing rapid urbanization, which poses unpredictable environmental challenges. Urban green cover and water bodies are undergoing changes that are highly relevant to resident wealth and happiness, yet very limited knowledge and data on these rapid changes are available. In this regard, enhancing the monitoring of the urban landscape with a high-frequency method, evaluating and estimating the impacts of urban landscape changes, and understanding the driving forces behind them can be a significant contribution to urban planning and study. High-resolution remote sensing data have been widely applied to urban management in China, and a land use map for the entire country at 10 m resolution was published for 2018. However, such work focuses on large-scale, high-resolution land use and does not precisely capture the seasonal change of urban covers. High-resolution satellites have a long revisit cycle (e.g., Landsat 8 requires 16 days to revisit the same location), which cannot satisfy the requirement of monitoring urban landscape changes. On the other hand, aerial or unmanned aerial vehicle (UAV) sensing is limited by aviation regulations and cost, and is hardly applied widely in mega-cities. Moreover, all of these data are limited by climate and weather conditions (e.g., cloud, fog), which makes capturing spatial and temporal dynamics a persistent challenge for the remote sensing community; during the rainy season, in particular, no data are available even from Sentinel satellites with a 5-day revisit interval. Many natural events and/or human activities drive the changes of urban covers. This project therefore aims to use high spatiotemporal fusion of remote sensing data to create short-cycle, high-resolution data sets for exploring high-frequency urban cover changes. The research will enhance the long-term monitoring applicability of high spatiotemporal fusion of remote sensing data for the urban landscape, optimizing the management of landscape borders and promoting the inclusiveness of the urban landscape for all communities.

Keywords: urban land cover changes, remote sensing, high spatiotemporal fusion, urban management

Procedia PDF Downloads 124
23100 Statistical Time-Series and Neural Architecture of Malaria Patients Records in Lagos, Nigeria

Authors: Akinbo Razak Yinka, Adesanya Kehinde Kazeem, Oladokun Oluwagbenga Peter

Abstract:

Time series data are sequences of observations collected over a period of time. Such data can be used to predict health outcomes, such as disease progression, mortality, hospitalization, etc. The Statistical approach is based on mathematical models that capture the patterns and trends of the data, such as autocorrelation, seasonality, and noise, while Neural methods are based on artificial neural networks, which are computational models that mimic the structure and function of biological neurons. This paper compared both parametric and non-parametric time series models of patients treated for malaria in Maternal and Child Health Centres in Lagos State, Nigeria. The forecast methods considered linear regression, Integrated Moving Average, ARIMA and SARIMA Modeling for the parametric approach, while Multilayer Perceptron (MLP) and Long Short-Term Memory (LSTM) Network were used for the non-parametric model. The performance of each method is evaluated using the Mean Absolute Error (MAE), R-squared (R2) and Root Mean Square Error (RMSE) as criteria to determine the accuracy of each model. The study revealed that the best performance in terms of error was found in MLP, followed by the LSTM and ARIMA models. In addition, the Bootstrap Aggregating technique was used to make robust forecasts when there are uncertainties in the data.
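
A minimal sketch of the forecast-and-score workflow used to compare such models, shown here for a plain ARIMA fit with a one-year hold-out and MAE/RMSE scoring; the synthetic monthly counts and the (1,1,1) order are assumptions, and the SARIMA, MLP and LSTM models compared in the paper would be evaluated the same way.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)

# Synthetic monthly malaria case counts (stand-in for the clinic records).
months = pd.date_range("2015-01-01", periods=72, freq="MS")
seasonal = 50 + 20 * np.sin(2 * np.pi * months.month / 12)
cases = pd.Series(seasonal + rng.normal(0, 5, 72), index=months)

train, test = cases[:-12], cases[-12:]

# Fit on all but the last year, forecast the held-out year, then score.
model = ARIMA(train, order=(1, 1, 1)).fit()
forecast = model.forecast(steps=12)

print("MAE:", round(mean_absolute_error(test, forecast), 2))
print("RMSE:", round(float(np.sqrt(np.mean((test.values - forecast.values) ** 2))), 2))
```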

Keywords: ARIMA, bootstrap aggregation, MLP, LSTM, SARIMA, time-series analysis

Procedia PDF Downloads 74
23099 The Non-Stationary BINARMA(1,1) Process with Poisson Innovations: An Application on Accident Data

Authors: Y. Sunecher, N. Mamode Khan, V. Jowaheer

Abstract:

This paper considers the modelling of a non-stationary bivariate integer-valued autoregressive moving average of order one (BINARMA(1,1)) with correlated Poisson innovations. The BINARMA(1,1) model is specified using the binomial thinning operator and by assuming that the cross-correlation between the two series is induced by the innovation terms only. Based on these assumptions, the non-stationary marginal and joint moments of the BINARMA(1,1) are derived iteratively by using some initial stationary moments. As regards to the estimation of parameters of the proposed model, the conditional maximum likelihood (CML) estimation method is derived based on thinning and convolution properties. The forecasting equations of the BINARMA(1,1) model are also derived. A simulation study is also proposed where BINARMA(1,1) count data are generated using a multivariate Poisson R code for the innovation terms. The performance of the BINARMA(1,1) model is then assessed through a simulation experiment and the mean estimates of the model parameters obtained are all efficient, based on their standard errors. The proposed model is then used to analyse a real-life accident data on the motorway in Mauritius, based on some covariates: policemen, daily patrol, speed cameras, traffic lights and roundabouts. The BINARMA(1,1) model is applied on the accident data and the CML estimates clearly indicate a significant impact of the covariates on the number of accidents on the motorway in Mauritius. The forecasting equations also provide reliable one-step ahead forecasts.
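
A simulation sketch of the construction described above: two count series built with the binomial thinning operator whose cross-correlation is induced only through a shared Poisson innovation component. The parameter values are illustrative, and the CML estimation step is not shown.

```python
import numpy as np

rng = np.random.default_rng(0)

def thin(count, prob):
    """Binomial thinning operator: alpha o X = Binomial(X, alpha)."""
    return rng.binomial(count, prob)

# Two INARMA(1,1)-style count series; their dependence comes only from the
# shared Poisson innovation component, mirroring the BINARMA(1,1) assumption.
n, alpha1, beta1, alpha2, beta2 = 500, 0.4, 0.3, 0.5, 0.2
lam_common, lam1, lam2 = 1.0, 2.0, 1.5

x1 = np.zeros(n, dtype=int)
x2 = np.zeros(n, dtype=int)
e1_prev = e2_prev = 0
for t in range(1, n):
    common = rng.poisson(lam_common)            # shared innovation component
    e1 = common + rng.poisson(lam1)
    e2 = common + rng.poisson(lam2)
    x1[t] = thin(x1[t - 1], alpha1) + e1 + thin(e1_prev, beta1)
    x2[t] = thin(x2[t - 1], alpha2) + e2 + thin(e2_prev, beta2)
    e1_prev, e2_prev = e1, e2

print("cross-correlation:", round(float(np.corrcoef(x1, x2)[0, 1]), 3))
```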

Keywords: non-stationary, BINARMA(1, 1) model, Poisson innovations, conditional maximum likelihood, CML

Procedia PDF Downloads 128
23098 Library on the Cloud: Universalizing Libraries Based on Virtual Space

Authors: S. Vanaja, P. Panneerselvam, S. Santhanakarthikeyan

Abstract:

Cloud computing is the latest trend in libraries. By entering into cloud services, librarians can keep pace with present-day information handling and satisfy the needs of the knowledge society. Libraries are now on a platform for universalizing all their information for users, and they are focusing on clouds, which give the easiest access to data and applications. Cloud computing is a highly scalable platform promising quick access to hardware and software over the internet, in addition to easy management and access by non-expert users. In this paper, we discuss the cloud's features and its potential applications in libraries and information centers, illustrate how cloud computing actually works and how it can be implemented, and discuss the needs that motivate a move to the cloud and the process of migration. In addition, the paper assesses the practical problems encountered during migration in libraries, the advantages of the migration process, and the measures that libraries should follow when migrating to the cloud. The paper also highlights the benefits and some concerns regarding data ownership and data security in cloud computing.

Keywords: cloud computing, cloud-service, cloud based-ILS, cloud-providers, discovery service, IaaS, PaaS, SaaS, virtualization, Web scale access

Procedia PDF Downloads 658
23097 Deliberation of Daily Evapotranspiration and Evaporative Fraction Based on Remote Sensing Data

Authors: J. Bahrawi, M. Elhag

Abstract:

Estimation of evapotranspiration is always a major component of water resources management. Traditional techniques of calculating daily evapotranspiration based on field measurements are valid only at local scales. Earth observation satellite sensors are thus used to overcome the difficulties of obtaining daily evapotranspiration measurements at the regional scale. The Surface Energy Balance System (SEBS) model was adopted to estimate daily evapotranspiration and relative evaporation along with other land surface energy fluxes. The model requires agro-climatic data that improve the model outputs. Advanced Along-Track Scanning Radiometer (AATSR) and Medium Resolution Imaging Spectrometer (MERIS) imagery was used to estimate the daily evapotranspiration and relative evaporation over the entire Nile Delta region in Egypt, supported by meteorological data collected from six different weather stations located within the study area. Daily evapotranspiration maps derived from the SEBS model show strong agreement with actual ground-truth data taken from 92 points uniformly distributed over the study area. Moreover, daily evapotranspiration and relative evaporation are strongly correlated. The reliable estimation of daily evapotranspiration supports decision makers in reviewing current land use practices in terms of water management, while enabling them to propose proper land use changes.
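
For orientation, a minimal sketch of how daily evapotranspiration is commonly upscaled from an evaporative fraction in SEBS-type processing; the constant, the assumption that the instantaneous evaporative fraction holds for the whole day, and the example values are illustrative, not the paper's data.

```python
# Latent heat of vaporisation, MJ per kg of water.
LAMBDA = 2.45

def daily_et_mm(evaporative_fraction, rn_daily_mj_m2, g_daily_mj_m2=0.0):
    """Daily ET (mm/day) assuming the instantaneous evaporative fraction is
    representative of the whole day and that 1 kg of water ~ 1 mm over 1 m^2."""
    return evaporative_fraction * (rn_daily_mj_m2 - g_daily_mj_m2) / LAMBDA

# Example: an evaporative fraction of 0.65 with 15 MJ m-2 day-1 of available
# net radiation gives roughly 4 mm/day of evapotranspiration.
print(round(daily_et_mm(0.65, 15.0), 2), "mm/day")
```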

Keywords: daily evapotranspiration, relative evaporation, SEBS, AATSR, MERIS, Nile Delta

Procedia PDF Downloads 258
23096 High Secure Data Hiding Using Cropping Image and Least Significant Bit Steganography

Authors: Khalid A. Al-Afandy, El-Sayyed El-Rabaie, Osama Salah, Ahmed El-Mhalaway

Abstract:

This paper presents a highly secure data hiding technique using image cropping and Least Significant Bit (LSB) steganography. Crops at predefined secret coordinates are extracted from the cover image. The secret text message is divided into sections, and the number of sections equals the number of image crops. Each section of the secret text message is embedded into an image crop in a secret sequence using the LSB technique. The embedding is done using the cover image color channels. The stego image is obtained by reassembling the image and the stego crops. The results of the technique are compared to other state-of-the-art techniques. Evaluation is based on visualization to detect any degradation of the stego image, the difficulty of extracting the embedded data by any unauthorized viewer, the Peak Signal-to-Noise Ratio (PSNR) of the stego image, and the embedding algorithm's CPU time. Experimental results confirm that the proposed technique is more secure compared with traditional techniques.
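
A minimal sketch of the core LSB embed/extract step on a single colour-channel crop; the crop selection by secret coordinates and the splitting of the message across crops described in the paper are omitted, and the helper names are hypothetical.

```python
import numpy as np

def embed_lsb(channel: np.ndarray, message: bytes) -> np.ndarray:
    """Hide message bits in the least significant bit of one colour channel.
    A full system would first crop the cover image at secret coordinates and
    split the message across the crops; only the core LSB step is shown."""
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    flat = channel.flatten()
    if bits.size > flat.size:
        raise ValueError("message too long for this crop")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(channel.shape)

def extract_lsb(channel: np.ndarray, n_bytes: int) -> bytes:
    """Recover n_bytes of payload from the least significant bits."""
    bits = channel.flatten()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

crop = np.random.default_rng(0).integers(0, 256, (64, 64), dtype=np.uint8)
stego = embed_lsb(crop, b"secret")
print(extract_lsb(stego, 6))   # -> b'secret'
```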

Keywords: steganography, stego, LSB, crop

Procedia PDF Downloads 267
23095 A Usability Framework to Influence the Intention to Use Mobile Fitness Applications in South Africa

Authors: Bulelani Ngamntwini, Liezel Cilliers

Abstract:

South Africa has one of the highest prevalences of obesity on the African continent, and forty-six percent of adults in South Africa are physically inactive. Fitness applications can be used to reduce physical inactivity. However, the uptake of mobile fitness applications in South Africa has been found to be poor due to usability challenges with the technology. This study developed a usability framework to influence the intention to use mobile fitness applications in South Africa. The study made use of a positivist approach to collect data. A questionnaire was used to collect quantitative data from 377 respondents who had used mobile fitness applications in the past; a response rate of 80.90% was recorded. To analyse the data, the Pearson correlation was used to test the relationships posited by the various hypotheses. Four usability factors, efficiency, effectiveness, satisfaction, and learnability, contribute to users' intention to make use of mobile fitness applications. The study therefore recommends that, for a mobile fitness application to be successful, these four factors must be considered and incorporated by developers when designing the application.

Keywords: obese, overweight, physical inactivity, mobile fitness application, usability factors

Procedia PDF Downloads 163