Search results for: maximal data sets
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25883

25553 Processing Big Data: An Approach Using Feature Selection

Authors: Nikat Parveen, M. Ananthi

Abstract:

Big data is one of the emerging technologies that collects data from various sensors for use in many fields. Data retrieval is one of the major issues, since the exact data must be extracted as needed. In this paper, a large data set is processed using feature selection. Feature selection helps to choose only the data that are actually needed to process and execute the task. The key value points to the exact data available in the storage space. Here the available data is streamed, and R-Center is proposed to achieve this task.
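
A generic illustration of the feature-selection step (a sketch only; the paper's R-Center component and key-value scheme are not public, so scikit-learn's SelectKBest stands in here):

```python
# Keep the k features most relevant to the task before processing the data set.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif

X, y = make_classification(n_samples=1000, n_features=50, n_informative=8,
                           random_state=0)
selector = SelectKBest(score_func=f_classif, k=8).fit(X, y)
X_reduced = selector.transform(X)          # only the needed columns remain
print(X.shape, "->", X_reduced.shape)
```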

Keywords: big data, key value, feature selection, retrieval, performance

Procedia PDF Downloads 340
25552 National System of Innovation in Zambia: Towards Socioeconomic Development

Authors: Ephraim Daka, Maxim Kotsemir

Abstract:

The National System of Innovation (NSI) has recently proliferated as a vehicle for addressing poverty and national competitiveness in developing countries. While several governments in Sub-Saharan Africa have adapted developed countries' models of innovation to local conditions, the Zambian case is rather unique. This study highlights conceptual and socioeconomic challenges bearing on the performance of the NSI. The paper analyses science and technology strategies that include "innovation" and their effect on improving socioeconomic elements. The authors reviewed STI policy and national strategy documents, followed by interviews, which were compared against regional and national economic data sets. The NSI, its inter-linkages, and its support mechanisms for socioeconomic development were explored.

Keywords: national system of innovation, socioeconomics, development, Zambia

Procedia PDF Downloads 223
25551 Effect of Aging on the Second Law Efficiency, Exergy Destruction and Entropy Generation in the Skeletal Muscles during Exercise

Authors: Jale Çatak, Bayram Yılmaz, Mustafa Ozilgen

Abstract:

The second law muscle work efficiency is obtained by multiplying the metabolic and mechanical work efficiencies. Thermodynamic analyses are carried out with 19 sets of arm and leg exercise data obtained from healthy young people. These data are used to simulate the changes occurring during aging. The muscle work efficiency decreases with aging as a result of the reduction of metabolic energy generation in the mitochondria. The reduction of mitochondrial energy efficiency makes it difficult to carry out the maintenance of the muscle tissue, which in turn causes a decline of the muscle work efficiency. When the muscle attempts to produce more work, entropy generation and exergy destruction increase. Increasing exergy destruction may be regarded as the result of the deterioration of the muscles. When the exergetic efficiency is 0.42, exergy destruction becomes 1.49 times the work performance. This proportion becomes 2.50 and 5.21 times when the exergetic efficiency decreases to 0.30 and 0.17, respectively.
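
The multiplicative efficiency relation and the reported destruction-to-work ratios can be summarized as follows (values as reported in the abstract; no functional form between them is implied):

```latex
\eta_{II} = \eta_{\mathrm{metabolic}} \times \eta_{\mathrm{mechanical}},
\qquad
\frac{X_{\mathrm{destroyed}}}{W} =
\begin{cases}
1.49, & \eta_{\mathrm{ex}} = 0.42 \\
2.50, & \eta_{\mathrm{ex}} = 0.30 \\
5.21, & \eta_{\mathrm{ex}} = 0.17
\end{cases}
```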

Keywords: aging mitochondria, entropy generation, exergy destruction, muscle work performance, second law efficiency

Procedia PDF Downloads 426
25550 Automatic Adult Age Estimation Using Deep Learning of the ResNeXt Model Based on CT Reconstruction Images of the Costal Cartilage

Authors: Ting Lu, Ya-Ru Diao, Fei Fan, Ye Xue, Lei Shi, Xian-e Tang, Meng-jun Zhan, Zhen-hua Deng

Abstract:

Accurate adult age estimation (AAE) is a significant and challenging task in the forensic and archeology fields. Attempts have been made to explore optimal adult age metrics, and the rib is considered a potential age marker. The traditional way is to extract age-related features designed by experts from macroscopic or radiological images, followed by classification or regression analysis. Those results still have not met the high-level requirements of practice, and the limitation of designing and manually extracting features is loss of information, since the features are likely not designed explicitly for extracting information relevant to age. Deep learning (DL) has recently garnered much interest in image learning and computer vision. It enables learning features that are important without a prior bias or hypothesis and could be supportive of AAE. This study aimed to develop DL models for AAE based on CT images and compare their performance to the manual visual scoring method. Chest CT data were reconstructed using volume rendering (VR). Retrospective data of 2500 patients aged 20.00-69.99 years were obtained between December 2019 and September 2021. Five-fold cross-validation was performed, and datasets were randomly split into training and validation sets in a 4:1 ratio for each fold. Before feeding the inputs into the networks, all images were augmented with random rotation and vertical flip, normalized, and resized to 224×224 pixels. ResNeXt was chosen as the DL baseline due to its advantages of higher efficiency and accuracy in image classification. Mean absolute error (MAE) was the primary parameter. Independent data from 100 patients acquired between March and April 2022 were used as a test set. The manual method completely followed the prior study, which reported the lowest MAEs (5.31 in males and 6.72 in females) among similar studies. CT data and VR images were used. The radiation density of the first costal cartilage was recorded using CT data on the workstation. The osseous and calcified projections of the first to seventh costal cartilages were scored based on VR images using an eight-stage staging technique. According to the results of the prior study, the optimal models were the decision tree regression model in males and the stepwise multiple linear regression equation in females. Predicted ages of the test set were calculated separately using different models by sex. A total of 2600 patients (training and validation sets, mean age=45.19 years±14.20 [SD]; test set, mean age=46.57±9.66) were evaluated in this study. In ResNeXt model training, MAEs of 3.95 in males and 3.65 in females were obtained. On the test set, DL achieved MAEs of 4.05 in males and 4.54 in females, far better than the MAEs of 8.90 and 6.42, respectively, for the manual method. These results showed that the DL ResNeXt model outperformed the manual method in AAE based on CT reconstruction of the costal cartilage, and the developed system may be a supportive tool for AAE.
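
A minimal PyTorch sketch of the kind of pipeline described (not the authors' code; the rotation range, learning rate, and optimizer are assumptions):

```python
# ResNeXt backbone regressing age from 224x224 CT volume-rendering images,
# trained with L1 loss, which optimizes MAE directly.
import torch
import torch.nn as nn
from torchvision import models, transforms

train_tfms = transforms.Compose([
    transforms.RandomRotation(15),          # assumed rotation range
    transforms.RandomVerticalFlip(),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

model = models.resnext50_32x4d(weights="IMAGENET1K_V1")
model.fc = nn.Linear(model.fc.in_features, 1)   # single output: age
criterion = nn.L1Loss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(images, ages):
    model.train()
    optimizer.zero_grad()
    preds = model(images).squeeze(1)
    loss = criterion(preds, ages)
    loss.backward()
    optimizer.step()
    return loss.item()
```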

Keywords: forensic anthropology, age determination by the skeleton, costal cartilage, CT, deep learning

Procedia PDF Downloads 72
25549 Axial Load Capacity of Drilled Shafts from In-Situ Test Data at Semani Site, in Albania

Authors: Neritan Shkodrani, Klearta Rrushi, Anxhela Shaha

Abstract:

Generally, the design of the axial load capacity of deep foundations is based on data provided by field tests, such as the SPT (Standard Penetration Test) and CPT (Cone Penetration Test). This paper reports the results of an axial load capacity analysis of drilled shafts at a construction site at Semani, in Fier county, Fier prefecture, Albania. In this case, the analyses are based on the data of 416 SPT tests and 12 CPTU tests, carried out at this construction site in 12 boreholes (10 borings to a depth of 30.0 m and 2 borings to a depth of 80.0 m). The considered foundation widths range from 0.5 m to 2.5 m, and the foundation embedment length is fixed at 25 m. SPT-based analytical methods from the Japanese practice of design (Building Standard Law of Japan) and the CPT-based analytical method of Eslami and Fellenius are used for obtaining the axial ultimate load capacity of the drilled shafts. The considered drilled shaft (25 m long and 0.5 m to 2.5 m in diameter) is analyzed for the soil conditions of each borehole. The values obtained from the sets of calculations are shown in different charts. The axial load capacity values derived from the SPT and CPTU data are then compared, and conclusions are drawn regarding the two methods of calculation.

Keywords: deep foundations, drilled shafts, axial load capacity, ultimate load capacity, allowable load capacity, SPT test, CPTU test

Procedia PDF Downloads 104
25548 Foreign Direct Investment and Its Impact on the Economic Growth of Emerging Economies: Does Ease of Doing Business Matter?

Authors: Mutaju Marobhe, Pastory Dickson

Abstract:

This study explores the role of Foreign Direct Investment (FDI) in stimulating the economic growth of emerging economies. FDI has been associated with higher economic growth rates in developed countries due to the presence of conducive business conditions, e.g., advanced financial markets, which may accelerate the rate at which FDI boosts economic growth. This study therefore sets out to evaluate this macroeconomic phenomenon in emerging economies using the case of the Southern African Development Community (SADC) countries. The study uses the Ease of Doing Business Index as a variable that moderates the relationship between FDI and economic growth. Panel data from 2010 to 2019 for all SADC members are used, and due to the unbalanced nature of the data, fixed effects regression analysis with a moderation effect is applied, as sketched below. The conclusions and recommendations of this study will enable emerging economies to see how they can significantly improve FDI's role in accelerating economic growth, as developed economies have.
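
A hedged sketch of such a fixed-effects regression with a moderation term (illustrative only; the variable and file names are hypothetical):

```python
# GDP growth regressed on FDI moderated by the Ease of Doing Business index.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("sadc_panel_2010_2019.csv")   # hypothetical panel file

# C(country) gives the country fixed effects; fdi:edb is the moderation term
# implied by fdi * edb. Standard errors are clustered by country.
model = smf.ols("growth ~ fdi * edb + C(country) + C(year)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["country"]}
)
print(model.summary())
```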

Keywords: ease of doing business, economic growth, emerging economies, foreign direct investment

Procedia PDF Downloads 142
25547 Neural Network Based Approach of Software Maintenance Prediction for Laboratory Information System

Authors: Vuk M. Popovic, Dunja D. Popovic

Abstract:

The software maintenance phase starts once a software project has been developed and delivered; after that, any modification to it corresponds to maintenance. Software maintenance involves modifications to keep a software project usable in a changed or changing environment, to correct discovered faults, and to improve performance or maintainability. Software maintenance and the management of software maintenance are recognized as the two most important and most expensive processes in the life of a software product. This research bases the prediction of maintenance on risk and time evaluations, using them as data sets for working with neural networks. The aim of this paper is to provide support to project maintenance managers. They will be able to pass the issues planned for the next software service patch to the experts for risk and working time evaluation, and afterward feed all data to neural networks in order to obtain a software maintenance prediction. This process will lead to a more accurate prediction of the working hours needed for the software service patch, which will eventually lead to better budget planning for software maintenance projects.
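
A minimal sketch of the idea (the feature and target values below are made up; the paper does not specify its network architecture):

```python
# Predict maintenance working hours from expert risk and time evaluations.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# X: expert evaluations per issue (e.g., risk score, estimated time in hours);
# y: actual working hours recorded for past service patches.
X = np.array([[2, 8], [5, 40], [1, 4], [4, 24], [3, 16]], dtype=float)
y = np.array([10, 55, 4, 30, 18], dtype=float)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0)
net.fit(X_tr, y_tr)
print(net.predict(X_te))   # predicted hours for the next patch's issues
```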

Keywords: laboratory information system, maintenance engineering, neural networks, software maintenance, software maintenance costs

Procedia PDF Downloads 357
25546 Biochemical and Pomological Variability among 14 Moroccan and Foreign Cultivars of Prunus dulcis

Authors: H. Hanine, H. H'ssaini, M. Ibno Alaoui, A. Nablousi, H. Zahir, S. Ennahli, H. Latrache, H. Zine Abidine

Abstract:

Biochemical and pomological variability among 14 cultivars of Prunus dulcis planted in a germplasm collection site in Morocco was evaluated. Almond samples from six local and eight foreign cultivars (France, Italy, Spain, and USA) were characterized. Biochemical and pomological data revealed significant genetic variability among the 14 cultivars; local cultivars exhibited higher total polyphenol content. Oil content ranged from 35 to 57% among cultivars; the Texas and Toundout genotypes recorded the highest oil content. Total protein concentration of select cultivars ranged from 50 mg/g in Ferraduel to 105 mg/g in Rizlane1. Antioxidant activity of almond samples was examined by a DPPH (1,1-diphenyl-2-picrylhydrazyl) radical-scavenging assay; the antioxidant activity varied significantly among the cultivars, with IC50 (half-maximal inhibitory concentration) values ranging from 2.25 to 20 mg/ml. Autochthonous cultivars originating from the Oujda region exhibited higher tegument total polyphenol and amino acid content compared to the others. The genotype Rizlane2 recorded the highest flavonoid content. Pomological traits revealed large variability within the almond germplasm. Hierarchical clustering analysis of all the pomological trait data distinguished two groups, with some particular genotypes as distinct cultivars and groups of cultivars as polyclonal varieties. These results strongly support the potential use of Moroccan almonds as clones for future selection, given their nutritional values and pomological traits compared to well-established cultivars.

Keywords: antioxidant activity, DPPH, Moroccan almonds, Prunus dulcis

Procedia PDF Downloads 241
25545 Decision-Making Strategies on Smart Dairy Farms: A Review

Authors: L. Krpalkova, N. O' Mahony, A. Carvalho, S. Campbell, G. Corkery, E. Broderick, J. Walsh

Abstract:

Farm management and operations will drastically change due to access to real-time data, real-time forecasting, and tracking of physical items, in combination with Internet of Things developments, to further automate farm operations. Dairy farms have embraced technological innovations and procured vast amounts of permanent data streams during the past decade; however, the integration of this information to improve whole-farm management and decision-making does not yet exist. It is now imperative to develop a system that can collect, integrate, manage, and analyse on-farm and off-farm data in real-time for practical and relevant environmental and economic actions. The developed systems, based on machine learning and artificial intelligence, need to be connected for useful output, a better understanding of the whole farming issue, and environmental impact. Evolutionary computing can be very effective in finding the optimal combination of sets of objects and, ultimately, in strategy determination. The system of the future should be able to manage the dairy farm as well as an experienced dairy farm manager with a team of the best agricultural advisors. All these changes should bring resilience and sustainability to dairy farming, as well as improving and maintaining good animal welfare and the quality of dairy products. This review aims to provide an insight into the state of the art of big data applications and evolutionary computing in relation to smart dairy farming and to identify the most important research and development challenges to be addressed in the future. Smart dairy farming influences every area of management, and its uptake has become a continuing trend.

Keywords: big data, evolutionary computing, cloud, precision technologies

Procedia PDF Downloads 189
25544 The Data Quality Model for the IoT-Based Real-Time Water Quality Monitoring Sensors

Authors: Rabbia Idrees, Ananda Maiti, Saurabh Garg, Muhammad Bilal Amin

Abstract:

IoT devices are the basic building blocks of an IoT network and generate enormous volumes of real-time, high-speed data that help organizations and companies take intelligent decisions. Integrating this enormous data from multiple sources and transferring it to the appropriate client is fundamental to IoT development. Handling this huge number of devices along with the huge volume of data is very challenging. IoT devices are battery-powered and resource-constrained; to provide energy-efficient communication, they sleep and wake periodically or aperiodically, depending on the traffic load, to reduce energy consumption. Sometimes these devices get disconnected due to battery depletion. If a node is not available in the network, the IoT network provides incomplete, missing, and inaccurate data. Moreover, many IoT applications, like vehicle tracking and patient tracking, require the IoT devices to be mobile. Due to this mobility, if the distance of the device from the sink node becomes greater than required, the connection is lost. After such disconnections, other devices join the network to replace the broken-down and departed devices. This makes IoT devices dynamic in nature, which brings uncertainty and unreliability into the IoT network and hence produces poor-quality data. Due to this dynamic nature of IoT devices, we do not know the actual reason for abnormal data. If data are of poor quality, decisions are likely to be unsound. It is highly important to process data and estimate data quality before bringing it to use in IoT applications. In the past, many researchers tried to estimate data quality and provided several machine learning (ML), stochastic, and statistical methods to analyse stored data in the data processing layer, without focusing on the challenges and issues arising from the dynamic nature of IoT devices and how it impacts data quality. In this research, a comprehensive review of the impact of the dynamic nature of IoT devices on data quality is carried out, and a data quality model that can deal with this challenge and produce good-quality data is presented. This research presents the data quality model for sensors monitoring water quality. DBSCAN clustering and weather sensors are used to build the model. An extensive study was done on the relationship between the data of weather sensors and of sensors monitoring the water quality of lakes and beaches. A detailed theoretical analysis is presented, establishing the correlation between the independent data streams of the two sets of sensors. With the help of this analysis and DBSCAN, a data quality model was prepared. The model encompasses five dimensions of data quality: it detects and removes outliers, assesses completeness and patterns of missing values, and checks the accuracy of the data with the help of the clusters' positions. Finally, statistical analysis is done on the clusters formed by DBSCAN, and consistency is evaluated through the coefficient of variation (CoV).
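
An illustrative sketch of two of the described steps, DBSCAN-based outlier removal and consistency via the coefficient of variation (synthetic data, not the lake/beach streams):

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
# Two joint readings per sample, e.g., temperature and pH (stand-in values).
readings = rng.normal(loc=[20.0, 7.2], scale=[1.0, 0.2], size=(200, 2))
readings = np.vstack([readings, [[35.0, 9.5], [2.0, 3.0]]])  # injected outliers

labels = DBSCAN(eps=0.8, min_samples=5).fit_predict(readings)
clean = readings[labels != -1]          # label -1 marks noise/outliers

cov = clean.std(axis=0, ddof=1) / clean.mean(axis=0)  # per-dimension CoV
print(f"removed {np.sum(labels == -1)} outliers, CoV = {cov}")
```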

Keywords: clustering, data quality, DBSCAN, Internet of Things (IoT)

Procedia PDF Downloads 138
25543 The Effect of Outliers on the Economic and Social Survey on Income and Living Conditions

Authors: Encarnación Álvarez, Rosa M. García-Fernández, Francisco J. Blanco-Encomienda, Juan F. Muñoz

Abstract:

The European Union Survey on Income and Living Conditions (EU-SILC) is a popular survey which provides information on income, poverty, social exclusion, and living conditions of households and individuals in the European Union. The EU-SILC contains variables which may include outliers. The presence of outliers can have an impact on the measures and indicators used by the EU-SILC. In this paper, we use data sets from various countries to analyze the presence of outliers. In addition, we recompute some indicators after removing these outliers, so that both situations can be compared. Finally, some conclusions are drawn.
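
A small sketch of how an extreme income value can shift a poverty indicator (assuming the common EU convention of a poverty line at 60% of the median income; the incomes are synthetic):

```python
import numpy as np

income = np.array([5, 6, 10, 11, 12, 13, 14, 15, 16, 300], dtype=float)

def headcount(x):
    line = 0.6 * np.median(x)   # poverty line: 60% of median income
    return np.mean(x < line)    # headcount index: share below the line

print(headcount(income))                                       # with outlier
print(headcount(income[income < np.quantile(income, 0.95)]))   # outlier removed
```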

Keywords: poverty line, headcount index, risk of poverty, skewness coefficient

Procedia PDF Downloads 401
25542 Foreign Tourists’ Attitude toward Service Marketing Mix and Intention to Revisit in Boutique Hotel

Authors: Nattapong Techarattanased

Abstract:

This survey research aimed to study how foreign travelers' attitudes toward the services, product, and marketing mix affected their intention to revisit boutique hotels in Bangkok, Thailand. A total of 400 closed-ended questionnaires were used to collect data from foreign tourists who stayed at boutique hotels and could communicate in English. Descriptive statistics and multiple regression analysis were used to analyze the data. The research found that tourists' attitudes towards the check-in and check-out process, food and beverage, guest rooms, and other facilities affected the likelihood of revisiting, recommending the hotel to others, and revisiting in the future, at the 0.05 level of statistical significance. Tourists' attitudes towards the service and marketing mix, in terms of people, physical evidence, price, process, and channel of distribution, could forecast the intention to revisit in terms of recommending to others and intention to revisit in the future, at the 0.05 level of statistical significance.

Keywords: boutique hotel, foreign tourists, intention to revisit, service marketing mix

Procedia PDF Downloads 247
25541 A Reduced Ablation Model for Laser Cutting and Laser Drilling

Authors: Torsten Hermanns, Thoufik Al Khawli, Wolfgang Schulz

Abstract:

In laser cutting as well as in long-pulsed laser drilling of metals, it can be demonstrated that the ablation shape (the shape of the cut faces or the hole shape, respectively) approaches a so-called asymptotic shape, such that it changes only slightly or not at all with further irradiation. These findings are already known from the ultrashort pulse (USP) ablation of dielectric and semiconducting materials. The explanation for the occurrence of an asymptotic shape in laser cutting and long-pulse drilling of metals is identified, its underlying mechanism numerically implemented, tested, and clearly confirmed by comparison with experimental data. In detail, there is now a model that allows the simulation of the temporal (pulse-resolved) evolution of the hole shape in laser drilling as well as the final (asymptotic) shape of the cut faces in laser cutting. This simulation requires far fewer resources, such that it can even run on common desktop PCs or laptops. Individual parameters can be adjusted using sliders; the simulation result appears in an adjacent window and changes in real time. This is made possible by an application-specific reduction of the underlying ablation model. Because this reduction dramatically decreases the complexity of calculation, it produces a result much more quickly. This means that the simulation can be carried out directly at the laser machine. Time-intensive experiments can be reduced, and set-up processes can be completed much faster. The high speed of simulation also opens up a range of entirely different options, such as metamodeling. Suitable for complex applications with many parameters, metamodeling involves generating high-dimensional data sets with the parameters and several evaluation criteria for process and product quality. These sets can then be used to create individual process maps that show the dependency of individual parameter pairs. This advanced simulation makes it possible to find global and local extreme values through mathematical manipulation; such simultaneous optimization of multiple parameters is scarcely possible by experimental means, so new manufacturing methods such as self-optimization can be executed much faster. However, the software's potential does not stop there; time-intensive calculations exist in many areas of industry. In laser welding or laser additive manufacturing, for example, the simulation of thermally induced residual stresses still uses up considerable computing capacity or is not even possible. Transferring the principle of reduced models promises substantial savings there, too.
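
A toy sketch of the metamodeling idea (the "reduced model" below is a stand-in function, not the ablation model; parameter names are hypothetical):

```python
# Sample a process map on a parameter grid from a fast reduced model, fit a
# surrogate, then locate extreme values cheaply on a finer grid.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def reduced_model(power, speed):          # stand-in for the fast ablation model
    return np.sin(power) * np.cos(speed) + 0.1 * power

grid = np.array([[p, s] for p in np.linspace(0, 3, 10)
                        for s in np.linspace(0, 3, 10)])
quality = np.array([reduced_model(p, s) for p, s in grid])

surrogate = GaussianProcessRegressor().fit(grid, quality)

fine = np.array([[p, s] for p in np.linspace(0, 3, 100)
                        for s in np.linspace(0, 3, 100)])
pred = surrogate.predict(fine)
print("predicted optimum at", fine[np.argmax(pred)])
```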

Keywords: asymptotic ablation shape, interactive process simulation, laser drilling, laser cutting, metamodeling, reduced modeling

Procedia PDF Downloads 214
25540 GNSS-Aided Photogrammetry for Digital Mapping

Authors: Muhammad Usman Akram

Abstract:

This research work is based on GNSS-aided photogrammetry for digital mapping. It focuses on the topographic survey of an area or site to be used in future planning and development (P&D) or for further examination, exploration, research, and inspection. Surveying and mapping hard-to-access and hazardous areas with traditional techniques and methodologies is very difficult; it is also time consuming and labor intensive, and it yields less precision with limited data. In comparison, advanced techniques require less manpower and provide more precise output with a wide variety of data sets. In this experiment, the aerial photogrammetry technique is used: a UAV flies over an area, captures geocoded images, and a three-dimensional model (3-D model) is built. The UAV operates on a user-specified path or area with various parameters: flight altitude, ground sampling distance (GSD), image overlap, camera angle, etc. For ground control, a network of points on the ground is observed as ground control points (GCPs) using a Differential Global Positioning System (DGPS) in PPK or RTK mode. The raw data collected by the UAV and DGPS are then processed in digital image processing programs and computer-aided design software. As outputs, we obtain a dense point cloud, a Digital Elevation Model (DEM), and an orthophoto. The imagery is converted into geospatial data by digitizing over the orthophoto; the DEM is further converted into a Digital Terrain Model (DTM) for contour generation or a digital surface. As a result, we obtain a digital map of the area to be surveyed. In conclusion, we compared the processed data with exact measurements taken on site. The error is accepted if it does not breach the survey accuracy limits set by the concerned institutions.
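
As an aside, the ground sampling distance mentioned above follows a standard photogrammetric formula; a quick sketch (the camera numbers are illustrative, not from this work):

```python
# GSD in cm/px = (sensor width [mm] * flight height [m] * 100)
#                / (focal length [mm] * image width [px])
def gsd_cm_per_px(sensor_width_mm, flight_height_m, focal_length_mm, image_width_px):
    return (sensor_width_mm * flight_height_m * 100) / (focal_length_mm * image_width_px)

# e.g., a 13.2 mm sensor, 8.8 mm lens, 5472 px wide image, flying at 100 m
print(f"{gsd_cm_per_px(13.2, 100, 8.8, 5472):.2f} cm/px")   # ~2.74 cm/px
```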

Keywords: photogrammetry, post processing kinematics, real time kinematics, manual data inquiry

Procedia PDF Downloads 29
25539 Constructing the Joint Mean-Variance Regions for Univariate and Bivariate Normal Distributions: Approach Based on the Measure of Cumulative Distribution Functions

Authors: Valerii Dashuk

Abstract:

The usage of confidence intervals in economics and econometrics is widespread. To investigate a random variable more thoroughly, joint tests are applied; one such example is the joint mean-variance test. A new approach for testing such hypotheses and constructing confidence sets is introduced. Exploring both the value of the random variable and its deviation with this technique allows checking the shift and the probability of that shift simultaneously (i.e., portfolio risks). Another application is based on the normal distribution, which is fully defined by mean and variance and can therefore be tested using the introduced approach. The method is based on the difference of probability density functions. The starting point is two sets of normal distribution parameters that should be compared (whether they may be considered identical at a given significance level). Then the absolute difference in probabilities at each 'point' of the domain of these distributions is calculated. This measure is transformed into a function of cumulative distribution functions and compared to the critical values. The table of critical values was designed from simulations. The approach was compared with other techniques for the univariate case. It differs qualitatively and quantitatively in ease of implementation, computation speed, and accuracy of the critical region (theoretical vs. real significance level). Stable results when working with outliers and non-normal distributions, as well as scaling possibilities, are also strengths of the method. The main advantage of this approach is the possibility of extending it to the infinite-dimensional case, which was not possible in most previous works. At the moment, the expansion to the two-dimensional case is done, allowing up to 5 parameters to be tested jointly. The derived technique is therefore equivalent to classic tests in standard situations but gives more efficient alternatives in nonstandard problems and on big amounts of data.

Keywords: confidence set, cumulative distribution function, hypotheses testing, normal distribution, probability density function

Procedia PDF Downloads 173
25538 Optimal Rest Interval between Sets in Robot-Based Upper-Arm Rehabilitation

Authors: Virgil Miranda, Gissele Mosqueda, Pablo Delgado, Yimesker Yihun

Abstract:

Muscular fatigue affects the muscle activation that is needed for producing the desired clinical outcome. Integrating optimal muscle relaxation periods into a variety of health care rehabilitation protocols is important to maximize the efficiency of the therapy. In this study, four muscle relaxation periods (30, 60, 90, and 120 seconds) and their effectiveness in producing consistent activation of the biceps brachii between sets of elbow flexion and extension tasks were investigated in a sample of 10 subjects with no disabilities. The same resting periods were then utilized in a controlled exoskeleton-based exercise with a sample of 5 subjects and showed similar results. On average, the muscle activity of the biceps brachii decreased by 0.3% when rested for 30 seconds, and it increased by 1.25%, 0.76%, and 0.82% for muscle relaxation periods of 60, 90, and 120 seconds, respectively. The preliminary results suggest that a muscle relaxation period of about 60 seconds is needed for optimal continuous muscle activation within rehabilitation regimens. Robot-based rehabilitation is well suited to producing repetitive tasks at the right intensity, and knowing the optimal resting period will make the automation more effective.

Keywords: rest intervals, muscle biceps brachii, robot rehabilitation, muscle fatigue

Procedia PDF Downloads 190
25537 New Hybrid Method to Model Extreme Rainfalls

Authors: Youness Laaroussi, Zine Elabidine Guennoun, Amine Amar

Abstract:

Modeling and forecasting the dynamics of rainfall occurrences constitute one of the major topics that have been widely treated by statisticians, hydrologists, climatologists, and many other groups of scientists. On this issue, we propose in the present paper a new hybrid method, which combines Extreme Value and fractal theories. We illustrate the use of our methodology on transformed Emberger Index series, constructed from data recorded in Oujda (Morocco). The index is first treated by the Peaks Over Threshold (POT) approach to identify excess observations over an optimal threshold u. In the second step, we consider the resulting excesses as a fractal object embedded in the one-dimensional space of time, and we identify the fractal dimension by box counting. We discuss the prospective description of rainfall data sets under the Generalized Pareto Distribution, assured by Extreme Value Theory (EVT). We show that, despite the appropriateness of the return periods given by the POT approach, introducing the fractal dimension provides accurate interpretation results, which can improve the apprehension of rainfall occurrences.
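
An illustrative sketch of the two stages (synthetic data and a simple quantile threshold, not the Oujda series or the paper's optimal threshold):

```python
# Stage 1: fit a Generalized Pareto Distribution to excesses over threshold u.
# Stage 2: estimate a box-counting dimension of the exceedance times.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
series = rng.gamma(shape=2.0, scale=10.0, size=2000)   # stand-in index series

u = np.quantile(series, 0.95)
excesses = series[series > u] - u
shape, loc, scale = genpareto.fit(excesses, floc=0)
print("GPD shape, scale:", shape, scale)

times = np.flatnonzero(series > u) / series.size       # normalized exceedance times
for eps in [0.1, 0.05, 0.025, 0.0125]:
    boxes = np.unique(np.floor(times / eps)).size
    print(f"eps={eps}: N(eps)={boxes}")   # slope of log N vs -log eps ~ dimension
```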

Keywords: extreme value theory, fractal dimensions, peaks over threshold, rainfall occurrences

Procedia PDF Downloads 360
25536 Structural Health Monitoring Using Fibre Bragg Grating Sensors in Slab and Beams

Authors: Pierre van Tonder, Dinesh Muthoo, Kim Twiname

Abstract:

Many existing and newly built structures are constructed on the basis of the engineer's design and the workmanship of the construction company. However, for larger structures where more people are exposed to the building, structural integrity is of great importance for the safety of occupants (Raghu, 2013). But how can the structural integrity of a building be monitored efficiently and effectively? This is where the fourth industrial revolution steps in: with minimal human interaction, data can be collected, analysed, and stored, and inconsistencies in the collected data can be flagged. This is where the fibre Bragg grating (FBG) monitoring system is introduced. This paper illustrates how data can be collected and converted to develop stress-strain behaviour and to produce bending moment diagrams for assessing and predicting the structure's integrity. Embedded fibre optic sensors, fibre Bragg grating sensors in particular, were used in this study. The procedure made use of the wavelength-shift demodulation technique and gratings inscribed via the phase mask technique. The fibre optic sensors considered in this report were photosensitive and embedded in the slab and beams for data collection and analysis. Two sets of fibre cables were inserted: one purposely to collect temperature recordings and the other to collect strain and temperature. The data were collected over a period of time, analysed, and used to produce bending moment diagrams in order to make predictions of the structure's integrity. The data indicated that the fibre Bragg grating sensing system is useful and can be used for structural health monitoring in any environment. From the experimental data for the slab and beams, the moments were found to be 64.33 kN.m, 64.35 kN.m, and 45.20 kN.m (from the experimental bending moment diagram), whereas the idealized Ultimate Limit State analysis gave 133 kN.m and 226.2 kN.m. The difference in values gave room for an early warning system; in other words, a reserve capacity of approximately 50% to failure.
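
For context, the standard FBG demodulation relation (textbook form with typical silica constants, not the authors' calibration) separates strain from temperature, which is why a second, strain-isolated fibre is used:

```python
# Relative Bragg wavelength shift: dL/L0 = (1 - p_e)*strain + (alpha + xi)*dT
P_E   = 0.22      # effective photo-elastic coefficient of silica (typical)
ALPHA = 0.55e-6   # thermal expansion coefficient, 1/K (typical silica)
XI    = 8.6e-6    # thermo-optic coefficient, 1/K (typical silica)

def strain_from_shift(d_lambda_nm, lambda0_nm, d_temp_k):
    """Strain with the temperature term (from the second fibre) removed."""
    rel_shift = d_lambda_nm / lambda0_nm
    return (rel_shift - (ALPHA + XI) * d_temp_k) / (1.0 - P_E)

# e.g., a 0.12 nm shift on a 1550 nm grating with a 2 K temperature rise
print(f"{strain_from_shift(0.12, 1550.0, 2.0) * 1e6:.1f} microstrain")
```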

Keywords: fibre bragg grating, structural health monitoring, fibre optic sensors, beams

Procedia PDF Downloads 138
25535 Land Cover Remote Sensing Classification Advanced Neural Networks Supervised Learning

Authors: Eiman Kattan

Abstract:

This study aims to evaluate the impact of classifying labelled remote sensing images with a convolutional neural network (CNN) architecture, namely AlexNet, on different land cover scenarios based on two remotely sensed datasets, from points of view such as computational time and performance. A set of experiments was conducted to assess the effectiveness of the selected network using two implementation approaches, named fully trained and fine-tuned, as sketched below. For validation purposes, two remote sensing datasets, AID and RSSCN7, which are publicly available and have different land cover features, were used in the experiments. These datasets have a wide diversity of input data, number of classes, amount of labelled data, and texture patterns. A specifically designed interactive deep learning GPU training platform for image classification (NVIDIA DIGITS) was employed in the experiments. It has shown efficiency in training, validation, and testing. As a result, the fully trained approach achieved only modest results on the two data sets, 73.346% on AID and 71.857% on RSSCN7, within 24 min 1 sec and 8 min 3 sec respectively. However, a dramatic improvement in classification performance was recorded with the fine-tuning approach: 92.5% and 91%, within 24 min 44 sec and 8 min 41 sec respectively. This conclusion opens up opportunities for better classification performance in various applications, such as agriculture and crop remote sensing.
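
A minimal PyTorch sketch contrasting the two approaches (illustrative only; the paper used NVIDIA DIGITS rather than this code):

```python
# Fully trained: random initialization, everything learned from the dataset.
# Fine-tuned: start from ImageNet weights, freeze the feature extractor, and
# learn only the resized classifier head.
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 30   # AID has 30 scene classes; RSSCN7 has 7

scratch = models.alexnet(weights=None)
scratch.classifier[6] = nn.Linear(4096, NUM_CLASSES)

finetune = models.alexnet(weights="IMAGENET1K_V1")
for p in finetune.features.parameters():
    p.requires_grad = False
finetune.classifier[6] = nn.Linear(4096, NUM_CLASSES)
```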

Keywords: convolutional neural network, remote sensing, land cover, land use

Procedia PDF Downloads 369
25534 Formulating Rough Approximations in Information Tables with Possibilistic Information

Authors: Michinori Nakata, Hiroshi Sakai

Abstract:

A rough set, which consists of lower and upper approximations, is formulated in information tables containing possibilistic information. First, lower and upper approximations based on possible world semantics, in the same way as Lipski did in the field of incomplete databases, are shown in order to clarify the fundamentals of rough sets under possibilistic information. Possibility and necessity measures are used, as is done in possibilistic databases. As a result, each object has certain and possible membership degrees in the lower and upper approximations, and these degrees are the lower and upper bounds. Therefore, the degree to which an object belongs to the lower and upper approximations is expressed by an interval value, and the complementary property linking the lower and upper approximations holds, as it does under complete information. Second, the approach based on indiscernibility relations, proposed by Dubois and Prade, is extended to three cases. The first case is that the objects used to approximate a set of objects are characterized by possibilistic information. The second case is that the objects used to approximate a set of objects with possibilistic information are characterized by complete information. The third case is that objects characterized by possibilistic information approximate a set of objects with possibilistic information. The extended approach creates the same results as the approach based on possible world semantics, which justifies our extension.
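
For reference, the classical complete-information approximations that the paper generalizes are, for an indiscernibility relation R with equivalence classes [x]_R and a set X of objects over a universe U:

```latex
\underline{R}X = \{\, x \mid [x]_R \subseteq X \,\}, \qquad
\overline{R}X = \{\, x \mid [x]_R \cap X \neq \emptyset \,\}
```

with the complementary property $\underline{R}(U \setminus X) = U \setminus \overline{R}X$; the abstract's interval-valued membership degrees reduce to these crisp sets when the information is complete.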

Keywords: rough sets, possibilistic information, possible world semantics, indiscernibility relations, lower approximations, upper approximations

Procedia PDF Downloads 321
25533 Sentiment Analysis of Ensemble-Based Classifiers for E-Mail Data

Authors: Muthukumarasamy Govindarajan

Abstract:

Detection of unwanted, unsolicited mail, called spam, is an interesting area of research. It is necessary to evaluate the performance of any new spam classifier using standard data sets. Recently, ensemble-based classifiers have gained popularity in this domain. In this research work, an efficient email filtering approach based on ensemble methods is addressed for developing an accurate and sensitive spam classifier. The proposed approach employs Naive Bayes (NB), Support Vector Machine (SVM), and Genetic Algorithm (GA) as base classifiers along with different ensemble methods. The experimental results show that the ensemble classifier performed with greater accuracy than the individual classifiers, and the hybrid model results were better than those of the combined models for the e-mail dataset. The proposed ensemble-based classifiers turn out to be good in terms of classification accuracy, which is considered an important criterion for building a robust spam classifier.
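
A rough scikit-learn sketch of this kind of ensemble (illustrative feature set; the GA-based member and the arcing/bagging variants are omitted here):

```python
# NB and SVM base classifiers over TF-IDF features, combined by hard voting.
from sklearn.naive_bayes import MultinomialNB
from sklearn.svm import LinearSVC
from sklearn.ensemble import VotingClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline

emails = ["win a free prize now", "meeting agenda attached",
          "cheap pills online", "lunch tomorrow?"]
labels = [1, 0, 1, 0]   # 1 = spam

ensemble = make_pipeline(
    TfidfVectorizer(),
    VotingClassifier([("nb", MultinomialNB()), ("svm", LinearSVC())],
                     voting="hard"),
)
ensemble.fit(emails, labels)
print(ensemble.predict(["claim your free prize"]))
```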

Keywords: accuracy, arcing, bagging, genetic algorithm, Naive Bayes, sentiment mining, support vector machine

Procedia PDF Downloads 140
25532 River Network Delineation from Sentinel 1 Synthetic Aperture Radar Data

Authors: Christopher B. Obida, George A. Blackburn, James D. Whyatt, Kirk T. Semple

Abstract:

In many regions of the world, especially in developing countries, river network data are outdated or completely absent, yet such information is critical for supporting important functions such as flood mitigation efforts, land use and transportation planning, and the management of water resources. In this study, a method was developed for delineating river networks using Sentinel 1 imagery. Unsupervised classification was applied to multi-temporal Sentinel 1 data to discriminate water bodies from other land covers; the outputs were then combined to generate a single persistent water bodies product. A thinning algorithm was next used to delineate river centre lines, which were converted into vector features and built into a topologically structured geometric network. The complex river system of the Niger Delta was used to compare the performance of the Sentinel-based method against alternative freely available water body products from the United States Geological Survey, the European Space Agency, and OpenStreetMap, and against a river network derived from a Shuttle Radar Topography Mission Digital Elevation Model. From both raster-based and vector-based accuracy assessments, it was found that the Sentinel-based river network products were superior to the comparator data sets by a substantial margin. The geometric river network that was constructed permitted a flow routing analysis, which is important for a variety of environmental management and planning applications. The extracted network will potentially be applied for modelling the dispersion of hydrocarbon pollutants in Ogoniland, a part of the Niger Delta. The approach developed in this study holds considerable potential for generating up-to-date, detailed river network data for the many countries where such data are deficient.
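
A compact sketch of the thinning step (using scikit-image as a stand-in; the authors' toolchain is not specified):

```python
# Skeletonize a persistent water mask into 1-pixel-wide centre lines, which
# can then be vectorized into a geometric network.
import numpy as np
from skimage.morphology import skeletonize

water_mask = np.zeros((100, 100), dtype=bool)
water_mask[45:55, :] = True           # stand-in for a classified river reach

centreline = skeletonize(water_mask)  # river centre line
print(centreline.sum(), "centre-line pixels")
```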

Keywords: Sentinel 1, image processing, river delineation, large scale mapping, data comparison, geometric network

Procedia PDF Downloads 137
25531 Comparative Study of Estimators of Population Means in Two Phase Sampling in the Presence of Non-Response

Authors: Syed Ali Taqi, Muhammad Ismail

Abstract:

A comparative study of estimators of population means in two-phase sampling in the presence of non-response is made for the case where the population means of the auxiliary variable(s) are unknown and information on the study variable y, as well as on the auxiliary variable(s), is incomplete. Three real data sets, on university students, hospitals, and unemployment, are used to compare all the available two-phase sampling techniques under non-response with the newly generalized ratio estimators.
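
For background, the classical ratio estimator in two-phase sampling (the standard textbook form, not the paper's newly generalized estimators) adjusts the second-phase mean by the first-phase auxiliary mean:

```latex
\hat{\bar{Y}}_{rd} \;=\; \bar{y}\,\frac{\bar{x}'}{\bar{x}}
```

where $\bar{x}'$ is the auxiliary mean from the large first-phase sample and $\bar{y}$, $\bar{x}$ are the means from the second-phase subsample; non-response schemes typically replace these means with adjusted versions computed from the responding and sub-sampled units.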

Keywords: two-phase sampling, ratio estimator, product estimator, generalized estimators

Procedia PDF Downloads 233
25530 A Radiomics Approach to Predict the Evolution of Prostate Imaging Reporting and Data System Score 3/5 Prostate Areas in Multiparametric Magnetic Resonance

Authors: Natascha C. D'Amico, Enzo Grossi, Giovanni Valbusa, Ala Malasevschi, Gianpiero Cardone, Sergio Papa

Abstract:

Purpose: To characterize, through a radiomic approach, the nature of areas classified PI-RADS (Prostate Imaging Reporting and Data System) 3/5, recognized in multiparametric prostate magnetic resonance with T2-weighted (T2w), diffusion, and perfusion sequences with paramagnetic contrast. Methods and Materials: 24 cases undergoing multiparametric prostate MR and biopsy were admitted to this pilot study. The clinical outcome of the PI-RADS 3/5 areas was established through biopsy, which found 8 malignant tumours. The analysed images were acquired with a Philips Achieva 1.5T machine with a CE-T2-weighted sequence in the axial plane. Semi-automatic tumour segmentation was carried out on the MR images using the 3DSlicer image analysis software. 45 shape-based, intensity-based, and texture-based features were extracted and represented the input for preprocessing. An evolutionary algorithm (a TWIST system based on the KNN algorithm) was used to subdivide the dataset into training and testing sets and to select the features yielding the maximal amount of information. After this preprocessing, 20 input variables were selected, and different machine learning systems were used to develop a predictive model based on a training-testing crossover procedure. Results: The best machine learning system (a three-layer feed-forward neural network) obtained a global accuracy of 90% (80% sensitivity and 100% specificity) with an ROC of 0.82. Conclusion: Machine learning systems coupled with radiomics show promising potential in distinguishing benign from malignant tumours in PI-RADS 3/5 areas.

Keywords: machine learning, MR prostate, PI-RADS 3, radiomics

Procedia PDF Downloads 186
25529 Multi-Objective Production Planning Problem: A Case Study of Certain and Uncertain Environment

Authors: Ahteshamul Haq, Srikant Gupta, Murshid Kamal, Irfan Ali

Abstract:

This case study designs and builds a multi-objective production planning model for a hardware firm with certain and uncertain data. During interactions, the manager of the firm indicated that some of the parameters may be vague. This vagueness in the formulated model is handled through fuzzy set theory. Triangular and trapezoidal fuzzy numbers are used to represent the uncertainty in the collected data. The fuzzy values are defuzzified into crisp form using the well-known graded mean integration representation method, sketched below. The proposed model attempts to maximize the firm's production and the profit related to the manufactured items, and to minimize the inventory carrying costs, in both certain and uncertain environments. The recommended optimal plan is determined via a fuzzy programming approach, and the formulated models are solved using the optimization software LINGO 16.0 to obtain the optimal production plan. The proposed model yields an efficient compromise solution with the overall satisfaction of the decision maker.
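
A sketch of the named defuzzification step, the graded mean integration representation, for triangular and trapezoidal fuzzy numbers (the demand figures are made up, not the firm's data):

```python
def gmir_triangular(a, b, c):
    """Crisp value of a triangular fuzzy number (a, b, c)."""
    return (a + 4 * b + c) / 6

def gmir_trapezoidal(a, b, c, d):
    """Crisp value of a trapezoidal fuzzy number (a, b, c, d)."""
    return (a + 2 * b + 2 * c + d) / 6

print(gmir_triangular(90, 100, 115))      # e.g., fuzzy demand -> crisp demand
print(gmir_trapezoidal(80, 95, 105, 120))
```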

Keywords: production planning problem, multi-objective optimization, fuzzy programming, fuzzy sets

Procedia PDF Downloads 212
25528 Implementation of a Web-Based Clinical Outcomes Monitoring and Reporting Platform across the Fortis Network

Authors: Narottam Puri, Bishnu Panigrahi, Narayan Pendse

Abstract:

Background: Clinical outcomes are the globally agreed upon, evidence-based, measurable changes in health or quality of life resulting from patient care. Reporting of outcomes and their continuous monitoring provides an opportunity for both assessing and improving the quality of patient care. In 2012, the International Consortium for Health Outcomes Measurement (ICHOM) was founded, which has defined global Standard Sets for measuring the outcomes of various treatments. Method: Monitoring of clinical outcomes was identified as a pillar of Fortis' core value of Patient Centricity. The project was started as an in-house Clinical Outcomes Reporting Portal developed by the Fortis Medical IT team, using the Standard Sets of outcome measurement developed by ICHOM. A pilot was run at Fortis Escorts Heart Institute from Aug '13 to Dec '13. Starting Jan '14, it was implemented across 11 hospitals of the group. The scope was hospital-wide and covered the major clinical specialties: cardiac sciences, orthopedics, and joint replacement. The internally developed portal had limitations in report generation, and the capture of patient-reported outcomes was restricted. A year later, the company provisioned an ICHOM-certified software product which could provide a platform for data capture and reporting and ensure compliance with all ICHOM requirements. A year after the launch of the software, Fortis Healthcare became the first healthcare provider in Asia to publish clinical outcomes data for the Coronary Artery Disease Standard Set (comprising Coronary Artery Bypass Graft and Percutaneous Coronary Interventions) in the public domain (Jan 2016). Results: This project has helped firmly establish a culture of monitoring and reporting clinical outcomes across Fortis hospitals. Given the diverse nature of the healthcare delivery model at the Fortis network, which comprises hospitals of varying size and specialty mix and practically covers the entire span of the country, standardization of data collection and reporting methodology is a huge achievement in itself. 95% case reporting was achieved, with more than 90% data completion, at the end of Phase 1 (March 2016). Post implementation, the group now has one year of data from its own hospitals. This has helped identify the gaps, plan ways to bridge them, and establish internal benchmarks for continual improvement. The value created for the group also includes: 1. The entire Fortis community has been sensitized to the importance of clinical outcomes monitoring for patient-centric care; initial skepticism and cynicism have been countered by effective stakeholder engagement and automation of processes. 2. Measuring quality is the first step in improving quality; data analysis has helped compare clinical results with best-in-class hospitals and identify improvement opportunities. 3. The clinical fraternity is extremely pleased to be part of this initiative and has taken ownership of the project. Conclusion: Fortis Healthcare is a pioneer in the monitoring of clinical outcomes. Implementation of ICHOM standards has helped the Fortis Clinical Excellence Program improve patient engagement and strengthen its commitment to the core value of Patient Centricity. Validation and certification of the clinical outcomes data by an ICHOM-certified supplier adds confidence to its claim of being a leader in this space.

Keywords: clinical outcomes, healthcare delivery, patient centricity, ICHOM

Procedia PDF Downloads 236
25527 Knowledge and Use of Computer Application Packages by Office Managers/Secretaries in Higher Institutions in Ogun State Nigeria: Implication on Performance Enhancement

Authors: Charlotte Bose Iro-Idoro, Adebisi Folake Osore, Tajudeen Adisa Jimoh

Abstract:

All changes in the office environment were, and still are, driven by advances in technology. The impact of computers on office work has resulted in numerous changes in office activities, procedures, and the expectations of office managers and secretaries. This study investigated the level of knowledge and use of computer office application packages by secretaries and office managers in higher educational institutions in Ogun State and the implications of these for their performance enhancement. The study is ex post facto research and adopted the survey design for data collection. Two hypotheses were formulated, and a questionnaire was developed and tested at the 0.05 level of significance. All office managers and secretaries in the service of higher educational institutions in Ogun State, Nigeria formed the population of the study. The study was limited to federal institutions, and a total of 120 office managers/secretaries were selected to form the sample, such that 40 office managers/secretaries were randomly selected from each of the three federal higher institutions in the state, that is, the Federal University of Agriculture, Abeokuta, the Federal Polytechnic, Ilaro, and the Federal College of Education, Osiele, Abeokuta, Ogun State. Analysis of data and hypothesis tests were carried out with frequency counts, percentages, and t-tests. The results indicated varying levels of awareness of office application tools, with limited knowledge and use of computer application packages by office managers/secretaries. The results also showed that good knowledge and high use of office application tools enhance the performance of office managers/secretaries. The study recommended maximum institutional resources and support, as well as personal development on the part of the office managers, to ensure up-to-date knowledge and maximal use of office application tools.

Keywords: application packages, computer, office managers, performance enhancement

Procedia PDF Downloads 179
25526 A Low-Area Fully-Reconfigurable Hardware Design of Fast Fourier Transform System for 3GPP-LTE Standard

Authors: Xin-Yu Shih, Yue-Qu Liu, Hong-Ru Chou

Abstract:

This paper presents a low-area and fully-reconfigurable Fast Fourier Transform (FFT) hardware design for the 3GPP-LTE communication standard. It fully supports 32 different FFT sizes, up to 2048 FFT points. In addition, a special processing element is developed to make reconfigurable computing possible, and a first-in first-out (FIFO) scheduling scheme design technique is proposed for hardware-friendly FIFO resource arrangement. In a synthesized chip realization in TSMC 40 nm CMOS technology, the hardware circuit occupies a core area of only 0.2325 mm² and dissipates 233.5 mW at a maximal operating frequency of 250 MHz.

Keywords: reconfigurable, fast Fourier transform (FFT), single-path delay feedback (SDF), 3GPP-LTE

Procedia PDF Downloads 277
25525 The Administration of Infectious Diseases During the COVID-19 Pandemic and the Role of Differential Diagnosis with the Biomarker VB10

Authors: Sofia Papadimitriou

Abstract:

INTRODUCTION: The differential diagnosis between acute viral and bacterial infections is an important cost-effectiveness parameter at the treatment stage: it allows the maximum therapeutic benefit to be achieved at minimum cost while ensuring the proper use of antibiotics. The discovery of sensitive and robust molecular diagnostic tests reading the host response to infection has enhanced the accurate diagnosis and differentiation of infections. METHOD: The study used six independent sets of blood samples (756 in total) associated with human protein-protein interactions, each of which, at the transcription stage, expresses a different host-network response to viral versus bacterial infection. The individual blood samples were subjected to a sequence of computational filters that identify a gene panel corresponding to an autonomous diagnostic score; the data set and gene panel underpin a new diagnostic, Bangalore-Viral Bacterial (BL-VB). FINDING: We use a blood-based biomarker of 10 genes (Panel-VB) that has significant prognostic value for distinguishing viral from bacterial infections, with a weighted mean AUROC of 0.97 (95% CI: 0.96-0.99) in eleven independent sample sets (n=898). We derived a patient-level score (VB10) based on the panel, which has significant diagnostic value with a weighted mean AUROC of 0.94 (95% CI: 0.91-0.98) in 2996 patient samples from 56 public data sets from 19 different countries. We also studied VB10 in a new South Indian cohort (BL-VB, n=56) and found 97% accuracy in confirmed cases of viral and bacterial infections. We found that VB10 (a) accurately identifies the type of infection even in culture-negative, unspecified cases, (b) reflects the patient's clinical recovery, and (c) applies to all age groups, covering a wide spectrum of acute bacterial and viral infections, including those from non-specific pathogens. We applied our VB10 score to publicly available COVID-19 data and found that it diagnosed viral infection in those patient samples. RESULTS: The results showed the diagnostic power of the VB10 biomarker as a test for the accurate diagnosis of acute infections and for monitoring recovery. We anticipate that it will aid clinical decisions on antibiotic prescribing and integrate into antibiotic stewardship policies. CONCLUSIONS: Overall, we report a new property of an RNA-based biomarker and a new blood test to differentiate between viral and bacterial infections, to assist physicians in designing optimal treatment regimens, to contribute to the proper use of antibiotics, and to reduce the burden of antimicrobial resistance (AMR).

Keywords: acute infections, antimicrobial resistance, biomarker, blood transcriptome, systems biology, classifier diagnostic score

Procedia PDF Downloads 155
25524 Artificial Intelligence and Distributed System Computing: Application and Practice in Real Life

Authors: Lai Junzhe, Wang Lihao, Burra Venkata Durga Kumar

Abstract:

In recent years, owing to global technological advances, big data and artificial intelligence technologies have been widely used in various industries and fields, playing an important role in reducing costs and increasing efficiency. In its continuous progress, and with the continuous development of computing expertise, artificial intelligence has derived another branch: distributed artificial intelligence computing systems. Distributed AI is a method for solving complex learning, decision-making, and planning problems, characterized by the ability to take advantage of large-scale computation and the spatial distribution of resources; accordingly, it can handle problems with large data sets. Nowadays, distributed AI is widely used in military, medical, and everyday applications, bringing great convenience and efficient operation to life. In this paper, we discuss three areas of distributed AI computing systems, vision processing, blockchain, and the smart home, to introduce the performance of distributed systems and the role of AI in them.

Keywords: distributed system, artificial intelligence, blockchain, IoT, visual information processing, smart home

Procedia PDF Downloads 111