Search results for: continuous data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26736

24636 Generating Real-Time Visual Summaries from Located Sensor-Based Data with Chorems

Authors: Z. Bouattou, R. Laurini, H. Belbachir

Abstract:

This paper describes a new approach for the automatic generation of visual summaries, combining cartographic visualization methods with real-time modeling of sensor data. The concept of chorems is an interesting candidate for visualizing real-time summaries of geographic databases. Chorems were defined by Roger Brunet (1980) as schematized visual representations of territories; however, existing chorematic map approaches do not yet handle temporal information, an issue discussed in this paper. Our approach is based on spatial analysis: the values recorded at the same time by the available sensors provide a set of distributed observations over the study areas, and spatial interpolation methods are used to derive concentration fields from them. From these fields, spatial data mining procedures applied on the fly extract important patterns as geographic rules, which are then visualized as chorems.
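
The abstract does not name the interpolation method used; inverse distance weighting (IDW) is one common choice for turning scattered, same-timestamp sensor readings into a concentration field. The following Python sketch is illustrative only, with hypothetical sensor coordinates and values:

```python
import numpy as np

def idw_field(xs, ys, values, grid_x, grid_y, power=2.0):
    """Interpolate scattered sensor readings onto a regular grid
    with inverse distance weighting (IDW)."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    field = np.zeros_like(gx, dtype=float)
    for i in range(gx.shape[0]):
        for j in range(gx.shape[1]):
            d = np.hypot(xs - gx[i, j], ys - gy[i, j])
            if d.min() < 1e-9:           # grid point coincides with a sensor
                field[i, j] = values[d.argmin()]
            else:
                w = 1.0 / d**power       # closer sensors weigh more
                field[i, j] = np.sum(w * values) / np.sum(w)
    return field

# Hypothetical readings taken by three sensors at the same timestamp
xs = np.array([0.1, 0.5, 0.9]); ys = np.array([0.2, 0.8, 0.4])
vals = np.array([12.0, 30.0, 18.0])
field = idw_field(xs, ys, vals, np.linspace(0, 1, 50), np.linspace(0, 1, 50))
```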

Keywords: geovisualization, spatial analytics, real-time, geographic data streams, sensors, chorems

Procedia PDF Downloads 401
24635 Need for Privacy in the Technological Era: An Analysis in the Indian Perspective

Authors: Amrashaa Singh

Abstract:

In the digital age and the vast cyberspace, data protection and privacy have become major issues. There was a time when social media and online shopping websites were treated as a blessing; now the tables have turned, and people have started to look at them with suspicion. They are becoming aware of the privacy implications and no longer feel as safe as they initially did. When Edward Snowden informed the world about the snooping carried out by United States security agencies, the picture became clear. After the Cambridge Analytica case, in which the data of Facebook users was harvested without their consent, people began to doubt how safe they actually are. In India, the Pegasus spyware case also raised serious concerns: it was used to snoop on many human rights activists and lawyers, while the company that developed the spyware claims it sells only to governments. This paper deals with privacy concerns from the Indian perspective using an analytical methodology. The Supreme Court of India recently declared the right to privacy a Fundamental Right under Article 21 of the Constitution of India, and the Government is also working on the Data Protection Bill. Notably, India is still a developing country, and with the bill the government aims at data localization; yet many doubt whether the Government would in fact snoop on individuals' data, making the bill look more like an attempt to curb dissenters 'lawfully'. The paper focuses on these issues in India in light of the European Union (EU) General Data Protection Regulation (GDPR), on which the Indian Data Protection Bill is said to be loosely based. How helpful these laws would actually be is another concern, since the economic and social conditions in the two jurisdictions are very different. The paper discusses these concerns, examines the intention of the government behind the bill, and considers how nations can act together to draft common regulations so that there is some uniformity in the laws and their application.

Keywords: Article 21, data protection, dissent, fundamental right, India, privacy

Procedia PDF Downloads 114
24634 An Online 3D Modeling Method Based on a Lossless Compression Algorithm

Authors: Jiankang Wang, Hongyang Yu

Abstract:

This paper proposes a portable online 3D modeling method. The method first collects data with a depth camera and compresses the depth data using a frame-by-frame lossless compression method, while the color image is encoded in the H.264 format. After the cloud receives the color and depth images, a 3D modeling method based on BundleFusion completes the 3D reconstruction. The results of this study indicate that the method is portable, online, and efficient, and has a wide range of application prospects.
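
The paper's exact codec is not specified beyond "frame-by-frame lossless"; as a minimal sketch of the idea, each 16-bit depth frame can be compressed independently with a general-purpose lossless compressor such as zlib, guaranteeing bit-exact reconstruction. The frame shape and synthetic data below are assumptions:

```python
import zlib
import numpy as np

def compress_depth_frame(depth: np.ndarray) -> bytes:
    """Losslessly compress one 16-bit depth frame (e.g., 640x480)."""
    return zlib.compress(depth.astype(np.uint16).tobytes(), level=6)

def decompress_depth_frame(blob: bytes, shape=(480, 640)) -> np.ndarray:
    """Exact reconstruction -- no depth precision is lost."""
    return np.frombuffer(zlib.decompress(blob), dtype=np.uint16).reshape(shape)

frame = (np.random.rand(480, 640) * 4000).astype(np.uint16)  # synthetic depth in mm
blob = compress_depth_frame(frame)
assert np.array_equal(frame, decompress_depth_frame(blob))    # lossless round trip
```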

Keywords: 3D reconstruction, BundleFusion, lossless compression, depth image

Procedia PDF Downloads 82
24633 H∞ Sampled-Data Control for Linear Systems with Time-Varying Delays: Application to Power System

Authors: Chang-Ho Lee, Seung-Hoon Lee, Myeong-Jin Park, Oh-Min Kwon

Abstract:

This paper investigates improved stability criteria for sampled-data control of linear systems with disturbances and time-varying delays. Based on Lyapunov-Krasovskii stability theory, delay-dependent conditions sufficient to ensure H∞ stability of the system are derived in the form of linear matrix inequalities (LMIs). The effectiveness of the proposed method is demonstrated through numerical examples.
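
The abstract does not reproduce the functional itself; one standard form of Lyapunov-Krasovskii functional from which such delay-dependent LMI conditions are derived is, as a sketch:

```latex
V(x_t) = x^{T}(t) P x(t)
       + \int_{t-h(t)}^{t} x^{T}(s)\, Q\, x(s)\, ds
       + h_M \int_{-h_M}^{0}\!\int_{t+\theta}^{t} \dot{x}^{T}(s)\, R\, \dot{x}(s)\, ds\, d\theta,
\qquad P, Q, R \succ 0,
```

where h(t) ∈ [0, h_M] is the time-varying delay. Requiring \(\dot{V} + z^{T} z - \gamma^{2} w^{T} w < 0\) along trajectories yields delay-dependent LMIs whose feasibility guarantees an H∞ disturbance attenuation level γ.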

Keywords: sampled-data control system, Lyapunov-Krasovskii functional, time delay-dependent, LMI, H∞ control

Procedia PDF Downloads 320
24632 Integrated Coastal Management for the Sustainable Development of Coastal Cities: The Case of El-Mina, Tripoli, Lebanon

Authors: G. Ghamrawi, Y. Abunnasr, M. Fawaz, S. Yazigi

Abstract:

Coastal cities are constantly exposed to environmental degradation and economic regression, fueled by rapid and uncontrolled urban growth as well as continuous resource depletion. This is the case of the city of El-Mina in Tripoli (Lebanon), where a lack of awareness of the need to preserve social, ecological, and historical assets, coupled with increasing development pressures, threatens the socioeconomic status of the city's residents, the quality of life, and accessibility to the coast. To address these challenges, a holistic coastal urban design and planning approach was developed to analyze the environmental, political, legal, and socioeconomic context of the city. This approach investigates the potential of balancing urban development with the protection and enhancement of cultural, ecological, and environmental assets under an integrated coastal zone management (ICZM) approach. The analysis of El-Mina's different sectors adopted several tools, including direct field observation, interviews with stakeholders, analysis of available data, historical maps, and previously proposed projects. The findings from the analysis were mapped and graphically represented, allowing the recognition of character zones that become the design intervention units. Consequently, the thesis proposes an urban, city-scale intervention that identifies six character zones (the historical fishing port, Abdul Wahab island, the abandoned Port Said, Hammam el Makloub, the sand beach, and the new developable area) and proposes context-specific design interventions that capitalize on the main characteristics of each zone. Moreover, the intervention builds on the institutional framework of ICZM as well as earlier studies of the coast, and adopts nature-based solutions with hybrid systems to provide better environmental design solutions for developing the coast. This enables the realization of an all-inclusive, well-connected shoreline with easy and free access to the sea; a developed shoreline with an active local economy; and an improved urban environment.

Keywords: blue green infrastructure, coastal cities, hybrid solutions, integrated coastal zone management, sustainable development, urban planning

Procedia PDF Downloads 156
24631 Logistics Information Systems in the Distribution of Flour in Nigeria

Authors: Cornelius Femi Popoola

Abstract:

This study investigated logistics information systems in the distribution of flour in Nigeria. A case study design was used, and 50 staff of Honeywell Flour Mill were sampled for the study. Data generated through a questionnaire were analysed using correlation and regression analysis. The findings revealed that logistics information systems such as e-commerce, interactive telephone systems, and electronic data interchange correlated positively with the distribution of flour at Honeywell Flour Mill. The findings further showed that e-commerce, interactive telephone systems, and electronic data interchange jointly and positively contribute to the distribution of flour at Honeywell Flour Mill in Nigeria (R = .935; Adj. R2 = .642; F (3,47) = 14.739; p < .05). The study therefore recommends that Honeywell Flour Mill upgrade its logistics information systems to computer-to-computer communication of business transactions and documents, and adopt new technologies such as tracking-and-tracing systems (barcode scanning for packages and pallets), tracking vehicles with the Global Positioning System (GPS), measuring vehicle performance with 'black boxes' (containing logistics data), and Automatic Equipment Identification (AEI).
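
The reported joint contribution (R, adjusted R², F-statistic) comes from a multiple regression of distribution on the three logistics-IS predictors. The abstract does not say which software was used; a minimal Python sketch with statsmodels and hypothetical column names would be:

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical questionnaire scores; file and column names are illustrative
df = pd.read_csv("survey.csv")  # e_commerce, phone_systems, edi, distribution
X = sm.add_constant(df[["e_commerce", "phone_systems", "edi"]])
model = sm.OLS(df["distribution"], X).fit()
print(model.summary())  # reports R-squared, adjusted R-squared, F-statistic, p-values
```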

Keywords: e-commerce, electronic data interchange, flour distribution, information system, interactive telephone systems

Procedia PDF Downloads 553
24630 Cascaded Neural Network for Internal Temperature Forecasting in Induction Motor

Authors: Hidir S. Nogay

Abstract:

In this study, two systems were created to predict the internal temperature of an induction motor. The first is a simple ANN model with two layers, ten input parameters, and one output parameter. The second consists of eight ANN models connected to one another in cascade, with 17 inputs in total. The main reason for using the cascaded system is to achieve more accurate estimation by increasing the number of inputs to the ANN system; the cascaded system is compared with the simple conventional ANN model to verify this advantage. The dataset, comprising 329 samples, was obtained from experimental applications, and a small part of it was used to produce more readable graphs. 30% of the data was used for testing and validation; test and validation data were determined separately for each ANN model, and the reliability of each model was tested. The results show that the cascaded ANN system produced more accurate estimates than the conventional ANN model.
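
The cascade idea, where an earlier stage's output enlarges the input set of the next stage, can be sketched with two stages in scikit-learn; the architecture, layer sizes, and synthetic data below are assumptions, not the paper's eight-model design:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(329, 10))                                   # 10 measured motor inputs
y = X @ rng.normal(size=10) + rng.normal(scale=0.1, size=329)    # synthetic temperature

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Stage 1 predicts an intermediate estimate from the raw inputs
stage1 = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
stage1.fit(X_tr, y_tr)

# Stage 2 sees the raw inputs *plus* stage 1's output -- the "cascade"
X_tr2 = np.column_stack([X_tr, stage1.predict(X_tr)])
X_te2 = np.column_stack([X_te, stage1.predict(X_te)])
stage2 = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
stage2.fit(X_tr2, y_tr)
print("stage 2 R^2:", stage2.score(X_te2, y_te))
```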

Keywords: cascaded neural network, internal temperature, inverter, three-phase induction motor

Procedia PDF Downloads 345
24629 Big Data and Health: An Australian Perspective Which Highlights the Importance of Data Linkage to Support Health Research at a National Level

Authors: James Semmens, James Boyd, Anna Ferrante, Katrina Spilsbury, Sean Randall, Adrian Brown

Abstract:

‘Big data’ is a relatively new concept that describes data so large and complex that it exceeds the storage or computing capacity of most systems to perform timely and accurate analyses. Health services generate large amounts of data from a wide variety of sources such as administrative records, electronic health records, health insurance claims, and even smartphone health applications. Health data is viewed in Australia and internationally as highly sensitive. Strict ethical requirements must be met for the use of health data to support health research. These requirements differ markedly from those imposed on data use from industry or other government sectors and may have the impact of reducing the capacity of health data to be incorporated into the real-time demands of the Big Data environment. This ‘big data revolution’ is increasingly supported by national governments, who have invested significant funds into initiatives designed to develop and capitalize on big data and methods for data integration using record linkage. The benefits to health following research using linked administrative data are recognised internationally and by the Australian Government through the National Collaborative Research Infrastructure Strategy Roadmap, which outlined a multi-million dollar investment strategy to develop national record linkage capabilities. This led to the establishment of the Population Health Research Network (PHRN) to coordinate and champion this initiative. The purpose of the PHRN was to establish record linkage units in all Australian states, to support the implementation of secure data delivery and remote access laboratories for researchers, and to develop the Centre for Data Linkage for the linkage of national and cross-jurisdictional data. The Centre for Data Linkage has been established within Curtin University in Western Australia; it provides the essential record linkage infrastructure necessary for large-scale, cross-jurisdictional linkage of health-related data in Australia and uses a best practice ‘separation principle’ to support data privacy and security. Privacy-preserving record linkage technology is also being developed to link records without the use of names, to overcome important legal and privacy constraints. This paper will present the findings of the first ‘Proof of Concept’ project selected to demonstrate the effectiveness of increased record linkage capacity in supporting nationally significant health research. This project explored how cross-jurisdictional linkage can inform the nature and extent of cross-border hospital use and hospital-related deaths. The technical challenges associated with national record linkage, and the extent of cross-border population movements, were explored as part of this pioneering research project. Access to person-level data linked across jurisdictions identified geographical hot spots of cross-border hospital use and hospital-related deaths in Australia. This has implications for planning of health service delivery and for longitudinal follow-up studies, particularly those involving mobile populations.
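
One widely used technique for linking records without names (not necessarily the Centre's own implementation) encodes quasi-identifiers as Bloom filters and compares them by Dice similarity, so that approximate matching survives while the cleartext never leaves the data custodian. A minimal sketch, with illustrative parameters:

```python
import hashlib

def bloom_encode(name: str, size: int = 128, n_hashes: int = 4) -> set:
    """Encode a string's character bigrams into Bloom-filter bit positions."""
    bigrams = [name[i:i + 2] for i in range(len(name) - 1)]
    bits = set()
    for g in bigrams:
        for k in range(n_hashes):
            h = hashlib.sha256(f"{k}:{g}".encode()).hexdigest()
            bits.add(int(h, 16) % size)
    return bits

def dice(a: set, b: set) -> float:
    """Dice similarity of two bit sets; 1.0 means identical encodings."""
    return 2 * len(a & b) / (len(a) + len(b)) if (a or b) else 0.0

# Two spellings of the same person still score highly, without revealing the name
print(dice(bloom_encode("catherine smith"), bloom_encode("katherine smith")))
```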

Keywords: data integration, data linkage, health planning, health services research

Procedia PDF Downloads 216
24628 Spatial Variability of Brahmaputra River Flow Characteristics

Authors: Hemant Kumar

Abstract:

In Hindu mythology, the Brahmaputra River is known as the son of Lord Brahma. Living up to this name, the river causes mass destruction during the monsoon season in Assam, India, a state in the north-eastern part of the country and one of the essential states among the seven sister states of eastern India, through which almost the entire Brahmaputra flow passes; the other states carry its tributaries. In the present case study, spatial analysis was performed on a set of acquired MODIS images. Using change detection, aerosol spray content was identified during heavy rainfall and in the flooded monsoon season; in particular, the analysis of the Brahmaputra outflow delineates the flooded season. The charged-particle-associated aerosol content indicates the heavy water content below the ground surface, which is validated by trend analysis of rainfall spectrum data and confirmed by in-situ sampled data from different positions along the Brahmaputra River. Further, Hyperion hyperspectral data at 30 m resolution were used to map the sediment deposits, which is likewise confirmed by in-situ sampled data from different positions.

Keywords: aerosol, change detection, spatial analysis, trend analysis

Procedia PDF Downloads 147
24627 Attenuation of Pancreatic Histology, Hematology and Biochemical Parameters in Type 2 Diabetic Rats Treated with Azadirachta excelsa

Authors: S. Nurdiana, A. S. Nor Haziqah, M. K. Nur Ezwa Khairunnisa, S. Nurul Izzati, Y. Siti Amna M. J. Norashirene, I. Nur Hilwani

Abstract:

Azadirachta excelsa, locally known as sentang, is frequently used as a traditional medicine by diabetes patients in Malaysia. However, little attention has been given to its potential toxicity. This study therefore examines the protective effect of A. excelsa on the pancreas and determines possible toxicity mediated by the extract. Diabetes was induced experimentally in rats by a high-fat diet for 16 weeks, followed by intraperitoneal injection of streptozotocin at a dosage of 35 mg/kg of body weight. A decline in fasting blood glucose level was observed after continuous administration of A. excelsa for 14 days twice daily, attributable to the restored structure of the pancreas. Surprisingly, however, the plant extract reduced leukocytes, erythrocytes, hemoglobin, MCHC, and lymphocytes, and treated rats exhibited increases in AST and eosinophil levels. Overall, the findings show that A. excelsa possesses antidiabetic activity by improving the structure of the pancreatic islets of Langerhans, but it also alters hematological and biochemical parameters.

Keywords: Azadirachta excelsa, diabetes, pancreas, hemato-biochemical parameters

Procedia PDF Downloads 418
24626 Data Mining Model for Predicting the Status of HIV Patients during Drug Regimen Change

Authors: Ermias A. Tegegn, Million Meshesha

Abstract:

Human Immunodeficiency Virus and Acquired Immunodeficiency Syndrome (HIV/AIDS) is a major cause of death in most African countries, and Ethiopia is one of the most seriously affected countries in sub-Saharan Africa. Previously in Ethiopia, having HIV/AIDS was almost equivalent to a death sentence; with the introduction of Antiretroviral Therapy (ART), HIV/AIDS has become a chronic but manageable disease. This study applies data mining to predict the future living status of HIV/AIDS patients at the time of drug regimen change, when patients develop toxicity to their current ART drug combination. The data are taken from the University of Gondar Hospital ART program database. A hybrid methodology is followed to explore the application of data mining to the ART program dataset. Data cleaning, handling of missing values, and data transformation were used for preprocessing. The WEKA 3.7.9 data mining tools, classification algorithms, and domain expertise were utilized to address the research problem. Using four different classification algorithms (the J48 classifier, PART rule induction, Naïve Bayes, and a neural network) and adjusting their parameters, thirty-two models were built on the pre-processed University of Gondar ART program dataset. The performance of the models was evaluated using the standard metrics of accuracy, precision, recall, and F-measure. The most effective model for predicting the status of HIV patients with drug regimen substitution is a pruned J48 decision tree, with a classification accuracy of 98.01%. The study identifies informative attributes such as ever taking Cotrim, ever taking TbRx, CD4 count, age, weight, and gender for predicting the status at drug regimen substitution. The outcome of this study can be used as an assistant tool to help clinicians make more appropriate drug regimen substitutions. Future research directions are suggested toward an applicable system in the area of the study.
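
J48 is WEKA's implementation of the C4.5 decision tree. A close (though not identical) analog in Python is scikit-learn's CART-based DecisionTreeClassifier with entropy splitting and cost-complexity pruning; the file and feature names below are hypothetical encodings of the attributes the abstract lists:

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical numeric encoding of the attributes named in the abstract
df = pd.read_csv("art_patients.csv")  # ever_cotrim, ever_tbrx, cd4, age, weight, gender, status
X = df[["ever_cotrim", "ever_tbrx", "cd4", "age", "weight", "gender"]]
y = df["status"]  # living status at drug regimen change

# ccp_alpha > 0 prunes the tree, loosely mirroring pruned J48
tree = DecisionTreeClassifier(criterion="entropy", ccp_alpha=0.01, random_state=0)
print(cross_val_score(tree, X, y, cv=10, scoring="accuracy").mean())
```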

Keywords: HIV drug regimen, data mining, hybrid methodology, predictive model

Procedia PDF Downloads 142
24625 Internal Cycles from Hydrometric Data and Variability Detected Through Hydrological Modelling Results, on the Niger River, over 1901-2020

Authors: Salif Koné

Abstract:

We analyze hydrometric data at the Koulikoro station on the Niger River; the basin drains 120,600 km² and covers three countries in West Africa: Guinea, Mali, and Ivory Coast. Two subsequent decadal cycles are highlighted (1925-1936 and 1929-1939) instead of the single decadal cycle presumed in the literature. Moreover, the observed hydrometric data show a multidecadal 40-year period, which is confirmed when graphing the spatial coefficient of variation of runoff over successive decades (starting with 1901-1910). Spatial runoff data are produced on 48 grid cells (0.5 degree by 0.5 degree) with semi-distributed versions of both the SimulHyd and GR2M models, variants of a French hydrological model (Génie Rural with 2 parameters at a monthly time step). The two extremal decades in terms of the runoff coefficient of variation are compared: 1951-1960 has the minimal coefficient of variation, while 1981-1990 shows the maximal value during the three months of high water level (August, September, and October). Mapping the relative variation between these two decadal situations suggests the following hypothesis: the scale of variation between the two extremal situations could serve to fix boundary conditions for further simulations using climate-scenario data.
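
The decadal spatial coefficient of variation the authors graph can be computed as the across-cell standard deviation divided by the across-cell mean, averaged within each decade. A sketch with synthetic stand-in data (the real inputs are the 48-cell model runoff fields):

```python
import numpy as np
import pandas as pd

# Synthetic stand-in: monthly runoff for 48 half-degree cells, 1901-2020
months = pd.date_range("1901-01", "2020-12", freq="MS")
runoff = pd.DataFrame(np.random.gamma(2.0, 50.0, size=(len(months), 48)), index=months)

# Spatial CV per month = std across cells / mean across cells
spatial_cv = runoff.std(axis=1) / runoff.mean(axis=1)

# Average the spatial CV within each decade (1901-1910, 1911-1920, ...)
decade = ((spatial_cv.index.year - 1901) // 10) * 10 + 1901
print(spatial_cv.groupby(decade).mean())
```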

Keywords: internal cycles, hydrometric data, Niger River, GR2M and SimulHyd framework, runoff coefficient of variation

Procedia PDF Downloads 95
24624 The Incidence of Postoperative Atrial Fibrillation after Coronary Artery Bypass Grafting in Patients with Local and Diffuse Coronary Artery Disease

Authors: Kamil Ganaev, Elina Vlasova, Andrei Shiryaev, Renat Akchurin

Abstract:

De novo atrial fibrillation (AF) after coronary artery bypass grafting (CABG) is a common complication. To date, there are no data on the possible effect of diffuse coronary artery lesions on the incidence of postoperative AF. Methods. Patients operated on-pump under hypothermic conditions during the 2020 calendar year were studied. Inclusion criteria were isolated CABG and achievement of complete myocardial revascularization. Patients with a history of AF, moderate or severe valve dysfunction, hormonal thyroid pathology, or pre-existing congestive heart failure (CHF), as well as patients who developed perioperative complications (myocardial infarction, acute heart failure, massive blood loss) or died, were excluded. Thus, 227 patients were included; mean age 65±9 years; 69% were men. 89% of patients had three-vessel coronary disease; the remainder had two-vessel disease. Mean LV size: 3.9±0.3 cm; indexed LV volume: 29.4±5.3 mL/m². Two groups were considered: D (n=98), patients with diffuse coronary artery disease, and L (n=129), patients with local coronary artery disease. Clinical and demographic characteristics of the groups were comparable. Rhythm assessment: continuous bedside ECG monitoring for up to 5 days; ECG CT at 5-7 days after CABG; daily routine ECG registration. Follow-up period: the postoperative hospital stay. Results. The median follow-up period was 9 (7;11) days. Postoperative AF (POAF) was detected in 61/227 (27%) patients: 34/98 (35%) in group D versus 27/129 (21%) in group L; p<0.05. Moreover, while the revascularization indices in groups D and L (3.9±0.7 and 3.8±0.5, respectively) were equal, the mean cardiopulmonary bypass (CPB) time (107±27 and 80±13 min) and the mean ischemic time (67±17 and 55±11 min) were significantly longer in group D (p<0.05). However, a separate analysis of these parameters in patients with and without AF did not reveal any significant differences within group D (CPB time 99±21.2 min, ischemic time 63±12.2 min) or within group L (CPB time 88±13.1 min, ischemic time 58.7±13.2 min). Conclusion. With diffuse coronary lesions, the incidence of AF during the hospital period after isolated CABG clearly increases. To better understand the role of severe coronary atherosclerosis in the development of POAF, it is necessary to distinguish the influence of organic features of the atrial and ventricular myocardium (as a consequence of chronic coronary disease) from the features of surgical correction in diffuse coronary lesions.

Keywords: atrial fibrillation, diffuse coronary artery disease, coronary artery bypass grafting, local coronary artery disease

Procedia PDF Downloads 212
24623 A Novel Probabilistic Spatial Locality of Reference Technique for Automatic Cleansing of Digital Maps

Authors: A. Abdullah, S. Abushalmat, A. Bakshwain, A. Basuhail, A. Aslam

Abstract:

GIS (Geographic Information System) applications require geo-referenced data, which may be available as databases or in the form of digital or hard-copy agro-meteorological maps. These parameter maps are color-coded, with different regions corresponding to different parameter values, and converting such maps into a database is not very difficult. However, text and various planimetric elements overlaid on these maps make an accurate image-to-database conversion a challenging problem, because it is almost impossible to exactly restore what was underneath the text or icons; this points to the need for inpainting. In this paper, we propose a probabilistic inpainting approach that uses the probability of spatial locality of colors in the map to replace overlaid elements with the underlying color. We tested the limits of the proposed technique using non-textual simulated data and compared text-removal results with a popular image editing tool on public domain data, with promising results.
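
A much-simplified stand-in for the paper's probabilistic model: each masked (text/icon) pixel takes the most probable, i.e. most frequent, color among its unmasked neighbors, assuming a single-channel, color-indexed parameter map. A sketch:

```python
import numpy as np

def inpaint_mode(img: np.ndarray, mask: np.ndarray, win: int = 5) -> np.ndarray:
    """Replace masked pixels with the most frequent color among unmasked
    neighbors -- a crude form of spatial locality of colors.
    img: 2D array of color indices; mask: True where text/icons overlay."""
    out = img.copy()
    h, w = mask.shape
    r = win // 2
    for y, x in zip(*np.where(mask)):
        y0, y1 = max(0, y - r), min(h, y + r + 1)
        x0, x1 = max(0, x - r), min(w, x + r + 1)
        patch = img[y0:y1, x0:x1][~mask[y0:y1, x0:x1]]
        if patch.size:                        # any valid neighbors in the window?
            colors, counts = np.unique(patch, return_counts=True)
            out[y, x] = colors[counts.argmax()]
    return out
```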

Keywords: noise, image, GIS, digital map, inpainting

Procedia PDF Downloads 352
24622 Towards Automatic Calibration of In-Line Machine Processes

Authors: David F. Nettleton, Elodie Bugnicourt, Christian Wasiak, Alejandro Rosales

Abstract:

In this presentation, preliminary results are given for the modeling and calibration of two different industrial winding MIMO (Multiple Input Multiple Output) processes using machine learning techniques. In contrast to previous approaches, which have typically used ‘black-box’ linear statistical methods together with a definition of the mechanical behavior of the process, we use non-linear machine learning algorithms together with a ‘white-box’ rule induction technique to create a supervised model of the fitting error between the expected and real force measures. The final objective is to build a precise model of the winding process in order to control the tension of the material being wound in the first case, and the friction of the material passing through the die in the second case. Case 1, tension control of a winding process: a plastic web is unwound from a first reel, passes over a traction reel, and is rewound on a third reel. The objectives are (i) to train a model to predict the web tension and (ii) to calibrate the inputs, i.e., find the input values which result in a given tension. Case 2, friction force control of a micro-pullwinding process: a core plus resin passes through a first die, two winding units wind an outer layer around the core, and the material makes a final pass through a second die. The objectives are (i) to train a model to predict the friction on die 2 and (ii) to calibrate the inputs to achieve a given friction on die 2. Different machine learning approaches are tested to build the models: Kernel Ridge Regression, Support Vector Regression (with a Radial Basis Function kernel), and MPART (rule induction with a continuous-valued output). As a preliminary step, the MPART rule induction algorithm was used to build an explicative model of the error (the difference between expected and real friction on die 2); modeling the error behavior with explicative rules helps improve the overall process model. Once the models are built, the inputs are calibrated by generating Gaussian random numbers for each input (taking into account its mean and standard deviation) and comparing the output to a target (desired) output until the closest fit is found. Empirical testing shows that high precision is obtained for both the trained models and the calibration process. The learning step is the slowest part (at most 5 minutes for this data), but it can be done offline just once; the calibration step is much faster, obtaining a precision error of less than 1x10^-3 for both outputs in under one minute. To summarize, two processes have been modeled and calibrated with fast processing times and high precision, which can be further improved by using heuristics to guide the Gaussian calibration. Error behavior has been modeled to help improve overall process understanding. This is relevant for the quick, optimal set-up of many different industrial processes that use a pull-winding type process to manufacture fibre-reinforced plastic parts. Acknowledgements to the Openmind project, which is funded by Horizon 2020 European Union funding for Research & Innovation, Grant Agreement number 680820.
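
The Gaussian calibration loop described above amounts to: sample candidate inputs from each input's (mean, standard deviation), predict the output with the trained model, and keep the candidate closest to the target. A minimal sketch, assuming a trained regressor `model` with a scikit-learn-style predict method; the statistics and target below are placeholders:

```python
import numpy as np

def calibrate(model, means, stds, target, n_samples=10_000, rng=None):
    """Find input settings whose predicted output is closest to `target`
    by sampling each input from its own Gaussian (mean, std)."""
    rng = rng or np.random.default_rng(0)
    candidates = rng.normal(means, stds, size=(n_samples, len(means)))
    preds = model.predict(candidates)
    best = np.abs(preds - target).argmin()
    return candidates[best], preds[best]

# Usage (model, input statistics, and the target tension are assumptions):
# inputs, achieved = calibrate(model, means=[1.2, 0.4], stds=[0.1, 0.05], target=35.0)
```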

Keywords: data model, machine learning, industrial winding, calibration

Procedia PDF Downloads 241
24621 Evaluation of Urban Parks Based on POI Data: Taking Futian District of Shenzhen as an Example

Authors: Juanling Lin

Abstract:

The construction of urban parks is an important part of eco-city construction, and big data provides a more scientific and rational platform for assessing urban parks: it can identify and correct irrationalities in urban park planning at the macroscopic level and thereby promote more rational planning. This study builds an urban park assessment system based on urban road network data and POI data, taking Futian District of Shenzhen as the research object, and uses a geographic information system (GIS) to assess the district's park system in five respects: spatial distribution, accessibility, service capacity, demand, and the supply-demand relationship. The assessment system effectively reflects the current state of urban park construction and provides a useful exploration toward rationality and fairness in urban park planning.

Keywords: urban parks, assessment system, POI, supply and demand

Procedia PDF Downloads 42
24620 Copula-Based Estimation of Direct and Indirect Effects in Path Analysis Model

Authors: Alam Ali, Ashok Kumar Pathak

Abstract:

Path analysis is a statistical technique used to evaluate the strength of the direct and indirect effects of variables. One or more structural regression equations are used to estimate a series of parameters in order to find a better fit to the data. Sometimes exogenous variables do not show significant direct and indirect effects when the assumptions of classical regression (ordinary least squares, OLS) are violated by the nature of the data. The main motive of this article is to investigate the efficacy of the copula-based regression approach over the classical regression approach, and to calculate the direct and indirect effects of variables when the data violate the OLS assumptions and the variables are linked through an elliptical copula. We perform this study using a well-organized numerical scheme, and a real data application is presented to demonstrate the superior performance of the copula approach.
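
For intuition about the elliptical-copula link the paper assumes, a Gaussian-copula sketch: correlated normals are pushed through the normal CDF to obtain dependent uniforms, which are then mapped to arbitrary (non-normal) margins. Rank dependence survives the marginal transforms even where OLS assumptions fail; the margins and correlation below are illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Gaussian copula: correlated normals -> uniforms via the normal CDF
rho = 0.7
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=5000)
u = stats.norm.cdf(z)

# Arbitrary (non-normal) margins break the OLS normality assumption
x = stats.expon.ppf(u[:, 0], scale=2.0)  # exogenous variable
y = stats.gamma.ppf(u[:, 1], a=3.0)      # endogenous variable

# Rank correlation is preserved by the copula; OLS estimates need not be
print("Spearman rho:", stats.spearmanr(x, y).correlation)
```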

Keywords: path analysis, copula-based regression models, direct and indirect effects, k-fold cross validation technique

Procedia PDF Downloads 72
24619 An Exploratory Study into the Suggestive Impact of Alaa Al-Aswany's Political Essays

Authors: Valerii Dudin

Abstract:

With the continuous increase in the quantity and importance of the information surrounding our daily lives, it has become crucial to understand what makes information stand out and affect our point of view, regardless of the accuracy of the facts involved. Alaa Al-Aswany's numerous works have been an inspiration to millions of readers in Egypt and across the Arab world. While highly factual, the author's political essays are lexically and stylistically rich, and they employ descriptive allusions and proverbs to support the opinions presented. We have undertaken to explore the impact of these political works on individual perception. In this study, we review previous research on similar subjects and, through contextual, intertextual, linguistic, and corpus analyses, establish the presence of suggestive themes in these works capable of shaping the reader's perception of a given topic, specifically targeting the reader's emotional bias. The findings present an overview of suggestive elements used in the author's works, as well as new insights into what can be considered suggestive in the context of the modern Arabic printed press.

Keywords: Alaa al-Aswany, cognitive linguistics, political essays, suggestion

Procedia PDF Downloads 157
24618 Optimizing Quantum Machine Learning with Amplitude and Phase Encoding Techniques

Authors: Om Viroje

Abstract:

Quantum machine learning represents a frontier in computational technology, promising significant advancements in data processing capabilities. This study explores the significance of data encoding techniques, specifically amplitude and phase encoding, in this emerging field. By employing a comparative analysis methodology, the research evaluates how these encoding techniques affect the accuracy, efficiency, and noise resilience of quantum algorithms. Our findings reveal that amplitude encoding enhances algorithmic accuracy and noise tolerance, whereas phase encoding significantly boosts computational efficiency. These insights are crucial for developing robust quantum frameworks that can be effectively applied in real-world scenarios. In conclusion, optimizing encoding strategies is essential for advancing quantum machine learning, potentially transforming various industries through improved data processing and analysis.
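
The two encodings compared can be written compactly. For a data vector x ∈ ℝ^{2^n}, one common convention (the paper's exact formulation is not given in the abstract) is:

```latex
\text{amplitude encoding:}\quad
|\psi_x\rangle = \frac{1}{\lVert x \rVert} \sum_{i=0}^{2^n - 1} x_i\, |i\rangle
\qquad \text{($2^n$ values stored in $n$ qubits)}

\text{phase encoding:}\quad
|\phi_x\rangle = \frac{1}{\sqrt{N}} \sum_{i=0}^{N-1} e^{\, \mathrm{i}\, x_i}\, |i\rangle
\qquad \text{(data carried in relative phases)}
```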

Keywords: quantum machine learning, data encoding, amplitude encoding, phase encoding, noise resilience

Procedia PDF Downloads 16
24617 Reversible Information Hiding in Encrypted JPEG Bitstream by LSB Based on Inherent Algorithm

Authors: Vaibhav Barve

Abstract:

Reversible information hiding has drawn a lot of interest recently. Being reversible, the original digital data can be restored completely. It is a scheme in which secret data is stored in digital media such as images, video, or audio to prevent unauthorized access and for security reasons. Generally, a JPEG bitstream is used to store this key data: the JPEG bitstream is first encrypted into a well-organized structure, and the secret information is then embedded into the encrypted region by slightly modifying the bitstream. Pixels suitable for data embedding are computed, and the key details are embedded accordingly. In our proposed framework, the RC4 algorithm is used to encrypt the JPEG bitstream; the encryption key is supplied by the user and is also used at decryption time. We implement enhanced least-significant-bit replacement steganography using a genetic algorithm. The number of bits that can be embedded in a given coefficient is adaptive, and by choosing appropriate parameters, high capacity can be obtained while ensuring high security. A logistic map is used for shuffling the bits, and a genetic algorithm (GA) is used to find the right parameters for the logistic map. A data embedding key is used at embedding time. With the correct image encryption and data embedding keys, the receiver can easily extract the embedded secure data and completely recover the original image as well as the original secret information. When the embedding key is absent, the original image can still be recovered approximately, with sufficient quality, without it.
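
The logistic-map shuffling step can be sketched as follows: the map x ← r·x·(1−x) generates a keyed permutation of positions, and the secret bits are written into the least significant bits at those positions. Here r and x0 act as the embedding key and are the parameters the GA would tune; this Python sketch simplifies the scheme to a raw byte array rather than JPEG coefficients:

```python
import numpy as np

def logistic_permutation(n: int, r: float = 3.99, x0: float = 0.7) -> np.ndarray:
    """Keyed permutation of n positions from the logistic map x <- r*x*(1-x)."""
    x, seq = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        seq.append(x)
    return np.argsort(seq)            # chaotic values -> shuffled indices

def embed_lsb(carrier: np.ndarray, bits: list, r: float, x0: float) -> np.ndarray:
    """Write secret bits into the LSBs of carrier bytes, at positions
    shuffled by the logistic map (r, x0 act as the embedding key)."""
    out = carrier.copy()
    pos = logistic_permutation(len(carrier), r, x0)[: len(bits)]
    out[pos] = (out[pos] & 0xFE) | np.array(bits, dtype=np.uint8)
    return out

carrier = np.random.randint(0, 256, 64, dtype=np.uint8)
stego = embed_lsb(carrier, [1, 0, 1, 1, 0, 0, 1, 0], r=3.99, x0=0.7)
```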

Keywords: data embedding, decryption, encryption, reversible data hiding, steganography

Procedia PDF Downloads 288
24616 Study on Measuring Method and Experiment of Arc Fault Detection Device

Authors: Yang Jian-Hong, Zhang Ren-Cheng, Huang Li

Abstract:

Arc faults are one of the main causes of electrical fires, and an Arc Fault Detection Device (AFDD) can detect them effectively. Arc fault detection methods and unhooking (tripping) standards are the keys to the practical application of AFDDs. First, a continuous arc fault generation system was developed, which could count the number of arc half waves. Then, in combination with the UL1699 standard, the ignition probability curve of cotton and the unhooking times at various current intensities were obtained by experiment. The combustion degree of an arc fault can be expressed effectively by the arc area, and experiments proved that electrical fires would be misjudged or missed if the arc half-wave number alone were used as the AFDD unhooking criterion. Finally, practical tests were carried out on the self-developed AFDD system. The results showed that the actual AFDD unhooking time is the sum of the arc half-wave counting time, the arc waveform identification time, and the mechanical unhooking operation time, with the first two taking the smaller share; the unhooking time standard therefore depends on the shortest mechanical operation time.

Keywords: arc fault detection device, arc area, arc half wave, unhooking time, arc fault

Procedia PDF Downloads 509
24615 Streamlining .NET Data Access: Leveraging JSON for Data Operations in .NET

Authors: Tyler T. Procko, Steve Collins

Abstract:

New features in .NET (6 and above) permit streamlined access to information residing in JSON-capable relational databases, such as SQL Server (2016 and above). Traditional methods of data access now comparatively involve unnecessary steps which compromise system performance. This work posits that the established ORM (Object Relational Mapping) based methods of data access in applications and APIs result in common issues, e.g., object-relational impedance mismatch. Recent developments in C# and .NET Core combined with a framework of modern SQL Server coding conventions have allowed better technical solutions to the problem. As an amelioration, this work details the language features and coding conventions which enable this streamlined approach, resulting in an open-source .NET library implementation called Codeless Data Access (CODA). Canonical approaches rely on ad-hoc mapping code to perform type conversions between the client and back-end database; with CODA, no mapping code is needed, as JSON is freely mapped to SQL and vice versa. CODA streamlines API data access by improving on three aspects of immediate concern to web developers, database engineers and cybersecurity professionals: simplicity, speed, and security. Simplicity is engendered by cutting out the "middleman" steps, effectively making API data access a white box, whereas traditional methods are a black box. Speed is improved because fewer translational steps are taken, and security is improved as attack surfaces are minimized. An empirical evaluation of the speed of the CODA approach in comparison to ORM approaches is provided and demonstrates that the CODA approach is significantly faster. CODA presents substantial benefits for API developer workflows by simplifying data access, resulting in better speed and security and allowing developers to focus on productive development rather than being mired in data access code. Future considerations include a generalization of the CODA method and extension outside of the .NET ecosystem to other programming languages.

Keywords: API data access, database, JSON, .NET core, SQL server

Procedia PDF Downloads 66
24614 Blockchain for IoT Security and Privacy in Healthcare Sector

Authors: Umair Shafique, Hafiz Usman Zia, Fiaz Majeed, Samina Naz, Javeria Ahmed, Maleeha Zainab

Abstract:

The Internet of Things (IoT) has been a hot topic for the last couple of years. This innovative technology has shown promising progress in various areas, and the world has witnessed exponential growth in multiple application domains. Researchers are working to investigate its aptitudes and harness its true potential. At the same time, however, IoT networks open up new vulnerabilities and physical threats to data integrity, privacy, and confidentiality. This is due to centralized control, a data-silo approach to handling information, and a lack of standardization in IoT networks. Blockchain is a technology for creating secure distributed ledgers to store and communicate data; its benefits include resiliency, integrity, anonymity, decentralization, and autonomous control. The potential for blockchain technology to provide the key to managing and controlling IoT has created a new wave of excitement around the idea of putting data back into the hands of end-users. In this manuscript, we propose a model that combines blockchain and IoT networks to address potential security and privacy issues in the healthcare domain. We then describe various application areas, challenges, and future directions in the healthcare sector where blockchain platforms merge with IoT networks.

Keywords: IoT, blockchain, cryptocurrency, healthcare, consensus, data

Procedia PDF Downloads 180
24613 Critical Success Factors of OCOP Business Model in Pattani Province, Thailand: A Qualitative Approach

Authors: Poonsuck Thatchaopas, Nik Kamariah Nik Mat, Nattakarn Eakuru

Abstract:

The “One College One Product” (OCOP) business model was launched by the Vocational Education Commission to encourage college students to choose at least one product for a business venture. However, the number of successful OCOP projects is still minimal. The objective of this paper is to identify the critical success factors needed to become a successful OCOP entrepreneur. The study uses a qualitative method, interviewing the business partners of an OCOP business called the Crispy Roti Krua Acheeva Brand (CRKAB), a project initiated by three female student alumnae. The findings show that the main critical success factors are self-confidence, creativity or innovativeness, knowledge, skills, and perseverance. The interviewees also reiterated that the keys to business success are product quality, perceived price, promotion, branding, new packaging to increase sales, and continuous development. The results imply that for an SME to be successful, the company should have credible partners and an effective marketing plan.

Keywords: new entrepreneurship student model, business incubator, food industry, Pattani Province, Thailand

Procedia PDF Downloads 379
24612 Vision-Based Daily Routine Recognition for Healthcare with Transfer Learning

Authors: Bruce X. B. Yu, Yan Liu, Keith C. C. Chan

Abstract:

We propose to record the Activities of Daily Living (ADLs) of elderly people using a vision-based system, so as to provide better assistive and personalization technologies. Current ADL-related research is based on data collected with the help of non-elderly subjects in laboratory environments, where the activities performed are predetermined for the sole purpose of data collection. To obtain more realistic datasets, we recorded ADLs of the elderly in a real-world environment involving real elderly subjects. Motivated by the need to collect data for more effective research related to elderly care, we collected data in the room of an elderly person: we installed Kinect, a vision-based sensor, on the ceiling to capture the activities that the elderly subject performs every morning. Based on the data, we identified 12 morning activities that the person performs daily. To recognize these activities, we created the HARELCARE framework to investigate the effectiveness of existing Human Activity Recognition (HAR) algorithms, and we propose the use of a transfer learning algorithm for HAR. We compared performance in terms of accuracy and training progress. Although the collected dataset is relatively small, the proposed algorithm has good potential to be applied to all daily routine activities for healthcare purposes, such as evidence-based diagnosis and treatment.
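
A typical transfer-learning recipe for a small dataset like this (the internals of HARELCARE are not given in the abstract) is to freeze a pretrained backbone and retrain only the classifier head for the 12 morning activities. A PyTorch sketch under those assumptions:

```python
import torch
import torch.nn as nn
from torchvision import models

# Pretrained backbone; only the new head is trained on the small ADL dataset
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False                      # freeze transferred features
model.fc = nn.Linear(model.fc.in_features, 12)   # 12 morning activities

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

def train_step(frames: torch.Tensor, labels: torch.Tensor) -> float:
    """One fine-tuning step on a batch of (Kinect-derived) frames."""
    optimizer.zero_grad()
    loss = criterion(model(frames), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```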

Keywords: daily activity recognition, healthcare, IoT sensors, transfer learning

Procedia PDF Downloads 132
24611 Influence of Pretreatment Magnetic Resonance Imaging on Local Therapy Decisions in Intermediate-Risk Prostate Cancer Patients

Authors: Christian Skowronski, Andrew Shanholtzer, Brent Yelton, Muayad Almahariq, Daniel J. Krauss

Abstract:

Prostate cancer has the third highest incidence rate and is the second leading cause of cancer death among men in the United States. Of the diagnostic tools available for intermediate-risk prostate cancer, magnetic resonance imaging (MRI) provides superior soft tissue delineation, serving as a valuable tool for both diagnosis and treatment planning. Currently, there is minimal data regarding the practical utility of MRI in the evaluation of intermediate-risk prostate cancer; accordingly, the National Comprehensive Cancer Network's guidelines list MRI as optional in intermediate-risk prostate cancer evaluation. This project aims to elucidate whether MRI affects radiation treatment decisions for intermediate-risk prostate cancer. This was a retrospective study evaluating 210 patients with intermediate-risk prostate cancer, treated with definitive radiotherapy at our institution between 2019 and 2020. NCCN risk stratification criteria were used to define intermediate-risk prostate cancer. Patients were divided into two groups: those with a pretreatment prostate MRI and those without. We compared the use of external beam radiotherapy, brachytherapy alone, brachytherapy boost, and androgen deprivation therapy between the two groups. Inverse probability of treatment weighting was used to match the two groups for age, comorbidity index, American Urologic Association symptom index, pretreatment PSA, grade group, and percent core involvement on prostate biopsy. Wilcoxon rank sum and chi-squared tests were used to compare continuous and categorical variables. Of the patients who met the study's eligibility criteria, 133 had a prostate MRI and 77 did not. Following propensity matching, there were no differences in baseline characteristics between the two groups. There were also no statistically significant differences in the treatments pursued: 42% vs 47% were treated with brachytherapy alone, 40% vs 42% with external beam radiotherapy alone, 18% vs 12% with external beam radiotherapy plus a brachytherapy boost, and 24% vs 17% received androgen deprivation therapy in the non-MRI and MRI groups, respectively. This analysis suggests that pretreatment MRI does not significantly impact radiation therapy or androgen deprivation therapy decisions in patients with intermediate-risk prostate cancer. A pretreatment prostate MRI should be used judiciously and pursued only to answer a specific question for which the answer is likely to impact the treatment decision. Further follow-up is needed to correlate MRI findings with their impact on specific oncologic outcomes.

Keywords: magnetic resonance imaging, prostate cancer, definitive radiotherapy, gleason score 7

Procedia PDF Downloads 89
24610 Design and Implementation of Security Middleware for the Data Warehouse Signature Framework

Authors: Mayada Al Meghari

Abstract:

Recently, grid middleware has integrated networked resources such as shared data and CPUs into a virtual supercomputer. In this work, we present the design and implementation of the middleware for the Data Warehouse Signature (DWS) framework. The aim of using the middleware in our DWS framework is to achieve high performance through parallel computing. The middleware is developed on the Alchemi.NET framework to increase security among the network nodes through an authentication and group-key distribution model. This model secures the keys and prevents intermediary (man-in-the-middle) attacks on the middleware. The paper presents the flow-process structures of the middleware design and ensures the security of the DWS middleware through the authentication and group-key distribution model. Finally, analysis of other middleware approaches shows that the developed DWS middleware provides the most complete coverage of the security issues.

Keywords: middleware, parallel computing, data warehouse, security, group-key, high performance

Procedia PDF Downloads 119
24609 Economic Forecasting Analysis for Solar Photovoltaic Application

Authors: Enas R. Shouman

Abstract:

Economic development and population growth are leading to a continuous increase in energy demand. At the same time, growing global concern for the environment is driving a decrease in the use of conventional energy sources and an increase in the use of renewable energy sources. The objective of this study is to present worldwide market trends for solar photovoltaic (PV) technology and to present economic methods for PV financial analysis, based on expectations for the expansion of PV into many applications. In the course of this study, detailed information about the current PV market was gathered and analyzed to find the factors influencing the penetration of PV energy. The methodology relies on five economic financial analysis methods that are often used by investment decision-makers: payback analysis, net benefit analysis, the savings-to-investment ratio, the adjusted internal rate of return, and life-cycle cost. The results of this study may be considered a marketing guide that helps the diffusion of PV energy. The study showed that PV cost is economically viable: consumers pay a higher purchase price for PV system installation but get lower electricity bills.
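
Two of the five methods named can be sketched directly: the simple payback period and a savings-to-investment ratio (SIR) with discounted savings. The figures below are placeholders, not data from the study:

```python
def payback_years(capital_cost: float, annual_savings: float) -> float:
    """Simple (undiscounted) payback period in years."""
    return capital_cost / annual_savings

def savings_to_investment_ratio(capital_cost: float, annual_savings: float,
                                years: int, discount_rate: float) -> float:
    """SIR = present value of energy-bill savings / initial investment."""
    pv_savings = sum(annual_savings / (1 + discount_rate) ** t
                     for t in range(1, years + 1))
    return pv_savings / capital_cost

# Placeholder figures for a small rooftop PV system
print(payback_years(12000, 1500))                          # 8.0 years
print(savings_to_investment_ratio(12000, 1500, 25, 0.05))  # SIR > 1 favors investing
```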

Keywords: photovoltaic, financial methods, solar energy, economics, PV panel

Procedia PDF Downloads 109
24608 Air Dispersion Modeling for Prediction of Accidental Emission in the Atmosphere along Northern Coast of Egypt

Authors: Moustafa Osman

Abstract:

Modeling of air pollutants from accidental releases is performed to quantify the impact of industrial facilities on the ambient air. Mathematical methods are required to predict accidental scenarios, including failure probabilities in fail-safe mode, and to analyze their consequences in order to quantify the environmental damage and the effects on human health. An initial statement of the mitigation plan supports implementation during production and maintenance periods. In these mathematical methods, the flow rate at which gaseous and liquid pollutants might be accidentally released is determined for various source types: point, line, and area sources. The emissions are combined with meteorological conditions through simplified stability parameters to compare dispersion coefficients for non-continuous air pollution plumes. The differences are reflected in concentration levels and in the transport of the pollutant parcel in both urban and rural areas. This research reveals that concentrations near buildings and other structures are about five times higher than over open terrain. These results agree with Sutton's suggested dispersion coefficients for the different stability classes.
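
The standard Gaussian plume model underlies dispersion estimates of this kind, with σy and σz as the stability-dependent dispersion coefficients (Sutton/Pasquill-type). A sketch of the ground-level concentration for a continuous point source, with illustrative input values:

```python
import numpy as np

def plume_concentration(Q, u, x, y, z, H, sigma_y, sigma_z):
    """Gaussian plume concentration (g/m^3) at receptor (x, y, z).
    Q: emission rate (g/s); u: wind speed (m/s); H: effective stack height (m);
    sigma_y, sigma_z: dispersion coefficients evaluated at downwind distance x
    (they grow with x according to the atmospheric stability class)."""
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
                + np.exp(-(z + H)**2 / (2 * sigma_z**2)))  # ground reflection term
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Illustrative values: 100 g/s release, 4 m/s wind, receptor 500 m downwind
print(plume_concentration(Q=100, u=4, x=500, y=0, z=0, H=30,
                          sigma_y=36.0, sigma_z=18.0))
```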

Keywords: air pollutants, dispersion modeling, GIS, health effect, urban planning

Procedia PDF Downloads 374
24607 Occupational Cumulative Effective Doses of Radiation Workers in Hamad Medical Corporation in Qatar

Authors: Omar Bobes, Abeer Al-Attar, Mohammad Hassan Kharita, Huda Al-Naemi

Abstract:

The number of radiological examinations has increased steadily in recent years. As a result, the risk of possible radiation-induced damage also increases through continuous, lifelong, and growing exposure to ionizing radiation; radiation dose monitoring in medicine has therefore become an essential element of medical practice. In this study, the occupational cumulative doses of radiation workers in Hamad Medical Corporation in Qatar were assessed over a period of five years. The number of monitored workers selected for this study was 555 (out of a total of 1250 monitored workers), all of whom worked continuously with ionizing radiation, with no interruption, over the five years from 2015 to 2019. The aim of this work is to identify the occupational groups and activities in which the highest radiation exposures occurred and their magnitude. The most exposed group was the nuclear medicine technologist staff, with an average cumulative dose of 8.4 mSv. The highest individual cumulative dose was 9.8 mSv, recorded for the PET-CT technologist category.

Keywords: cumulative dose, effective dose, monitoring, occupational exposure, dosimetry

Procedia PDF Downloads 243