Search results for: export trade data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25190

24260 Methods and Algorithms of Ensuring Data Privacy in AI-Based Healthcare Systems and Technologies

Authors: Omar Farshad Jeelani, Makaire Njie, Viktoriia M. Korzhuk

Abstract:

Recently, the application of AI-powered algorithms in healthcare has continued to flourish. In particular, access to healthcare information, including patient health history, diagnostic data, and PII (Personally Identifiable Information), is paramount in the delivery of efficient patient outcomes. However, as the exchange of healthcare information between patients and healthcare providers through AI-powered solutions increases, protecting a person's information and privacy has become even more important. The increased adoption of healthcare AI has concentrated attention on the security risks to healthcare data and the measures needed to protect it, leading to closer analysis and stricter enforcement. Since these challenges arise from the use of AI-based healthcare solutions to manage healthcare data, AI-based data protection measures are used to resolve the underlying problems. Consequently, this project proposes AI-powered safeguards and policies/laws to protect the privacy of healthcare data. The project presents the best-in-class techniques used to preserve the data privacy of AI-powered healthcare applications. Popular privacy-protecting methods such as federated learning, cryptographic techniques, differential privacy, and hybrid methods are discussed together with potential cyber threats, data security concerns, and prospects. The project also discusses some of the relevant data security acts/laws that govern the collection, storage, and processing of healthcare data to guarantee that owners' privacy is preserved. This inquiry discusses various gaps and uncertainties associated with healthcare AI data collection procedures and identifies potential correction/mitigation measures.
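Of the surveyed methods, differential privacy lends itself to a compact illustration. The sketch below adds calibrated Laplace noise to an aggregate count, the core mechanism of epsilon-differential privacy; the patient ages and the query are hypothetical, not drawn from any study data.

```python
import numpy as np

def dp_count(values, predicate, epsilon=0.5, rng=None):
    """Differentially private count: true count plus Laplace noise.

    A count query has sensitivity 1 (adding or removing one patient changes
    the count by at most 1), so the noise scale is 1 / epsilon.
    """
    rng = rng or np.random.default_rng()
    true_count = sum(1 for v in values if predicate(v))
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

# Hypothetical example: count patients over 60 without exposing any record.
ages = [34, 71, 65, 58, 80, 45, 62]
print(dp_count(ages, lambda age: age > 60, epsilon=0.5))
```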

Keywords: data privacy, artificial intelligence (AI), healthcare AI, data sharing, healthcare organizations (HCOs)

Procedia PDF Downloads 62
24259 Mapping Tunnelling Parameters for Global Optimization in Big Data via Dye Laser Simulation

Authors: Sahil Imtiyaz

Abstract:

One of the biggest challenges has emerged from the ever-expanding, dynamic, and instantaneously changing space of Big Data; finding a data point and extracting wisdom from this space is a hard task. In this paper, we reduce the space of big data to a Hamiltonian formalism in concordance with the Ising model. For this formulation, we simulate the system as a dye laser in FORTRAN and analyse the dynamics of a data point in the energy well of the rhodium atom. After mapping photon intensity and pulse width to energy and potential, we conclude that as the energy increases, the probability of tunnelling also increases up to some point, then starts decreasing, and finally shows randomizing behaviour. This is due to decoherence with the environment and the resulting loss of 'quantumness'. This behaviour characterizes the efficiency parameter and the extent of quantum evolution. The results are strongly encouraging in favour of the use of 'Topological Property' as a source of information instead of the qubit.
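As a hedged illustration of the tunnelling probability discussed above (a textbook rectangular-barrier formula in natural units, not the authors' FORTRAN dye-laser simulation; the barrier height and width are assumed values):

```python
import numpy as np

def barrier_transmission(E, V0=1.0, L=3.0):
    """Transmission probability through a rectangular barrier (m = hbar = 1).

    For E < V0: T = 1 / (1 + V0^2 sinh^2(kappa L) / (4 E (V0 - E))),
    with kappa = sqrt(2 (V0 - E)).
    """
    kappa = np.sqrt(2.0 * (V0 - E))
    return 1.0 / (1.0 + V0**2 * np.sinh(kappa * L)**2 / (4.0 * E * (V0 - E)))

# Below the barrier, the tunnelling probability rises steeply with energy.
for E in (0.2, 0.4, 0.6, 0.8, 0.95):
    print(f"E = {E:.2f}  T = {barrier_transmission(E):.3e}")
```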

Keywords: big data, optimization, quantum evolution, hamiltonian, dye laser, fermionic computations

Procedia PDF Downloads 181
24258 Studies on the Recovery of Calcium and Magnesium from Red Seawater by Nanofiltration Membrane

Authors: Mohamed H. Sorour, Hayam F. Shaalan, Heba A. Hani, Mahmoud A. El-Toukhy

Abstract:

This paper reports the results of using a polymeric nanofiltration (NF) membrane for the recovery of divalent ions (calcium and magnesium) from Red Sea water. Pilot plant experiments have been carried out using an Alfa-Laval (NF 2517/48) membrane module. The system was operated in both total recirculation mode (permeate and brine) and brine recirculation mode under a hydraulic pressure of 15 bar. The impacts of some chelating agents on both flux and rejection have also been investigated. Results indicated that pure water permeability ranges from 17 to 85.5 L/m²h at 2-15 bar. Comparison with seawater permeability under the same operating pressures reveals lower values of 8.9-31 L/m²h, manifesting the effect of the osmotic pressure of seawater. Overall total dissolved solids (TDS) reduction was almost constant without the incorporation of chelating agents. Contrary to expectations, the use of the chelating agents N-(2-hydroxyethyl) ethylene diamine-N,N´,N´-triacetic acid (HEDTA) and ethylene glycol bis (2-aminoethyl ether)-N,N,N´,N´-tetraacetic acid (EGTA) showed a flux decline of about 3-15%. Analysis of rejection data in total recirculation mode showed reasonable rejection values of 35%, 59% and 90% for Ca, Mg and SO₄, respectively. Operating in brine recirculation mode showed a decrease in rejection to 33%, 56% and 86% for Ca, Mg and SO₄, respectively. The use of chelating agents had no substantial effect on NF membrane performance except for increasing the total Ca rejection to 48% and 65% for EGTA and HEDTA, respectively. The results, in general, confirmed the powerful separation capability of NF technology for softening and recovery of divalent ions from seawater. It is anticipated that increasing the operating pressure beyond the limits of our investigations would improve the rejection and flux values; a trade-off should be considered between the operating cost of higher pressure and the marginal benefits of the expected improvement in performance. The experimental results fit well with the formulated rejection empirical correlations and the published ones.
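For reference, observed rejection follows directly from feed and permeate concentrations. The sketch below applies the definition to hypothetical concentrations chosen to reproduce the total recirculation figures above; they are not the paper's raw measurements.

```python
def observed_rejection(feed_conc, permeate_conc):
    """Observed rejection (%) of a solute: R = (1 - Cp / Cf) * 100."""
    return (1.0 - permeate_conc / feed_conc) * 100.0

# Hypothetical feed/permeate concentrations (mg/L) for illustration.
for ion, cf, cp in [("Ca", 450.0, 292.5), ("Mg", 1500.0, 615.0), ("SO4", 3100.0, 310.0)]:
    print(f"{ion}: rejection = {observed_rejection(cf, cp):.0f}%")
```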

Keywords: nanofiltration, seawater, recovery, calcium, magnesium

Procedia PDF Downloads 148
24257 Applying Different Steganography Techniques in Cloud Computing Technology to Improve Cloud Data Privacy and Security Issues

Authors: Muhammad Muhammad Suleiman

Abstract:

Cloud computing is a versatile concept that refers to a service allowing users to outsource their data without having to worry about local storage issues. However, the most pressing issue to be addressed is maintaining a secure and reliable data repository rather than relying on untrustworthy service providers. In this study, we look at how steganography approaches, in collaboration with digital watermarking, can greatly improve the system's effectiveness and data security when used in cloud computing. The main requirement of such frameworks, where data is transferred or exchanged between servers and users, is safe data management in cloud environments. Steganography in the cloud is among the most effective methods for safe communication. Steganography is a method of writing coded messages in such a way that only the sender and recipient can safely interpret and display the information hidden in the communication channel. This study presents a new text steganography method for hiding a hidden English text file inside a cover English text file to ensure data protection in cloud computing. Data protection, data hiding capability, and time were all improved using the proposed technique.
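The abstract does not detail the embedding scheme, so the sketch below is only a minimal example of text-in-text steganography: the secret's bits are appended to a cover text as invisible zero-width Unicode characters.

```python
ZW0, ZW1 = "\u200b", "\u200c"  # zero-width space encodes 0, zero-width non-joiner encodes 1

def hide(cover: str, secret: str) -> str:
    """Append the secret's bits to the cover as invisible zero-width characters."""
    bits = "".join(f"{byte:08b}" for byte in secret.encode("utf-8"))
    return cover + "".join(ZW0 if b == "0" else ZW1 for b in bits)

def reveal(stego: str) -> str:
    """Collect the zero-width characters and decode them back into bytes."""
    bits = "".join("0" if ch == ZW0 else "1" for ch in stego if ch in (ZW0, ZW1))
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

stego_text = hide("Quarterly report attached.", "key=42")
print(stego_text == "Quarterly report attached.")  # False: an invisible payload is present
print(reveal(stego_text))                          # key=42
```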

Keywords: cloud computing, steganography, information hiding, cloud storage, security

Procedia PDF Downloads 172
24256 Investigation on Performance of Change Point Algorithm in Time Series Dynamical Regimes and Effect of Data Characteristics

Authors: Farhad Asadi, Mohammad Javad Mollakazemi

Abstract:

In this paper, Bayesian online inference in models of data series is constructed with a change-point algorithm, which separates the observed time series into independent segments and studies changes in the regime of the data through their statistical characteristics. Variation in the statistical characteristics of time series data often represents distinct phenomena in a dynamical system, such as a change in brain state reflected in EEG measurements or a change in the operating regime of many other dynamical systems. A prediction algorithm for locating change points in time series data is simulated. It is verified that the distribution of the data is an important factor both for simpler and smoother fluctuation of the hazard rate parameter and for better identification of change point locations. Finally, the conditions under which the time series distribution affects these factors are explained and validated on different time series databases from several dynamical systems.
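A minimal sketch of Bayesian online change point detection in the style of Adams and MacKay, assuming Gaussian data with known observation variance and a constant hazard rate; the signal is synthetic rather than one of the paper's databases. The run-length posterior collapses toward zero when a new regime begins.

```python
import numpy as np

def bocpd(data, hazard=0.01, mu0=0.0, var0=4.0, var_x=1.0):
    """Run-length posterior for Gaussian data with known observation variance.

    Returns the most probable run length after each observation; a drop to
    zero signals a detected change point.
    """
    p_run = np.zeros(len(data) + 1); p_run[0] = 1.0  # run-length distribution
    mu = np.array([mu0]); var = np.array([var0])     # per-run-length posteriors
    map_runs = []
    for t, x in enumerate(data):
        # Predictive probability of x under each current run length.
        pred_var = var + var_x
        pred = np.exp(-0.5 * (x - mu) ** 2 / pred_var) / np.sqrt(2 * np.pi * pred_var)
        growth = p_run[: t + 1] * pred * (1 - hazard)
        cp = np.sum(p_run[: t + 1] * pred * hazard)
        p_run[1 : t + 2] = growth
        p_run[0] = cp
        p_run[: t + 2] /= p_run[: t + 2].sum()
        # Conjugate Gaussian update of the mean for every run length.
        new_var = 1.0 / (1.0 / var + 1.0 / var_x)
        new_mu = new_var * (mu / var + x / var_x)
        mu = np.concatenate(([mu0], new_mu))
        var = np.concatenate(([var0], new_var))
        map_runs.append(int(np.argmax(p_run[: t + 2])))
    return map_runs

rng = np.random.default_rng(0)
signal = np.concatenate([rng.normal(0, 1, 100), rng.normal(4, 1, 100)])
print(bocpd(signal)[95:110])  # run length collapses near the change at t = 100
```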

Keywords: time series, fluctuation in statistical characteristics, optimal learning, change-point algorithm

Procedia PDF Downloads 411
24255 Determination of the Risks of Heart Attack at the First Stage as Well as Their Control and Resource Planning with the Method of Data Mining

Authors: İbrahim Kara, Seher Arslankaya

Abstract:

Frequently preferred in engineering in particular, data mining has now begun to be used in the field of health as well, since the data in the health sector have reached great dimensions. Data mining aims to reveal models from large amounts of raw data in agreement with a given purpose and to search for the rules and relationships that enable predictions about the future from a large data set. It helps the decision-maker find the relationships among the data that emerge at the stage of decision-making. In this study, the aim is to determine the risk of heart attack at the first stage, to control it, and to plan the resources it requires with the method of data mining. Through the early and correct diagnosis of heart attacks, the study aims to reveal the factors that affect the disease, to protect health and choose the right treatment methods, to reduce health expenditures, and to shorten the duration of patients' stay in hospital. In this way, the diagnosis and treatment costs of a heart attack can be scrutinized, which will be useful for determining the risk of the disease at the first stage, controlling it, and planning its resources.
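The abstract does not name a specific mining algorithm; as a hedged sketch of the kind of model such a study might use, the example below trains a decision tree on hypothetical risk-factor features to flag elevated heart attack risk.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Hypothetical feature matrix: [age, systolic BP, cholesterol, smoker(0/1)].
rng = np.random.default_rng(1)
X = np.column_stack([
    rng.integers(30, 80, 500),
    rng.integers(100, 190, 500),
    rng.integers(150, 320, 500),
    rng.integers(0, 2, 500),
])
# Hypothetical label: elevated risk when several factors are high.
y = ((X[:, 0] > 55) & (X[:, 1] > 140) | (X[:, 2] > 280) & (X[:, 3] == 1)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```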

Keywords: data mining, decision support systems, heart attack, health sector

Procedia PDF Downloads 342
24254 Optimization of Waqf Land through Sukuk Al-Intifa’ to Build MSMEs in Indonesia

Authors: Khadijah Hasim, Achmad Fauzan Firdaus, Choirunnisa

Abstract:

Waqf land that was previously an idle asset can have a building constructed on it that serves as a means for people to conduct business. The nadzir (waqf manager) leases the waqf land it manages for an agreed rental fee, which is payable in the form of the building, not in cash. Once the building stands, the developer will lease it to interested companies. Given the magnitude of the initial funds needed, the company then issues sukuk al-intifa' on the trading floor. With this sukuk issuance, the company has sufficient capital to begin operations and to pay the rental fee owed to the developer each year. In the trade area of the completed building, micro, small, and medium enterprises (MSMEs) will be established. It is expected that sukuk al-intifa' can help make waqf land that was previously unproductive due to lack of capital highly beneficial, and help revive Indonesian MSMEs.

Keywords: Sukuk Al-Intifa, MSMEs, waqf land, underlying asset

Procedia PDF Downloads 453
24253 The Growth Role of Natural Gas Consumption for Developing Countries

Authors: Tae Young Jin, Jin Soo Kim

Abstract:

Carbon emissions have emerged as a global concern. The Intergovernmental Panel on Climate Change (IPCC) publishes reports on greenhouse gas (GHG) emissions regularly, and the United Nations Framework Convention on Climate Change (UNFCCC) has held a conference yearly since 1995. In particular, COP21, held in December 2015, produced the Paris Agreement, which, unlike earlier COP outcomes, has strong binding force. The Paris Agreement was ratified as of 4 November 2016 and is now legally binding. Participating countries set up their own Intended Nationally Determined Contributions (INDCs) and will try to achieve them. Thus, carbon emissions must be reduced. The energy sector is among those most responsible for carbon emissions, fossil fuels in particular. This paper therefore examines the relationship between natural gas consumption and economic growth. To achieve this, we adopted a Cobb-Douglas production function consisting of natural gas consumption, economic growth, capital, and labor, using a panel analysis that accounts for cross-sectional dependence. Data were preprocessed with Principal Component Analysis (PCA) to remove the cross-sectional dependency that can disturb panel results. After confirming the existence of a time-trended component in each variable, we moved to a cointegration test that considers cross-sectional dependency and structural breaks, to describe more realistically the behavior of volatile international indicators. The cointegration test indicates a long-run equilibrium relationship between the selected variables. The long-run cointegrating vector and Granger causality test results show that while natural gas consumption can contribute to economic growth in the short run, it has an adverse effect in the long run. From these results, we draw the following policy implications. First, since natural gas has a positive economic effect only in the short run, policy makers in developing countries must consider gradually switching their major energy source from natural gas to sustainable energy sources. Second, the technology transfer and financing mechanisms suggested by the COP must be accelerated. Acknowledgement: This work was supported by the Energy Efficiency & Resources Core Technology Program of the Korea Institute of Energy Technology Evaluation and Planning (KETEP) granted financial resource from the Ministry of Trade, Industry & Energy, Republic of Korea (No. 20152510101880) and by the National Research Foundation of Korea Grant funded by the Korean Government (NRF-205S1A3A2046684).
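A minimal sketch of estimating a Cobb-Douglas production function in its log-linear form by ordinary least squares on synthetic data; the paper's dependent-panel, PCA, and cointegration machinery is not reproduced, and the elasticities below are assumed for illustration.

```python
import numpy as np

# Cobb-Douglas: Y = A * K^a * L^b * G^c  ->  ln Y = ln A + a ln K + b ln L + c ln G
rng = np.random.default_rng(0)
n = 300
lnK, lnL, lnG = rng.normal(size=(3, n))
lnY = 0.5 + 0.35 * lnK + 0.45 * lnL + 0.10 * lnG + rng.normal(0, 0.05, n)

X = np.column_stack([np.ones(n), lnK, lnL, lnG])
coef, *_ = np.linalg.lstsq(X, lnY, rcond=None)
print("ln A and elasticities of capital, labor, gas:", np.round(coef, 3))
```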

Keywords: developing countries, economic growth, natural gas consumption, panel data analysis

Procedia PDF Downloads 215
24252 Bayesian Borrowing Methods for Count Data: Analysis of Incontinence Episodes in Patients with Overactive Bladder

Authors: Akalu Banbeta, Emmanuel Lesaffre, Reynaldo Martina, Joost Van Rosmalen

Abstract:

Including data from previous studies (historical data) in the analysis of a current study may reduce the sample size requirement and/or increase the power of the analysis. The most common example is incorporating historical control data in the analysis of a current clinical trial. However, this only applies when the historical control data are similar enough to the current control data. Recently, several Bayesian approaches for incorporating historical data have been proposed, such as the meta-analytic-predictive (MAP) prior and the modified power prior (MPP), both for a single control arm and for multiple historical control arms. Here, we examine the performance of the MAP and MPP approaches for the analysis of (over-dispersed) count data. To this end, we propose a computational method for the MPP approach for the Poisson and negative binomial models. We conducted an extensive simulation study to assess the performance of these Bayesian approaches, and we illustrate them on an overactive bladder data set. For similar data across the control arms, the MPP approach outperformed the MAP approach with respect to statistical power. When the means across the control arms differ, the MPP yielded a slightly inflated type I error (TIE) rate, whereas the MAP did not. In contrast, when the dispersion parameters differ, the MAP gave an inflated TIE rate, whereas the MPP did not. We conclude that the MPP approach is more promising than the MAP approach for incorporating historical count data.
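To make the power prior idea concrete for count data (a sketch of the general mechanism in a conjugate Gamma-Poisson setting, not the authors' MPP computation), the example below discounts historical Poisson counts by a factor delta.

```python
import numpy as np

def power_prior_posterior(y_hist, y_curr, delta, a0=0.5, b0=0.5):
    """Gamma posterior for a Poisson rate with historical data discounted by delta.

    Prior Gamma(a0, b0); the power prior raises the historical likelihood to
    delta in [0, 1], so historical counts enter with weight delta.
    """
    a = a0 + delta * np.sum(y_hist) + np.sum(y_curr)
    b = b0 + delta * len(y_hist) + len(y_curr)
    return a, b  # posterior mean is a / b

rng = np.random.default_rng(0)
y_hist = rng.poisson(2.0, 200)   # historical control arm (hypothetical)
y_curr = rng.poisson(2.0, 50)    # current control arm (hypothetical)
for delta in (0.0, 0.5, 1.0):
    a, b = power_prior_posterior(y_hist, y_curr, delta)
    print(f"delta = {delta:.1f}  posterior mean = {a / b:.3f}")
```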

Keywords: count data, meta-analytic prior, negative binomial, poisson

Procedia PDF Downloads 104
24251 Strategic Citizen Participation in Applied Planning Investigations: How Planners Use Etic and Emic Community Input Perspectives to Fill-in the Gaps in Their Analysis

Authors: John Gaber

Abstract:

Planners regularly use citizen input as empirical data to help them better understand community issues they know very little about. This type of community data is based on the lived experiences of local residents and is known as "emic" data. What is becoming more common practice for planners is their use of data from local experts and stakeholders (known as "etic" data or the outsider perspective) to help them fill in the gaps in their analysis of applied planning research projects. Utilizing international Health Impact Assessment (HIA) data, I look at who planners invite to their citizen input investigations. Research presented in this paper shows that planners access a wide range of emic and etic community perspectives in their search for the “community’s view.” The paper concludes with how planners can chart out a new empirical path in their execution of emic/etic citizen participation strategies in their applied planning research projects.

Keywords: citizen participation, emic data, etic data, Health Impact Assessment (HIA)

Procedia PDF Downloads 472
24250 Data Augmentation for Automatic Graphical User Interface Generation Based on Generative Adversarial Network

Authors: Xulu Yao, Moi Hoon Yap, Yanlong Zhang

Abstract:

As a branch of artificial neural networks, deep learning is widely used in image recognition, but a lack of data leads to imperfect model learning. By analysing the data scale requirements of deep learning with the application to GUI generation in mind, it is found that collecting a GUI dataset is a time-consuming and labor-intensive project that struggles to meet the needs of current deep learning networks. To solve this problem, this paper proposes a semi-supervised deep learning model that relies on an original small-scale dataset to produce a large amount of reliable data. By combining a recurrent neural network with a generative adversarial network, the recurrent network learns the sequential relationships and characteristics of the data and guides the adversarial network to generate plausible samples, thereby expanding the Rico dataset. With this network structure, the characteristics of the collected data can be well analysed, and a large amount of plausible data can be generated according to these characteristics. After data processing, a reliable dataset for model training can be formed, which alleviates the problem of dataset shortage in deep learning.
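A minimal generative adversarial training loop is sketched below in PyTorch on a toy two-dimensional distribution; the paper's recurrent GUI-sequence model and the Rico data are not reproduced, and every architecture choice here is an assumption for illustration.

```python
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))  # noise -> fake sample
D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))  # sample -> real logit
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

def real_batch(n=64):  # toy "real" data: points clustered around (2, -1)
    return torch.randn(n, 2) * 0.3 + torch.tensor([2.0, -1.0])

for step in range(2000):
    # Discriminator step: distinguish real from generated samples.
    real = real_batch()
    fake = G(torch.randn(64, 8)).detach()
    loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()
    # Generator step: produce samples the discriminator labels as real.
    fake = G(torch.randn(64, 8))
    loss_g = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

print(G(torch.randn(5, 8)).detach())  # should cluster near (2, -1) after training
```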

Keywords: GUI, deep learning, GAN, data augmentation

Procedia PDF Downloads 166
24249 Modelling Rainfall-Induced Shallow Landslides in the Northern New South Wales

Authors: S. Ravindran, Y. Liu, I. Gratchev, D. Jeng

Abstract:

Rainfall-induced shallow landslides are common in northern New South Wales (NSW), Australia. From 2009 to 2017, around 105 rainfall-induced landslides occurred along road corridors and caused temporary road closures in northern NSW. The rainfall causing shallow landslides follows different intensity distributions, varying from uniform, normal, and decreasing to increasing intensity, and its duration varied from one day to 18 days according to historical data. The objective of this research is to analyse the slope instability of some sites in northern NSW under varying cumulative rainfall using SLOPE/W and SEEP/W, and to compare the results with field data on rainfall that caused shallow landslides. Rainfall and topographical data from public authorities and soil data obtained from laboratory tests will be used for this modelling. According to field data, shallow landslides are likely when the cumulative rainfall is between 100 mm and 400 mm.
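As a hedged illustration of how rainfall-driven pore pressure destabilizes a shallow slope (a textbook infinite-slope model with assumed soil parameters, not the SLOPE/W and SEEP/W analyses used in the study):

```python
import numpy as np

def factor_of_safety(z, beta_deg, c=5.0, phi_deg=30.0, gamma=19.0, u=0.0):
    """Infinite-slope factor of safety for a planar failure surface at depth z.

    FS = (c' + (gamma z cos^2(beta) - u) tan(phi')) / (gamma z sin(beta) cos(beta))
    with c' in kPa, gamma in kN/m^3, and u the pore water pressure in kPa.
    """
    beta, phi = np.radians(beta_deg), np.radians(phi_deg)
    resisting = c + (gamma * z * np.cos(beta) ** 2 - u) * np.tan(phi)
    driving = gamma * z * np.sin(beta) * np.cos(beta)
    return resisting / driving

# Rising pore pressure (heavier cumulative rainfall) pushes FS toward failure (< 1).
for u in (0.0, 10.0, 20.0, 30.0):
    print(f"u = {u:4.1f} kPa  FS = {factor_of_safety(z=2.0, beta_deg=35, u=u):.2f}")
```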

Keywords: landslides, modelling, rainfall, suction

Procedia PDF Downloads 150
24248 Machine Learning-Enabled Classification of Climbing Using Small Data

Authors: Nicholas Milburn, Yu Liang, Dalei Wu

Abstract:

Athlete performance scoring within the climbing domain presents interesting challenges, as the sport does not have an objective way to assign skill. Assessing skill levels within any sport is valuable: it can be used to mark progress while training, and it can help an athlete choose appropriate climbs to attempt. Machine learning-based methods are popular for complex problems like this. The dataset available was composed of dynamic force data recorded during climbing; however, it came with challenges such as data scarcity, imbalance, and temporal heterogeneity. Solutions investigated for these challenges include data augmentation, temporal normalization, conversion of the time series to the spectral domain, and cross-validation strategies. The solutions investigated for the classification problem include the lightweight classifiers KNN and SVM as well as deep learning with a CNN. The best performing model achieved 80% accuracy. In conclusion, there seems to be enough information within climbing force data to accurately categorize climbers by skill.
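A hedged sketch of the spectral-domain pipeline described above: synthetic stand-ins for force recordings are converted to magnitude spectra and classified with an SVM; the real climbing data and the study's exact features are not available here.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def fake_force_trace(skilled: bool, n=256):
    """Synthetic stand-in for a climbing force recording (an assumption)."""
    t = np.linspace(0, 4, n)
    freq = 1.5 if skilled else 4.0  # assume skilled climbers move more smoothly
    return np.sin(2 * np.pi * freq * t) + rng.normal(0, 0.5, n)

traces = [fake_force_trace(skilled=i % 2 == 0) for i in range(60)]
labels = [i % 2 for i in range(60)]
# Spectral-domain features: magnitude of the real FFT of each trace.
X = np.abs(np.fft.rfft(np.array(traces), axis=1))
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print("CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```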

Keywords: classification, climbing, data imbalance, data scarcity, machine learning, time sequence

Procedia PDF Downloads 129
24247 Analysis of Expression Data Using Unsupervised Techniques

Authors: M. A. I. Perera, C. R. Wijesinghe, A. R. Weerasinghe

Abstract:

This study was conducted to review and identify the unsupervised techniques that can be employed to analyze gene expression data in order to identify better subtypes of tumors. Identifying subtypes of cancer helps improve the efficacy and reduce the toxicity of treatments by providing clues for finding targeted therapeutics. The process of gene expression data analysis is described in three steps: preprocessing, clustering, and cluster validation. Feature selection is important, since genomic data are high dimensional, with a large number of features compared to samples. Hierarchical clustering and k-means are often used in the analysis of gene expression data, and several cluster validation techniques are used to validate the resulting clusters. Heatmaps are an effective external validation method that allows the identified classes to be compared with clinical variables and analysed visually.
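A minimal sketch of the clustering step on a simulated expression matrix, using k-means and hierarchical (agglomerative) clustering with silhouette scores as a simple internal validation; the expression values and subtype structure are synthetic.

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering, KMeans
from sklearn.metrics import silhouette_score

# Simulated expression matrix: 90 samples x 200 genes, three latent subtypes.
rng = np.random.default_rng(0)
centers = rng.normal(0, 2, (3, 200))
X = np.vstack([centers[i] + rng.normal(0, 1, (30, 200)) for i in range(3)])

for k in (2, 3, 4):
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    hc = AgglomerativeClustering(n_clusters=k).fit_predict(X)
    print(f"k={k}  k-means silhouette={silhouette_score(X, km):.2f}  "
          f"hierarchical silhouette={silhouette_score(X, hc):.2f}")
```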

Keywords: cancer subtypes, gene expression data analysis, clustering, cluster validation

Procedia PDF Downloads 133
24246 Learning Analytics in a HiFlex Learning Environment

Authors: Matthew Montebello

Abstract:

Student engagement within a virtual learning environment generates masses of data points that can significantly contribute to the learning analytics that lead to decision support. Ideally, similar data are collected during student interaction with a physical learning space, and as a consequence, data are present at a large scale even in relatively small classes. In this paper, we report on such an occurrence during classes held in a HiFlex modality and investigate the advantages of adopting such a methodology. We plan to take full advantage of the learner-generated data in an attempt to further enhance the effectiveness of the adopted learning environment. This could shed crucial light on the operating modalities that higher education institutions around the world will switch to in a post-COVID era.

Keywords: HiFlex, big data in higher education, learning analytics, virtual learning environment

Procedia PDF Downloads 182
24245 Li-Fi Technology: Data Transmission through Visible Light

Authors: Shahzad Hassan, Kamran Saeed

Abstract:

People are always in search of Wi-Fi hotspots because the Internet is in major demand nowadays. But like all other technologies, there is still room for improvement in Wi-Fi technology with regard to the speed and quality of connectivity. In order to address these aspects, Harald Haas, a professor at the University of Edinburgh, proposed what we now know as Li-Fi (Light Fidelity). Li-Fi is a new technology in the field of wireless communication that provides connectivity within a network environment. It is a two-way mode of wireless communication using light. Basically, the data is transmitted through light-emitting diodes, which can vary the intensity of light very fast, even faster than the blink of an eye. From the research and experiments conducted so far, it can be said that Li-Fi can increase the speed and reliability of data transfer. This paper pays particular attention to the assessment of the performance of this technology. In other words, it is a 5G technology that uses LEDs as the medium of data transfer. For coverage within buildings, Wi-Fi is good, but Li-Fi can be considered favorable in situations where large amounts of data are to be transferred in areas with electromagnetic interference. It brings qualities such as efficiency, security, and high throughput to wireless communication. All in all, it can be said that Li-Fi is going to be a future phenomenon where the presence of light will mean access to the Internet as well as speedy data transfer.
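As a toy illustration of intensity-modulated visible-light data transfer (a simple on-off keying scheme assumed for illustration, not a modulation format taken from the paper):

```python
import numpy as np

def ook_transmit(bits, samples_per_bit=8, on_level=1.0, noise=0.15, rng=None):
    """On-off keying: LED on for 1, off for 0, with additive receiver noise."""
    rng = rng or np.random.default_rng()
    signal = np.repeat([on_level if b else 0.0 for b in bits], samples_per_bit)
    return signal + rng.normal(0, noise, signal.size)

def ook_receive(signal, samples_per_bit=8, threshold=0.5):
    """Average each bit period and threshold to recover the bits."""
    means = signal.reshape(-1, samples_per_bit).mean(axis=1)
    return (means > threshold).astype(int).tolist()

bits = [1, 0, 1, 1, 0, 0, 1, 0]
received = ook_receive(ook_transmit(bits, rng=np.random.default_rng(0)))
print(bits, received, bits == received)
```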

Keywords: communication, LED, Li-Fi, Wi-Fi

Procedia PDF Downloads 327
24244 Increase Productivity by Using Work Measurement Technique

Authors: Mohammed Al Awadh

Abstract:

In order for businesses to take advantage of the opportunities for expanded production and trade that have arisen as a result of globalization and increased levels of competition, productivity growth is required. The number of available resources is decreasing with each passing day, while demand is ever-increasing. In response, firms face growing pressure to improve the efficiency with which they utilise their resources. Work and time study techniques have been employed as a scientific method in all manufacturing and service industries to raise the efficiency with which the factors of production are used. The goal of this research is to improve the productivity of a manufacturing production system by examining ways to measure work. The work cycles were broken down into more manageable and quantifiable elements, which were noted on the observation sheet. The operation was then analysed to identify value-added and non-value-added components, and observations were recorded for each of the different trials.
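The standard work measurement arithmetic can be sketched compactly (generic time study formulas; the cycle times, performance rating, and allowance below are assumptions, not the study's recorded observations): normal time scales the mean observed time by the rating, and standard time adds an allowance.

```python
def standard_time(observed_times, rating_pct=100, allowance_pct=12):
    """Time study: normal time = mean observed * rating; standard time adds allowances."""
    mean_observed = sum(observed_times) / len(observed_times)
    normal = mean_observed * rating_pct / 100.0
    return normal * (1 + allowance_pct / 100.0)

# Hypothetical cycle times (minutes) for one work element over several trials.
cycles = [2.10, 2.25, 1.95, 2.05, 2.15]
print(f"standard time: {standard_time(cycles, rating_pct=110, allowance_pct=15):.2f} min")
```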

Keywords: time study, work measurement, work study, efficiency

Procedia PDF Downloads 61
24243 An Analysis of Humanitarian Data Management of Polish Non-Governmental Organizations in Ukraine Since February 2022 and Its Relevance for Ukrainian Humanitarian Data Ecosystem

Authors: Renata Kurpiewska-Korbut

Abstract:

On the assumption that the use and sharing of data generated in humanitarian action constitute a core function of humanitarian organizations, the paper analyzes the position of the largest Polish humanitarian non-governmental organizations in the humanitarian data ecosystem in Ukraine and their approach to non-personal and personal data management since February 2022. Expert interviews and document analysis of the non-profit organizations providing a direct response in the Ukrainian crisis context (the Polish Humanitarian Action, Caritas, the Polish Medical Mission, the Polish Red Cross, and the Polish Center for International Aid), combined with the theoretical perspective of contingency theory (whose central point is that the context, or a specific set of conditions, determines behavior and the choice of methods of action), help to examine the significance of data complexity and of an adaptive approach to data management by relief organizations in the humanitarian supply chain network. The purpose of this study is to determine how well-established and accurate internal procedures and good practices for using and sharing data (including safeguards for sensitive data) by the surveyed organizations, which have comparable human and technological capabilities, are implemented and adjusted to Ukrainian humanitarian settings and data infrastructure. The study also poses the fundamental question of whether this crisis experience will have a determining effect on their future performance. The obtained findings indicate that Polish humanitarian organizations in Ukraine, which have their own unique codes of conduct and effective managerial data practices determined by contingencies, have limited influence on improving the situational awareness of other assistance providers in the data ecosystem, despite their attempts to undertake interagency work in the area of data sharing.

Keywords: humanitarian data ecosystem, humanitarian data management, Polish NGOs, Ukraine

Procedia PDF Downloads 77
24242 An Approach for Estimation in Hierarchical Clustered Data Applicable to Rare Diseases

Authors: Daniel C. Bonzo

Abstract:

Practical considerations lead to the use of units of analysis within subjects, e.g., bleeding episodes or treatment-related adverse events, in rare disease settings. This is coupled with data augmentation techniques such as extrapolation to enlarge the subject base. In general, one can think of extrapolation of data as extending information and conclusions from one estimand to another. This approach induces hierarchical clustered data with varying cluster sizes. Extrapolation of clinical trial data is being accepted increasingly by regulatory agencies as a means of generating data in diverse situations during the drug development process. Under certain circumstances, data can be extrapolated to a different population, a different but related indication, or a different but similar product. We consider here the problem of estimation (point and interval) using a mixed-models approach under an extrapolation. It is proposed that estimators (point and interval) be constructed using weighting schemes for the clusters, e.g., equal weights and weights proportional to cluster size. Simulated data generated under varying scenarios are then used to evaluate the performance of this approach. In conclusion, the evaluation results showed that the approach is a useful means of improving statistical inference in rare disease settings and thus aids not only signal detection but risk-benefit evaluation as well.
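A minimal sketch contrasting the two cluster weighting schemes mentioned above on simulated hierarchical data with unequal cluster sizes; the data and weights are illustrative assumptions, not the paper's mixed-model estimators.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated clusters (subjects) with very different numbers of episodes each.
clusters = [rng.normal(2.0, 1.0, size=n) for n in (3, 5, 8, 40, 60)]

cluster_means = np.array([c.mean() for c in clusters])
sizes = np.array([len(c) for c in clusters])

equal_weighted = cluster_means.mean()                      # each subject counts equally
size_weighted = np.average(cluster_means, weights=sizes)   # each episode counts equally

print(f"equal weights: {equal_weighted:.3f}  size-proportional: {size_weighted:.3f}")
```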

Keywords: clustered data, estimand, extrapolation, mixed model

Procedia PDF Downloads 122
24241 Authorization of Commercial Communication Satellite Grounds for Promoting Turkish Data Relay System

Authors: Celal Dudak, Aslı Utku, Burak Yağlioğlu

Abstract:

Uninterrupted and continuous satellite communication throughout the whole orbit is becoming more indispensable every day. Data relay systems are developed and built for various high/low data rate information exchanges, like TDRSS of the USA and EDRSS of Europe. In this regard, an attempt is made to define a data relay system for Turkey that exchanges low data rate information (i.e., TTC) with Earth-observing LEO satellites by appointing commercial GEO communication satellites all over the world. First, a justification of this attempt is given, demonstrating the enhancement of link duration. The preference for RF communication over laser communication is also discussed. Then, the preferred communication GEOs, including TURKSAT4A, which already belongs to Turkey, are given, together with the coverage enhancements shown by STK simulations and the corresponding link budget. Finally, a block diagram of the communication system on board the LEO satellite is given.
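As a hedged sketch of the kind of link budget mentioned above (generic free-space path loss arithmetic; the transmit power, antenna gains, slant range, and frequency are all assumed values, not the paper's figures):

```python
import math

def fspl_db(distance_km, freq_ghz):
    """Free-space path loss in dB: 20 log10(d_km) + 20 log10(f_GHz) + 92.45."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

def received_power_dbm(p_tx_dbm, g_tx_dbi, g_rx_dbi, distance_km, freq_ghz):
    return p_tx_dbm + g_tx_dbi + g_rx_dbi - fspl_db(distance_km, freq_ghz)

# Assumed values: 10 W transmitter, modest LEO antenna, large GEO antenna,
# worst-case LEO-to-GEO slant range of roughly 45,000 km at Ku band.
p_rx = received_power_dbm(p_tx_dbm=40, g_tx_dbi=6, g_rx_dbi=35,
                          distance_km=45_000, freq_ghz=14.0)
print(f"received power: {p_rx:.1f} dBm")
```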

Keywords: communication, GEO satellite, data relay system, coverage

Procedia PDF Downloads 420
24240 The Development of Encrypted Near Field Communication Data Exchange Format Transmission in an NFC Passive Tag for Checking the Genuine Product

Authors: Tanawat Hongthai, Dusit Thanapatay

Abstract:

This paper presents the development of encrypted near field communication (NFC) data exchange format transmission in an NFC passive tag to assess the feasibility of implementing genuine product authentication. We organize the research on encryption and genuine product checking into four major categories: concept, infrastructure, development, and applications. The results show that a passive NFC Forum Type 2 tag can be configured to be compatible with the NFC Data Exchange Format (NDEF) and can have its data automatically and partially updated when an NFC field is present.
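The abstract does not detail the encryption scheme, so the following is only an assumed sketch: an NDEF-style text payload is sealed with AES-GCM (via the Python `cryptography` package) before it would be written to the tag, binding the record type as associated data.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)   # shared between issuer and verifier
aesgcm = AESGCM(key)

def encrypt_payload(plaintext: bytes, record_type: bytes = b"T") -> bytes:
    """Encrypt an NDEF-style payload; the record type is bound as associated data."""
    nonce = os.urandom(12)
    return nonce + aesgcm.encrypt(nonce, plaintext, record_type)

def decrypt_payload(blob: bytes, record_type: bytes = b"T") -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return aesgcm.decrypt(nonce, ciphertext, record_type)

token = encrypt_payload(b"serial=KX-001934;batch=7A")
print(decrypt_payload(token))  # authenticates and reveals the product data
```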

Keywords: near field communication, NFC data exchange format, checking the genuine product, encrypted NFC

Procedia PDF Downloads 263
24239 Data Hiding by Vector Quantization in Color Image

Authors: Yung Gi Wu

Abstract:

With the growth of computers and networks, digital data can spread anywhere in the world quickly. In addition, digital data can be copied or tampered with easily, so security has become an important topic in the protection of digital data. A digital watermark is a method of protecting the ownership of digital data, and embedding the watermark inevitably influences image quality. In this paper, Vector Quantization (VQ) is used to embed the watermark into the image to fulfill the goal of data hiding. This kind of watermarking is invisible, meaning users will not notice the existence of the embedded watermark even though the embedded image differs slightly from the original. Meanwhile, VQ carries a heavy computational burden, so we adopt a fast VQ encoding scheme based on partial distortion search (PDS) and a mean approximation scheme to speed up the data hiding process. The watermarks hidden in the image can be gray-level, bi-level, or color images; text can also be embedded as a watermark. To test the robustness of the system, we use Photoshop to apply sharpening, cropping, and alteration and check whether the extracted watermark is still recognizable. Experimental results demonstrate that the proposed system can resist the above three kinds of tampering in general cases.
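A minimal sketch of VQ-based hiding under an assumed embedding rule (one watermark bit per image block, enforced by restricting the codeword search to indices of matching parity; the paper's PDS and mean-approximation speed-ups are omitted):

```python
import numpy as np

rng = np.random.default_rng(0)
codebook = rng.integers(0, 256, size=(64, 16)).astype(float)  # 64 codewords for 4x4 blocks

def embed_bit(block, bit):
    """Quantize a block using only codewords whose index parity equals the bit."""
    indices = np.arange(len(codebook))
    candidates = indices[indices % 2 == bit]
    dists = ((codebook[candidates] - block) ** 2).sum(axis=1)
    return candidates[int(np.argmin(dists))]

def extract_bit(index):
    return index % 2

blocks = rng.integers(0, 256, size=(8, 16)).astype(float)  # eight 4x4 blocks, flattened
watermark = [1, 0, 1, 1, 0, 0, 1, 0]
indices = [embed_bit(b, w) for b, w in zip(blocks, watermark)]
print([extract_bit(i) for i in indices] == watermark)  # True: bits survive quantization
```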

Keywords: data hiding, vector quantization, watermark, color image

Procedia PDF Downloads 345
24238 Mechanical Properties of Kenaf Fibre Reinforced Epoxy Composites

Authors: C. Tezara, H. Y. Lim, M. H. Yazdi, J. W. Lim, J. P. Siregar

Abstract:

Natural fibres have become an element of everyday human life, and many researchers have studied natural fibre reinforced polymers. The Malaysian government has invested heavily in research funding for researchers and academics, especially research on kenaf fibre, owing to the exclusion of tobacco from the AFTA (ASEAN Free Trade Area) list. This work investigates the mechanical properties of kenaf fibre reinforced epoxy composites in which short kenaf fibre at 5%, 10%, and 15% wt. was added to the epoxy resin mixture. The hand lay-up process was selected to fabricate the test specimens. Tensile, flexural, and impact tests were conducted following ASTM D3039, ASTM D790, and ASTM D256, respectively. From the experimental results, the effect of fibre loading on the mechanical properties of the specimens is analysed and compared in the results and discussion.

Keywords: kenaf fibre, epoxy, composite, fibre

Procedia PDF Downloads 268
24237 Impact of Weather Conditions on Generalized Frequency Division Multiplexing over Gamma Gamma Channel

Authors: Muhammad Sameer Ahmed, Piotr Remlein, Tansal Gucluoglu

Abstract:

Generalized frequency division multiplexing (GFDM), used over the free space optical channel, can be a good option for implementing free space optical communication systems. This technique has several strengths, e.g., good spectral efficiency, low peak-to-average power ratio (PAPR), adaptability, and low co-channel interference. In this paper, the impact of weather conditions such as haze, rain, and fog on GFDM over the gamma-gamma channel model is discussed. A trade-off between link distance and system performance under intense weather conditions is also analysed. The symbol error probability (SEP) of GFDM over the gamma-gamma turbulence channel is derived and verified with computer simulations.
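As a hedged illustration of the channel model (the standard gamma-gamma construction as a product of two unit-mean Gamma variates, with assumed turbulence parameters; the GFDM SEP derivation itself is not reproduced):

```python
import numpy as np

def gamma_gamma_samples(alpha, beta, n, rng=None):
    """Gamma-gamma irradiance: product of two independent unit-mean Gamma variates.

    alpha and beta are the large- and small-scale scintillation parameters;
    smaller values correspond to stronger turbulence (e.g., fog or haze).
    """
    rng = rng or np.random.default_rng()
    x = rng.gamma(shape=alpha, scale=1.0 / alpha, size=n)
    y = rng.gamma(shape=beta, scale=1.0 / beta, size=n)
    return x * y

rng = np.random.default_rng(0)
for label, (a, b) in {"weak": (11.6, 10.1), "strong": (4.2, 1.4)}.items():
    h = gamma_gamma_samples(a, b, 100_000, rng)
    print(f"{label} turbulence: mean={h.mean():.3f}  scintillation index={h.var():.3f}")
```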

Keywords: free space optics, generalized frequency division multiplexing, weather conditions, gamma gamma distribution

Procedia PDF Downloads 156
24236 Anomaly Detection in a Data Center with a Reconstruction Method Using a Multi-Autoencoders Model

Authors: Victor Breux, Jérôme Boutet, Alain Goret, Viviane Cattin

Abstract:

Early detection of anomalies in data centers is important to reduce downtime and the costs of periodic maintenance. However, there is little research on this topic and even less on the fusion of sensor data for the detection of abnormal events. The goal of this paper is to propose a method for anomaly detection in data centers that combines sensor data (temperature, humidity, power) with deep learning models. The model described in the paper uses one autoencoder per sensor to reconstruct the inputs. The autoencoders contain Long Short-Term Memory (LSTM) layers and are trained using the normal samples of the relevant sensors selected by correlation analysis. The difference signal between the input and its reconstruction is then used to classify the samples using feature extraction and a random forest classifier. The data measured by the sensors of a data center between January 2019 and May 2020 are used to train the model, while the data between June 2020 and May 2021 are used to assess it. The performance of the model is assessed a posteriori through the F1-score by comparing detected anomalies with the data center's history. The proposed model outperforms the state-of-the-art reconstruction method, which uses a single autoencoder taking multivariate sequences and detects an anomaly with a threshold on the reconstruction error, with an F1-score of 83.60% compared to 24.16%.
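A condensed sketch of the per-sensor reconstruction idea, assuming Keras and synthetic sensor windows; the multi-autoencoder fusion, correlation-based sensor selection, and random forest stage are omitted.

```python
import numpy as np
from tensorflow import keras

timesteps, features = 32, 1
model = keras.Sequential([
    keras.Input(shape=(timesteps, features)),
    keras.layers.LSTM(16),                          # encode the window
    keras.layers.RepeatVector(timesteps),           # repeat the latent code per step
    keras.layers.LSTM(16, return_sequences=True),   # decode
    keras.layers.TimeDistributed(keras.layers.Dense(features)),
])
model.compile(optimizer="adam", loss="mse")

# Train on normal behaviour only (synthetic temperature-like windows).
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, timesteps)
normal = np.stack([np.sin(t + p) for p in rng.uniform(0, 2 * np.pi, 512)])[..., None]
model.fit(normal, normal, epochs=10, batch_size=32, verbose=0)

# An abnormal window reconstructs poorly; its error feeds the classifier stage.
anomaly = (np.sin(t) + np.where(t > 4, 2.0, 0.0))[None, :, None]
err = lambda w: float(np.mean((model.predict(w, verbose=0) - w) ** 2))
print(f"normal error={err(normal[:1]):.4f}  anomaly error={err(anomaly):.4f}")
```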

Keywords: anomaly detection, autoencoder, data centers, deep learning

Procedia PDF Downloads 175
24235 Iron Response Element-mRNA Binding to Iron Response Protein: Metal Ion Sensing

Authors: Mateen A. Khan, Elizabeth J. Theil, Dixie J. Goss

Abstract:

Cellular iron homeostasis is accomplished by the coordinated, regulated expression of iron uptake, storage, and export. Iron regulates the translation of ferritin and mitochondrial aconitase iron responsive element (IRE) mRNAs through their interaction with an iron regulatory protein (IRP); iron increases the biosynthesis of the proteins encoded by IRE-mRNAs. The noncoding IRE-mRNA structure, approximately 30 nt, folds into a stem loop to control the synthesis of proteins involved in iron trafficking, cell cycling, and nervous system function. Fluorescence anisotropy measurements showed the presence of one binding site on IRP1 for ferritin and mitochondrial aconitase IRE-mRNA. Scatchard analysis revealed that the binding affinities (Kₐ) for ferritin and mitochondrial aconitase IRE-mRNA were 68.7 x 10⁶ M⁻¹ and 9.2 x 10⁶ M⁻¹, respectively. In order to understand the relative importance of equilibrium and stability, we further report the contribution of electrostatic interactions to the overall binding of the two IRE-mRNAs to IRP1. The fluorescence quenching of the IRP1 protein was measured at different ionic strengths. The binding affinity of IRE-mRNA to IRP1 decreases with increasing ionic strength, but the number of binding sites is independent of ionic strength. Such results indicate a differential contribution of electrostatics to the interaction of IRE-mRNA with IRP1, possibly related to helix bending or stem interactions and an overall conformational change. The selective destabilization of ferritin and mitochondrial aconitase RNA/protein complexes reported here explains in part the quantitative differences in signal response to iron in vivo and indicates possible new regulatory interactions.

Keywords: IRE-mRNA, IRP1, binding, ionic strength

Procedia PDF Downloads 114
24234 Quality Approaches for Mass-Produced Fashion: A Study in Malaysian Garment Manufacturing

Authors: N. J. M. Yusof, T. Sabir, J. McLoughlin

Abstract:

The garment manufacturing industry involves sequential processes that are subject to uncontrollable variations. The industry depends on the skill of labour in handling a variety of fabrics and accessories, machines, and complicated sewing operations. For these reasons, garment manufacturers have created systems to monitor and control product quality regularly, deploying quality approaches to minimize variation. The aims of this research were to ascertain the quality approaches deployed by Malaysian garment manufacturers in three key areas: quality systems and tools; quality control and types of inspection; and sampling procedures chosen for garment inspection. The research also aimed to distinguish the quality approaches used by companies that supply finished garments to domestic and to international markets. Feedback from each company's representatives was obtained using an online survey, which comprised five sections and 44 questions on the organizational profile and the quality approaches used in the garment industry. The results revealed that almost all companies had established their own mechanism of process control by conducting a series of quality inspections of daily production, whether formally set up or not. Quality inspection was the predominant quality control activity in garment manufacturing, and the level of complexity of these activities was substantially dictated by the customers. AQL-based sampling was utilized by companies dealing with the export market, whilst almost all companies that concentrated only on the domestic market were comfortable using their own sampling procedures for garment inspection. This research provides an insight into the implementation of quality approaches that were perceived as important and useful in the garment manufacturing sector, which is truly labour-intensive.
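To make AQL-based sampling concrete (standard single-sampling arithmetic; the sample size and acceptance number below form an illustrative plan, not figures from the survey or the AQL tables), the sketch below computes the probability of accepting a lot as a function of its true defect rate.

```python
from scipy.stats import binom

def acceptance_probability(defect_rate, sample_size, acceptance_number):
    """Single sampling plan: accept the lot if defects found <= acceptance number."""
    return binom.cdf(acceptance_number, sample_size, defect_rate)

# Illustrative plan: inspect 125 garments, accept the lot on 5 or fewer defects.
for p in (0.01, 0.025, 0.05, 0.10):
    print(f"defect rate {p:>5.1%}: P(accept) = {acceptance_probability(p, 125, 5):.2f}")
```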

Keywords: garment manufacturing, quality approaches, quality control, inspection, Acceptance Quality Limit (AQL), sampling

Procedia PDF Downloads 423
24233 The Role of Information and Communication Technology in Achieving Competitive Advantage

Authors: Malki Fatima Zahra Nadia, Kellal Chaimaa, Brahimi Houria

Abstract:

The world has undergone dramatic transformations as a result of the liberalization of the economy, which has intensified competition between economic institutions under the slogan "survival of the fittest". In line with these changes, it is imperative for organizations to adopt the philosophy of applying ICT in the era of globalization in order to survive and be sustained in local and international markets. Algerian economic institutions are equally affected by what international institutions have witnessed, especially after Algeria adopted a policy of trade openness. This was the starting point of the study, which aims at identifying the role of ICT in achieving competitive advantage in economic institutions through an analytical study of Mobilis Telecom in Algeria, with the results analysed in SPSS edition 24. To sum up, the study concludes that ICT has effectively contributed to the achievement of competitive advantage and that the value of organizations today lies in the extent to which they use ICTs, which bring speed, efficiency, and quality to their operations, especially under intense competition.

Keywords: ICT, company, competitive advantage, competitive strategy

Procedia PDF Downloads 40
24232 The Role of Non-Governmental Organizations in Combating Human Trafficking in South India: An Overview

Authors: Kumudini Achchi

Abstract:

India, known for its rich cultural values, has given a special place to women, who have nevertheless also been victims of humiliation, torture, and exploitation. The major share of human trafficking is sex trafficking, which is recognised as the world's second largest social evil. The original form of sex trafficking in India is prostitution, with and without religious sanction. Today the situation of such women has become a human rights issue, with their rights severely denied. This situation demands intervention to protect them from exploitation. NGOs are proactive initiatives that offer support to women exploited in the sex trade. To understand the intervention programs of NGOs in South India, a study was conducted covering four states and a union territory, considering 32 NGOs based on their preparedness to participate in the research study. A descriptive and diagnostic research design was adopted, with an interview schedule as the data collection tool. The study reveals that these NGOs believe in the possibility of mainstreaming commercially sexually exploited women and were found to have adopted seven different program areas in the process: rescue, rehabilitation, reintegration, prevention, development, advocacy, and research. Each area involves different programs to reach exploited women and prepare them for mainstream society, as discussed in the paper. Implementing these programs is not easy for the organizations; they face social, legal, financial, and political hardships that hinder successful operation. Rescue, advocacy, and research are the areas least adopted by the NGOs because of a lack of support as well as knowledge in these areas, while rehabilitation stands as the most adopted area in implementation. The paper further deals with the challenges in implementing the programs as well as remedial measures from a social work point of view against the Indian cultural background.

Keywords: NGOs, commercially sexually exploited women, programmes, South India

Procedia PDF Downloads 239
24231 Integration Process and Analytic Interface of Different Environmental Open Data Sets with Java/Oracle and R

Authors: Pavel H. Llamocca, Victoria Lopez

Abstract:

The main objective of our work is the comparative analysis of environmental data from open data bases belonging to different governments, which requires integrating data from various sources. Nowadays, many governments intend to publish thousands of data sets for people and organizations to use, and the number of applications based on open data is increasing accordingly. However, each government has its own procedures for publishing its data, which causes a variety of data set formats, because there are no international standards specifying the formats of data sets in open data bases. Due to this variety, we must build a data integration process that is able to put together all kinds of formats. Some software tools have been developed to support the integration process, e.g., Data Tamer and Data Wrangler. The problem with these tools is that they require a data scientist's interaction as a final step of the integration process. In our case, we do not want to depend on a data scientist, because environmental data are usually similar and these processes can be automated by programming. The main idea of our tool is to build Hadoop procedures adapted to each government's data sources in order to achieve an automated integration. Our work focuses on environmental data such as temperature, energy consumption, air quality, solar radiation, wind speed, etc. For the past two years, the government of Madrid has been publishing its open data bases of environmental indicators in real time. In the same way, other governments (such as Andalucia or Bilbao) have published open data sets relative to the environment. All of those data sets have different formats, and our solution is able to integrate all of them; furthermore, it allows the user to run and visualize analyses over the real-time data. Once the integration task is done, all the data from any government have the same format, and the analysis process can be initiated in a computationally better way. The tool presented in this work thus has two goals: 1. the integration process; and 2. a graphic and analytic interface. As a first approach, the integration process was developed using Java and Oracle, and the graphic and analytic interface with Java (JSP). However, in order to open up our software tool, as a second approach we also developed an implementation in the R language as a mature open source technology. R is a really powerful open source programming language that allows us to process and analyze a huge amount of data with high performance. There are also R libraries, such as shiny, for building a graphic interface. A performance comparison between both implementations was made, and no significant differences were found. In addition, our work provides any developer with an official real-time integrated data set of environmental data in Spain so that they can build their own applications.
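A minimal sketch of the format-unification step, assuming hypothetical per-government readers (the column names, feeds, and sample rows below are invented) that map heterogeneous files onto one shared schema; neither the real Madrid feeds nor the Hadoop/R pipeline is reproduced.

```python
import io
import json
import pandas as pd

SCHEMA = ["city", "timestamp", "indicator", "value"]

def read_madrid_csv(text: str) -> pd.DataFrame:
    """Hypothetical semicolon-separated feed with Spanish column names."""
    df = pd.read_csv(io.StringIO(text), sep=";")
    df = df.rename(columns={"fecha": "timestamp", "magnitud": "indicator", "valor": "value"})
    df["city"] = "Madrid"
    return df[SCHEMA]

def read_bilbao_json(text: str) -> pd.DataFrame:
    """Hypothetical JSON feed with a nested records list."""
    records = json.loads(text)["records"]
    df = pd.DataFrame(records).rename(columns={"ts": "timestamp", "name": "indicator", "v": "value"})
    df["city"] = "Bilbao"
    return df[SCHEMA]

madrid = read_madrid_csv("fecha;magnitud;valor\n2016-05-01T10:00;NO2;41.0")
bilbao = read_bilbao_json('{"records": [{"ts": "2016-05-01T10:00", "name": "NO2", "v": 35.5}]}')
unified = pd.concat([madrid, bilbao], ignore_index=True)
print(unified)
```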

Keywords: open data, R language, data integration, environmental data

Procedia PDF Downloads 296