Search results for: digital platforms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3526

1546 Fake News Detection Based on Fusion of Domain Knowledge and Expert Knowledge

Authors: Yulan Wu

Abstract:

The spread of fake news on social media has caused significant harm to the public and the nation, with threats spanning various domains, including politics, economics, and health. News on social media often covers multiple domains, and the models studied by researchers and relevant organizations typically perform well on datasets from a single domain; when these methods are applied to social platforms whose news spans multiple domains, their performance deteriorates significantly. Existing research has attempted to improve detection performance on multi-domain datasets by adding single-domain labels to the data. However, these methods overlook the fact that a news article typically belongs to multiple domains, leading to the loss of domain knowledge contained within the news text. Motivated by the observation that news in different domains tends to use different vocabulary, this paper proposes a fake news detection framework that combines domain knowledge and expert knowledge. First, an unsupervised domain discovery module generates a low-dimensional vector for each news article, representing a domain embedding that retains the multi-domain knowledge of the news content. Then, a feature extraction module uses these domain embeddings to guide multiple experts in extracting news knowledge for the overall feature representation. Finally, a classifier determines whether the news is fake. Experiments show that this approach can improve multi-domain fake news detection performance while reducing the cost of manually labeling domains.
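
As a rough illustration of the kind of architecture described, a minimal PyTorch sketch is given below; the dimensions, number of experts and gating scheme are assumptions for illustration only, not the authors' implementation.

```python
import torch
import torch.nn as nn

class DomainGuidedMoE(nn.Module):
    """Sketch: a domain embedding gates a small mixture of expert encoders over a
    text representation, followed by a binary fake/real classifier."""
    def __init__(self, text_dim=768, domain_dim=16, n_experts=4, hidden=128):
        super().__init__()
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(text_dim, hidden), nn.ReLU()) for _ in range(n_experts)]
        )
        self.gate = nn.Linear(domain_dim, n_experts)   # domain embedding -> expert weights
        self.classifier = nn.Linear(hidden, 2)

    def forward(self, text_feat, domain_emb):
        weights = torch.softmax(self.gate(domain_emb), dim=-1)                 # (B, E)
        expert_out = torch.stack([e(text_feat) for e in self.experts], dim=1)  # (B, E, H)
        fused = (weights.unsqueeze(-1) * expert_out).sum(dim=1)                # (B, H)
        return self.classifier(fused)                                          # fake/real logits

# usage with random tensors standing in for text features and unsupervised domain embeddings
model = DomainGuidedMoE()
logits = model(torch.randn(8, 768), torch.randn(8, 16))
```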

Keywords: fake news, deep learning, natural language processing, multiple domains

Procedia PDF Downloads 75
1545 Truthful or Untruthful Social Media Posts: Applying Statement Analysis to Decode Online Deception

Authors: Christa L. Arnold, Margaret C. Stewart

Abstract:

This research shares the results of an exploratory study examining Statement Analysis (SA) to detect deception in truthful and untruthful social media posts. Applying SA, a law-enforcement methodology used in criminal interview statements, this research analyzes what is stated in order to evaluate written deceptive information. Preliminary findings reveal qualitative and quantitative nuances for SA in online deception detection and uncover insights regarding digital deceptive behavior. Thus far, truthful statements appear to differ from untruthful statements in both content and quality.

Keywords: deception detection, online deception, social media content, statement analysis

Procedia PDF Downloads 67
1544 Sampled-Data Control for Fuel Cell Systems

Authors: H. Y. Jung, Ju H. Park, S. M. Lee

Abstract:

A sampled-data controller is presented for solid oxide fuel cell systems, which are expressed by a sector-bounded nonlinear model. Sector-bounded nonlinear systems consist of a linear dynamical system in feedback connection with a nonlinearity satisfying certain sector-type constraints. The sampled-data control scheme is also very useful because it can be implemented directly on digital controllers, and research effort devoted to sampled-data control systems has grown with the development of modern high-speed computers. The proposed control law is obtained by solving a convex problem subject to several linear matrix inequalities (LMIs). Simulation results are given to show the effectiveness of the proposed design method.
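
The paper's controller synthesis solves LMIs with a semidefinite programming solver; the short sketch below only illustrates the sampled-data viewpoint on a toy linear plant, discretising it and certifying closed-loop stability with a discrete Lyapunov equation. The plant matrices, sampling period and gain K are assumed placeholders, not the solid oxide fuel cell model.

```python
import numpy as np
from scipy.signal import cont2discrete
from scipy.linalg import solve_discrete_lyapunov

A = np.array([[0.0, 1.0], [-2.0, -0.5]])   # toy continuous-time plant (assumed)
B = np.array([[0.0], [1.0]])
C, D = np.eye(2), np.zeros((2, 1))
h = 0.1                                     # sampling period

Ad, Bd, _, _, _ = cont2discrete((A, B, C, D), h, method="zoh")
K = np.array([[1.0, 1.5]])                  # assumed gain; in the paper it comes from LMIs
Acl = Ad - Bd @ K                           # closed loop under u = -K x, held between samples

# Solve Acl^T P Acl - P = -I; a positive definite P certifies discrete-time stability.
P = solve_discrete_lyapunov(Acl.T, np.eye(2))
print("spectral radius:", max(abs(np.linalg.eigvals(Acl))))
print("P positive definite:", bool(np.all(np.linalg.eigvals(P) > 0)))
```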

Keywords: sampled-data control, fuel cell, linear matrix inequalities, nonlinear control

Procedia PDF Downloads 567
1543 Author Profiling: Prediction of Learners’ Gender on a MOOC Platform Based on Learners’ Comments

Authors: Tahani Aljohani, Jialin Yu, Alexandra I. Cristea

Abstract:

The more an educational system knows about a learner, the more personalised interaction it can provide, which leads to better learning. However, asking a learner directly is potentially disruptive and often ignored by learners. Especially on booming Massive Open Online Course (MOOC) platforms, only a very low percentage of users disclose demographic information about themselves. Thus, in this paper, we aim to predict learners' demographic characteristics by proposing an approach that uses linguistically motivated deep learning architectures for learner profiling, particularly targeting gender prediction on a FutureLearn MOOC platform. We also tackle the difficult problem of predicting the gender of learners based on their comments only, which are often available across MOOCs. The most common current approaches to text classification use the Long Short-Term Memory (LSTM) model, treating sentences as sequences. However, human language also has structure. In this research, rather than considering sentences as plain sequences, we hypothesise that higher-level semantic and syntactic sentence processing based on linguistics will render a richer representation. We thus evaluate the traditional LSTM against cutting-edge models that take syntactic structure into account, such as the tree-structured LSTM, the Stack-augmented Parser-Interpreter Neural Network (SPINN) and the Structure-Aware Tag Augmented model (SATA). Additionally, we explore different word-level encoding functions. We implemented these methods on our MOOC dataset, on which they are most performant, and on a public sentiment analysis dataset that is further used to cross-examine the models' results.
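
For reference, a minimal baseline of the sequential LSTM classifier mentioned above might look as follows (PyTorch, with assumed vocabulary size and dimensions); the tree-structured LSTM, SPINN and SATA models compared in the paper are not reproduced here.

```python
import torch
import torch.nn as nn

class CommentGenderLSTM(nn.Module):
    """Baseline sequential LSTM classifier over padded comment token ids."""
    def __init__(self, vocab_size=20000, embed_dim=100, hidden=128, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_classes)

    def forward(self, token_ids):
        x = self.embed(token_ids)          # (B, T, E)
        _, (h_n, _) = self.lstm(x)         # last hidden state summarises the comment
        return self.out(h_n[-1])           # gender logits

model = CommentGenderLSTM()
logits = model(torch.randint(1, 20000, (4, 30)))   # 4 padded comments of 30 tokens each
```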

Keywords: deep learning, data mining, gender prediction, MOOCs

Procedia PDF Downloads 150
1542 The Impact of Social Media on Urban E-planning: A Review of the Literature

Authors: Farnoosh Faal

Abstract:

The rapid growth of social media has brought significant changes to the field of urban e-planning. This study aims to review the existing literature on the impact of social media on urban e-planning processes. The study begins with a discussion of the evolution of social media and its role in urban e-planning. The review covers research on the use of social media for public engagement, citizen participation, stakeholder communication, decision-making, and monitoring and evaluation of urban e-planning initiatives. The findings suggest that social media has the potential to enhance public participation and improve decision-making in urban e-planning processes. Social media platforms such as Facebook, Twitter, and Instagram can give citizens a channel to engage with planners and policymakers, express their opinions, and provide feedback on planning proposals. Social media can also facilitate the collection and analysis of data, including real-time data, to inform urban e-planning decision-making. However, the literature also highlights some challenges associated with the use of social media in urban e-planning. These challenges include issues related to the representativeness of social media users, the quality of information obtained from social media, the potential for bias and manipulation of social media content, and the need for effective data management and analysis. The study concludes with recommendations for future research on the use of social media in urban e-planning. The recommendations include the need for further research on the impact of social media on equity and social justice in planning processes, the need for more research on effective strategies for engaging underrepresented groups, and the development of guidelines for the use of social media in urban e-planning processes. Overall, the study suggests that social media has the potential to transform urban e-planning processes but that careful consideration of the opportunities and challenges associated with its use is essential for effective and ethical planning practice.

Keywords: social media, urban e-planning, public participation, citizen engagement

Procedia PDF Downloads 238
1541 Assessing the Utility of Unmanned Aerial Vehicle-Borne Hyperspectral Image and Photogrammetry Derived 3D Data for Wetland Species Distribution Quick Mapping

Authors: Qiaosi Li, Frankie Kwan Kit Wong, Tung Fung

Abstract:

A lightweight unmanned aerial vehicle (UAV) loaded with novel sensors offers a low-cost approach to data acquisition in complex environments. This study established a framework for applying a UAV system to the quick mapping of complex environments and assessed the performance of UAV-based hyperspectral imagery and a digital surface model (DSM) derived from photogrammetric point clouds for classifying 13 species in a wetland area of the Mai Po Inner Deep Bay Ramsar Site, Hong Kong. The study area was part of a shallow bay with flat terrain, and the major species included reedbed and four mangroves: Kandelia obovata, Aegiceras corniculatum, Acrostichum aureum and Acanthus ilicifolius. Other species included various graminaceous plants, arbors, shrubs and the invasive species Mikania micrantha. In particular, the invasive species climbed up to the mangrove canopy, causing damage and morphological change that may increase the difficulty of distinguishing species. Hyperspectral images were acquired by a Headwall Nano sensor with a spectral range from 400 nm to 1000 nm at 0.06 m spatial resolution. A sequence of multi-view RGB images was captured with 0.02 m spatial resolution and 75% overlap. The hyperspectral imagery was corrected for radiometric and geometric distortion, while the high-resolution RGB images were matched to generate maximally dense point clouds. Further, a 5 cm grid digital surface model (DSM) was derived from the dense point clouds. Multiple feature reduction methods were compared to identify the most efficient method and to explore the spectral bands most significant for distinguishing different species. The examined methods included stepwise discriminant analysis (DA), support vector machine (SVM) and minimum noise fraction (MNF) transformation. Subsequently, spectral subsets composed of the 20 most important bands extracted by SVM, DA and MNF, and multi-source subsets adding the DSM to these 20 spectral bands, served as input to a maximum likelihood classifier (MLC) and an SVM classifier to compare classification results. The results showed that the feature reduction methods, ranked from best to worst, were MNF transformation, DA and SVM; the MNF transformation accuracy was even higher than that obtained using all bands as input. The selected bands frequently lay along the green peak, the red edge and the near infrared. Additionally, DA found that the chlorophyll absorption red band and the yellow band were also important for species classification. In terms of 3D data, the DSM enhanced the discriminant capacity among low plants, arbors and mangroves. Meanwhile, the DSM largely reduced misclassification due to shadow effects and inter-species morphological variation. With respect to the classifier, the nonparametric SVM outperformed MLC for high-dimensional and multi-source data in this study. The SVM classifier tended to produce higher overall accuracy and fewer scattered patches, although it cost more time than MLC. The best result was obtained by combining MNF components and the DSM in the SVM classifier. This study offers a precise species distribution survey solution for inaccessible wetland areas at low cost in time and labour. In addition, the positive effect of the DSM and the identified spectral features indicate that UAV-borne hyperspectral imagery and photogrammetry-derived 3D data are promising for further research on wetland species, such as bio-parameter modelling and biological invasion monitoring.
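
A minimal sketch of the final classification step, assuming synthetic stand-ins for the 20 selected bands plus the DSM height (the MNF/DA feature reduction itself is not reproduced), could look like this with scikit-learn:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# Synthetic stand-ins: 20 selected spectral bands per pixel plus one DSM height value.
rng = np.random.default_rng(0)
n_pixels, n_bands = 500, 20
spectra = rng.random((n_pixels, n_bands))
dsm_height = rng.random((n_pixels, 1))
labels = rng.integers(0, 13, n_pixels)            # 13 wetland species classes

features = np.hstack([spectra, dsm_height])       # multi-source subset: bands + DSM
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
scores = cross_val_score(clf, features, labels, cv=5)
print("cross-validated accuracy:", scores.mean())
```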

Keywords: digital surface model (DSM), feature reduction, hyperspectral, photogrammetric point cloud, species mapping, unmanned aerial vehicle (UAV)

Procedia PDF Downloads 258
1540 Growth of New Media Advertising

Authors: Palwinder Bhatia

Abstract:

As is well known, new media is a broad term in media studies that emerged in the latter part of the 20th century and refers to on-demand access to content any time, anywhere, on any digital device, as well as interactive user feedback, creative participation and community formation around media content. The role of new media in advertising is indispensable these days, as it has become a cheap and effective way of advertising. Another important promise of new media is the democratization of the creation, publishing, distribution and consumption of media content. New media has brought a revolution to almost every field: it builds a bridge between customers and companies and has helped turn the world into a global village. Advertising helps shape consumer behaviour and affects consumer psychology, sociology, social anthropology and economics. People comment on and like particular brands on networking sites, which creates a strong impact on customer behaviour. A recent study by the Times of India shows that 64% of Facebook users have liked a brand on Facebook.

Keywords: film, visual, culture, media, advertisement

Procedia PDF Downloads 284
1539 Power Quality Audit Using Fluke Analyzer

Authors: N. Ravikumar, S. Krishnan, B. Yokeshkumar

Abstract:

Nowadays, power quality issues are increasing due to non-linear loads such as fridges, air conditioners, washing machines, induction motors, etc. These power quality issues affect the output voltage, output current and output power, and hence the overall performance of the generator. This paper explains how to test a generator using the Fluke 435 II series power quality analyser, which is used to measure voltage, current, power, energy, total harmonic distortion (THD), current harmonics, voltage harmonics, power factor and frequency. The Fluke 435 II series power quality analyser has several advantages: i) it records output in analog and digital format; ii) it records every 0.25 s; and iii) it measures all electrical parameters at once.
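
As a small illustration of the THD figure reported by such an analyser, the sketch below estimates THD from a sampled waveform with NumPy; the sampling rate, fundamental frequency and test signal are made up for the example.

```python
import numpy as np

def thd_percent(signal, fs, f0, n_harmonics=20):
    """Total harmonic distortion as a percentage of the fundamental,
    estimated from the FFT magnitude spectrum of a sampled waveform."""
    n = len(signal)
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(n)))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    def mag_at(f):
        return spectrum[np.argmin(np.abs(freqs - f))]
    fundamental = mag_at(f0)
    harmonics = [mag_at(k * f0) for k in range(2, n_harmonics + 1) if k * f0 < fs / 2]
    return 100.0 * np.sqrt(np.sum(np.square(harmonics))) / fundamental

fs, f0 = 10_000, 50
t = np.arange(0, 1, 1 / fs)
v = np.sin(2 * np.pi * f0 * t) + 0.05 * np.sin(2 * np.pi * 3 * f0 * t)  # 5% third harmonic
print(round(thd_percent(v, fs, f0), 2))   # approximately 5.0
```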

Keywords: THD, harmonics, power quality, TNEB, Fluke 435

Procedia PDF Downloads 178
1538 Secure Proxy Signature Based on Factoring and Discrete Logarithm

Authors: H. El-Kamchouchi, Heba Gaber, Fatma Ahmed, Dalia H. El-Kamchouchi

Abstract:

A digital signature is an electronic form of signature used by an original signer to sign a specific document. When the original signer is away from the office or travelling, he/she delegates signing capability to a proxy signer, and the proxy signer then generates a signature on a message on behalf of the original signer. The two parties must be able to authenticate one another and agree on a secret encryption key in order to communicate securely over an unreliable public network. Authenticated key agreement protocols play an important role in building secure communications between the two parties. In this paper, we present a secure proxy signature scheme built on an efficient and secure authenticated key agreement protocol based on the factoring and discrete logarithm problems.
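
For background only, the toy example below shows a textbook Schnorr-style signature over a small discrete-log group; it is not the authors' combined factoring-and-discrete-logarithm proxy scheme, and the parameters are far too small for any real security.

```python
import hashlib
import secrets

# Toy discrete-log group: q divides p - 1 (606 = 6 * 101); g generates the order-q subgroup.
p, q = 607, 101
g = pow(2, (p - 1) // q, p)

def H(r, m):
    return int(hashlib.sha256(f"{r}|{m}".encode()).hexdigest(), 16) % q

def keygen():
    x = secrets.randbelow(q - 1) + 1          # private key
    return x, pow(g, x, p)                    # (private, public)

def sign(x, m):
    k = secrets.randbelow(q - 1) + 1
    r = pow(g, k, p)
    e = H(r, m)
    return e, (k + x * e) % q

def verify(y, m, sig):
    e, s = sig
    r = (pow(g, s, p) * pow(y, (q - e) % q, p)) % p   # g^s * y^(-e) = g^k
    return H(r, m) == e

x, y = keygen()
assert verify(y, "delegation warrant", sign(x, "delegation warrant"))
```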

Keywords: discrete logarithm, factoring, proxy signature, key agreement

Procedia PDF Downloads 312
1537 The Display of Environmental Information to Promote Energy Saving Practices: Evidence from a Massive Behavioral Platform

Authors: T. Lazzarini, M. Imbiki, P. E. Sutter, G. Borragan

Abstract:

While several strategies, such as the development of more efficient appliances, the financing of insulation programs or the rolling out of smart meters, represent promising tools to reduce future energy consumption, their implementation relies on people's decisions and actions. Likewise, engaging with consumers to reshape their behavior has been shown to be another important way to reduce energy usage. For these reasons, integrating the human factor in the energy transition has become a major objective for researchers and policymakers. Digital education programs based on tangible and gamified user interfaces have become a new tool with the potential to reduce energy consumption. The B2020 program, developed by the firm “Économie d’Énergie SAS”, proposes a digital platform to encourage pro-environmental behavior change among employees and citizens. The platform integrates 160 eco-behaviors that help save energy and water and reduce waste and CO2 emissions. A total of 13,146 citizens have used the tool so far to declare the range of eco-behaviors they adopt in their daily lives. The present work builds on this database to identify the potential impact of the adopted energy-saving behaviors (n=62) on reducing energy use in buildings. To this end, behaviors were classified into three categories according to the nature of their implementation (Eco-habits, e.g., turning off the lights; Eco-actions, e.g., installing low-carbon technology such as LED light bulbs; and Home-Refurbishments, e.g., wall insulation or double-glazed energy-efficient windows). General Linear Models (GLM) disclosed a significantly higher frequency of Eco-habits compared with the number of Home-Refurbishments realized by the platform users. While this might be explained in part by the high financial costs associated with home renovation works, it also contrasts with the up to three times larger energy savings that can be accomplished by these means. Furthermore, multiple regression models failed to disclose the expected relationship between energy savings and the frequency of adopted eco-behaviors, suggesting that energy-related practices are not necessarily driven by the corresponding energy savings. Finally, our results also suggested that people adopting more Eco-habits and Eco-actions were more likely to engage in Home-Refurbishments. Altogether, these results fit well with a growing body of scientific research showing that energy-related practices do not necessarily maximize utility, as postulated by traditional economic models, and suggest that other variables might be triggering them. Promoting home refurbishments could therefore benefit from the adoption of complementary energy-saving habits and actions.
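
As an illustration of the kind of model comparison described (not the actual B2020 analysis), a Poisson GLM contrasting declaration counts across the three behaviour categories could be sketched with statsmodels as follows, using made-up numbers:

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Placeholder counts of declared behaviours per user and category.
df = pd.DataFrame({
    "category": ["eco_habit"] * 5 + ["eco_action"] * 5 + ["refurbishment"] * 5,
    "n_declared": [12, 9, 15, 11, 10, 6, 4, 7, 5, 6, 1, 0, 2, 1, 1],
})

# Poisson GLM: does declaration frequency differ across behaviour categories?
model = smf.glm("n_declared ~ C(category)", data=df,
                family=sm.families.Poisson()).fit()
print(model.summary())
```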

Keywords: energy-saving behavior, human performance, behavioral change, energy efficiency

Procedia PDF Downloads 203
1536 A High Content Screening Platform for the Accurate Prediction of Nephrotoxicity

Authors: Sijing Xiong, Ran Su, Lit-Hsin Loo, Daniele Zink

Abstract:

The kidney is a major target for toxic effects of drugs, industrial and environmental chemicals and other compounds. Typically, nephrotoxicity is detected late during drug development, and regulatory animal models could not solve this problem. Validated or accepted in silico or in vitro methods for the prediction of nephrotoxicity are not available. We have established the first and currently only pre-validated in vitro models for the accurate prediction of nephrotoxicity in humans and the first predictive platforms based on renal cells derived from human pluripotent stem cells. In order to further improve the efficiency of our predictive models, we recently developed a high content screening (HCS) platform. This platform employed automated imaging in combination with automated quantitative phenotypic profiling and machine learning methods. 129 image-based phenotypic features were analyzed with respect to their predictive performance in combination with 44 compounds with different chemical structures that included drugs, environmental and industrial chemicals and herbal and fungal compounds. The nephrotoxicity of these compounds in humans is well characterized. A combination of chromatin and cytoskeletal features resulted in high predictivity with respect to nephrotoxicity in humans. Test balanced accuracies of 82% or 89% were obtained with human primary or immortalized renal proximal tubular cells, respectively. Furthermore, our results revealed that a DNA damage response is commonly induced by different PTC-toxicants with diverse chemical structures and injury mechanisms. Together, the results show that the automated HCS platform allows efficient and accurate nephrotoxicity prediction for compounds with diverse chemical structures.
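
A schematic of the classification-and-scoring step, with synthetic stand-ins for the 44 compounds and 129 phenotypic features (the actual feature extraction and classifier may differ), might look like this in scikit-learn:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-ins: 44 compounds x 129 image-derived phenotypic features,
# with binary nephrotoxic / non-nephrotoxic labels, scored by balanced accuracy.
rng = np.random.default_rng(1)
X = rng.random((44, 129))
y = rng.integers(0, 2, 44)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="balanced_accuracy")
print("balanced accuracy:", scores.mean())
```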

Keywords: high content screening, in vitro models, nephrotoxicity, toxicity prediction

Procedia PDF Downloads 314
1535 A New Microstrip Diplexer Using Coupled Stepped Impedance Resonators

Authors: A. Chinig, J. Zbitou, A. Errkik, L. Elabdellaoui, A. Tajmouati, A. Tribak, M. Latrach

Abstract:

This paper presents a new microstrip diplexer structure based on band pass filters (BPFs) using coupled stepped impedance resonators. Each filter consists of two coupled stepped impedance resonators connected to microstrip feed lines, and a coupled junction is utilized to connect the two BPFs to the antenna. The two band pass filters are designed and simulated to operate in the digital communication system (DCS) and Industrial, Scientific and Medical (ISM) bands at 1.8 GHz and 2.45 GHz, respectively. The proposed circuit presents good performance, with an insertion loss lower than 2.3 dB and isolation between the two channels greater than 21 dB. The prototype of the optimized diplexer has been investigated numerically using Agilent ADS and verified with CST microwave software.

Keywords: band pass filter, coupled junction, coupled stepped impedance resonators, diplexer, insertion loss, isolation

Procedia PDF Downloads 434
1534 Sensor Registration in Multi-Static Sonar Fusion Detection

Authors: Longxiang Guo, Haoyan Hao, Xueli Sheng, Hanjun Yu, Jingwei Yin

Abstract:

In order to prevent target splitting and ensure fusion accuracy, system error registration is an important step in a multi-static sonar fusion detection system. To eliminate the inherent system errors of each sonar, including distance error and angle error, this paper uses an offline estimation method for error registration. Suppose several sonars from different platforms work together to detect a target; the target position detected by each sonar is expressed in that sonar's own reference coordinate system. Based on the two-dimensional stereo projection method, this paper uses the real-time quality control (RTQC) method and the least squares (LS) method to estimate sensor biases. The RTQC method takes the average value of each sonar's data as the observation value, while the LS method applies least-squares processing to each sonar's data to obtain the observation value. A MATLAB simulation is carried out in an underwater acoustic environment, and the simulation results show that both algorithms can estimate the distance and angle errors of the sonar system. The performance of the two algorithms is also compared through the root mean square error, and the influence of measurement noise on registration accuracy is explored by simulation. The system error of the RTQC method converges rapidly, but the distribution of targets has a serious impact on its performance. The LS method is not affected by target distribution, but increasing random noise slows its convergence. The LS method is an improvement on the RTQC method, which is widely used in two-dimensional registration, and the improved method can be used for registration in underwater multi-target detection.
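
A much-simplified sketch of offline bias estimation by least squares is shown below; it assumes the projected true target positions are known and estimates a constant range and bearing bias for a single sonar, which is far simpler than the coupled multi-sonar geometry treated in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
true_range = rng.uniform(500, 2000, 50)
true_bearing = rng.uniform(-np.pi, np.pi, 50)
range_bias, bearing_bias = 25.0, np.deg2rad(1.5)        # unknown system errors (simulated)

meas_range = true_range + range_bias + rng.normal(0, 5, 50)
meas_bearing = true_bearing + bearing_bias + rng.normal(0, np.deg2rad(0.2), 50)

# Observation model: measurement - truth = bias + noise, so the design matrix is a column of ones.
A = np.ones((50, 1))
est_range_bias, *_ = np.linalg.lstsq(A, meas_range - true_range, rcond=None)
est_bearing_bias, *_ = np.linalg.lstsq(A, meas_bearing - true_bearing, rcond=None)
print(est_range_bias[0], np.rad2deg(est_bearing_bias[0]))
```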

Keywords: data fusion, multi-static sonar detection, offline estimation, sensor registration problem

Procedia PDF Downloads 170
1533 The Antecedents of Continued Usage on Social-Oriented Virtual Communities Based on Automaticity Mechanism

Authors: Hsiu-Hua Cheng

Abstract:

In recent years, the number of users of social-oriented virtual communities has increased significantly, and corporate investment in advertising on such communities is growing quickly. Given the gigantic commercial value of the digital market, competition between virtual communities is keen. In this context, how to retain existing customers and keep them using social-oriented virtual communities is an urgent issue for virtual community managers. This study employs the perspective of the automaticity mechanism and combines social embeddedness theory with the literature on involvement and habit in order to explore the antecedents of users' continued usage of social-oriented virtual communities. The results can serve as a reference for scholars and managers of social-oriented virtual communities.

Keywords: continued usage, habit, social embeddedness, involvement, virtual community

Procedia PDF Downloads 426
1532 MONDO Neutron Tracker Characterisation by Means of Proton Therapeutic Beams and Monte Carlo Simulation Studies

Authors: G. Traini, V. Giacometti, R. Mirabelli, V. Patera, D. Pinci, A. Sarti, A. Sciubba, M. Marafini

Abstract:

The MONDO (MOnitor for Neutron Dose in hadrOntherapy) project aims at a precise characterisation of the secondary fast and ultrafast neutrons produced in particle therapy treatments. The detector is composed of a matrix of scintillating fibres (250 µm) read out by CMOS digital-SPAD-based sensors. Recoil protons from n-p elastic scattering are detected and used to track neutrons. A prototype was tested with proton beams at the Trento Proton Therapy Centre: efficiency, light yield and track-reconstruction capability were studied. The results of a Monte Carlo FLUKA simulation used to evaluate the double-scattering efficiency and the expected backgrounds will be presented.

Keywords: secondary neutrons, particle therapy, tracking, elastic scattering

Procedia PDF Downloads 268
1531 Impact of the Fourth Industrial Revolution on Food Security in South Africa

Authors: Fiyinfoluwa Giwa, Nicholas Ngepah

Abstract:

This paper investigates the relationship between the Fourth Industrial Revolution and food security in South Africa. Ordinary Least Squares (OLS) estimation was applied to quarterly data from 2012 Q1 to 2021 Q4. The study used artificial intelligence investment and the food production index as the measures of the Fourth Industrial Revolution and food security, respectively. Findings reveal a significant and positive coefficient of 0.2887, signifying a robust statistical relationship between AI adoption and the food production index. As a policy recommendation, this paper recommends introducing incentives for farmers and agricultural enterprises to adopt AI technologies, and expanding digital connectivity and access to technology in rural areas.
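
A sketch of the estimation step with placeholder quarterly series (the paper's actual data are not reproduced) could be written with statsmodels as follows:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Placeholder quarterly data: 2012Q1-2021Q4 gives 40 observations.
rng = np.random.default_rng(2)
ai_investment = np.linspace(1.0, 5.0, 40) + rng.normal(0, 0.2, 40)
food_production_index = 90 + 3.0 * ai_investment + rng.normal(0, 1.0, 40)

X = sm.add_constant(pd.DataFrame({"ai_investment": ai_investment}))
ols = sm.OLS(food_production_index, X).fit()
print(ols.params)     # slope estimates the AI-investment effect on the index
print(ols.pvalues)
```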

Keywords: Fourth Industrial Revolution, food security, artificial intelligence investment, food production index, ordinary least square

Procedia PDF Downloads 76
1530 Evaluation of Model-Based Code Generation for Embedded Systems–Mature Approach for Development in Evolution

Authors: Nikolay P. Brayanov, Anna V. Stoynova

Abstract:

The model-based development approach is gaining support and acceptance. Its higher abstraction level simplifies system description, allowing domain experts to do their best work without specialist programming knowledge. The different levels of simulation support rapid prototyping and verifying and validating the product even before it exists physically. Nowadays, the model-based approach is beneficial for modelling complex embedded systems as well as for generating code for many different hardware platforms. Moreover, it can be applied in safety-relevant industries like automotive, which brings extra automation to the expensive device certification process and especially to software qualification. Some companies using it report cost savings and quality improvements, while others claim no major changes or even cost increases. This publication examines the level of maturity and autonomy of the model-based approach for code generation. It is based on a real-life automotive seat heater (ASH) module, developed using The MathWorks, Inc. tools. The model, created with Simulink, Stateflow and MATLAB, is used for automatic generation of C code with Embedded Coder. To prove the maturity of the process, the Code Generation Advisor is used for automatic configuration, and all additional configuration parameters are set to auto, when applicable, leaving the generation process to function autonomously. As a result of the investigation, the publication compares the quality of the automatically generated embedded code with manually developed code. The measurements show that, in general, the code generated by the automatic approach is not worse than the manual one. A deeper analysis of the technical parameters enumerates the disadvantages, some of which are identified as topics for our future work.

Keywords: embedded code generation, embedded C code quality, embedded systems, model-based development

Procedia PDF Downloads 245
1529 Building Knowledge Partnership for Collaborative Learning in Higher Education – An On-Line ‘Eplanete’ Knowledge Mediation Platform

Authors: S. K. Ashiquer Rahman

Abstract:

This paper presents a knowledge mediation platform, “ePLANETe Blue”, that addresses the challenge of building knowledge partnerships for higher education. The purpose is to present, from an institutional perspective, the ‘ePLANETe’ concept and functionalities as a practical and pedagogical innovation programme contributing to collaborative learning goals in higher education. Consequently, the set of functionalities now amalgamated in ‘ePLANETe’ can be seen as an investigation of the challenges of the “collaborative learning digital process”. The system can be exploited to facilitate collaborative education, research and student learning in higher education. Moreover, the platform is expected to support the identification of best practices at explicit levels of action and to inspire knowledge interactions in a “virtual community”, and thus to advance deliberation and learning evaluation in higher education through engagement in collaborative activities of different sorts.

Keywords: mediation, collaboration, deliberation, evaluation

Procedia PDF Downloads 145
1528 Word Spotting in Images of Handwritten Historical Documents

Authors: Issam Ben Jami

Abstract:

Information retrieval in digital libraries is very important because famous historical documents hold significant value. Word spotting in historical documents is a very difficult problem because such documents are naturally cursive and exhibit wide variability in the scale and translation of words within the same document. We first present a system for automatic recognition based on the extraction of interest points from word images. The key-point extraction phase builds, from the image, a synthetic description of the shape in a multidimensional space. To this end, we use advanced methods that can find and describe interest points invariant to scale, rotation and lighting, linked to local configurations of pixels. We test this approach on documents of the 15th century, and our experiments give important results.
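
As an indicative sketch of interest-point matching between two word images, the snippet below uses ORB keypoints in OpenCV as a stand-in for the scale-, rotation- and illumination-invariant descriptors mentioned above; the file names and the match threshold are assumptions.

```python
import cv2

# Placeholder file names for a query word and a candidate word image.
word_a = cv2.imread("word_query.png", cv2.IMREAD_GRAYSCALE)
word_b = cv2.imread("word_candidate.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=500)
kp_a, desc_a = orb.detectAndCompute(word_a, None)
kp_b, desc_b = orb.detectAndCompute(word_b, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(desc_a, desc_b), key=lambda m: m.distance)

# A simple word-spotting score: the number of sufficiently close descriptor matches.
good = [m for m in matches if m.distance < 40]
print(f"{len(good)} good keypoint matches")
```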

Keywords: feature matching, historical documents, pattern recognition, word spotting

Procedia PDF Downloads 276
1527 Development of New Localized Surface Plasmon Resonance Interfaces Based on ITO Au NPs/ Polymer for Nickel Detection

Authors: F. Z. Tighilt, N. Belhaneche-Bensemra, S. Belhousse, S. Sam, K. Lasmi, N. Gabouze

Abstract:

Recently, gold nanoparticles (Au NPs) have become an active multidisciplinary research topic. First, Au thin films fabricated from alkylthiol-functionalized Au NPs were found to have vapor-sensitive conductivities; hence, they have been widely investigated as electrical chemiresistors for sensing different vapor analytes and even organic molecules in aqueous solutions. Second, Au thin films were demonstrated to have special localized surface plasmon resonances (LSPR), such that highly ordered 2D Au superlattices showed strong collective LSPR bands due to the near-field coupling of adjacent nanoparticles and were employed to detect biomolecular binding. In particular, when alkylthiol ligands were replaced by thiol-terminated polymers, the resulting polymer-modified Au NPs could be readily assembled into 2D nanostructures on solid substrates. Monolayers of polystyrene-coated Au NPs showed typical dipolar near-field interparticle plasmon coupling of LSPR. Such polymer-modified Au nanoparticle films have the advantage that the polymer thickness can be feasibly controlled by changing the polymer molecular weight. In this article, the effect of tin-doped indium oxide (ITO) coatings on the plasmonic properties of ITO interfaces modified with gold nanostructures (Au NSs) is investigated. The interest in developing ITO overlayers is manifold. The presence of a conducting ITO overlayer creates an LSPR-active interface, which can simultaneously serve as a working electrode in an electrochemical setup. The surface of ITO/Au NPs contains hydroxyl groups that can be used to link functional groups to the interface. Here, the covalent linking of nickel/Au NSs/ITO hybrid LSPR platforms will be presented.

Keywords: conducting polymer, metal nanoparticles (NPs), LSPR, poly (3-(pyrrolyl)–carboxylic acid), polypyrrole

Procedia PDF Downloads 268
1526 High Capacity Reversible Watermarking through Interpolated Error Shifting

Authors: Hae-Yeoun Lee

Abstract:

Reversible watermarking, which not only protects the copyright but also preserves the original quality of the digital content, has been intensively studied, and the demand for it has increased. In this paper, we propose a reversible watermarking scheme based on interpolation-error shifting and error precompensation. The intensity of a pixel is interpolated from the intensities of neighbouring pixels, and the difference histogram between the interpolated and the original intensities is obtained and modified to embed the watermark message. By restoring the difference histogram, the embedded watermark is extracted, and the original image is recovered by compensating for the interpolation error. Overflow and underflow are prevented by error precompensation. To show the performance of the method, the proposed algorithm is compared with other methods using various test images.
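
A minimal sketch of the embedding side of such a scheme is given below; it restricts embedding to checkerboard sites predicted from untouched neighbours so the operation stays invertible, and it omits the paper's error precompensation for overflow and underflow.

```python
import numpy as np

def embed(img, bits):
    """Histogram-shifting sketch on a grayscale (2-D) image: pixels on 'odd'
    checkerboard sites are predicted from four 'even' neighbours, which are left
    unmodified so extraction can recompute the same prediction. Positive errors
    are shifted up by 1; zero errors carry one watermark bit each."""
    p = img.astype(np.int32)
    h, w = p.shape
    out = p.copy()
    bit_iter = iter(bits)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            if (i + j) % 2 == 0:
                continue                                   # reference sites stay untouched
            pred = (p[i - 1, j] + p[i + 1, j] + p[i, j - 1] + p[i, j + 1]) // 4
            err = p[i, j] - pred
            if err > 0:
                out[i, j] += 1                             # shift positive errors up
            elif err == 0:
                out[i, j] += next(bit_iter, 0)             # error 0 encodes the next bit
    return out.clip(0, 255).astype(np.uint8)
```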

Keywords: reversible watermarking, high capacity, high quality, interpolated error shifting, error precompensation

Procedia PDF Downloads 325
1525 Strengthening of Reinforced Concrete Beam-Column Joint by Reversible Mixed Technologies of FRP

Authors: Nasser-Eddine Attari

Abstract:

After an earthquake, many structures are classified as slightly damaged and, since it is uneconomic to replace them, at least in the short term, suitable means of repairing the beam-column joint area are being studied. Furthermore, a large number of buildings need retrofitting of their joints before the next earthquake. The paper reports the results of an experimental programme consisting of three reinforced concrete beam-column joints at a scale of one to three (1/3), tested under a pre-stressed axial load acting on the column. The beams were subjected at their ends to alternating cyclic loading under displacement control to simulate seismic action. Strain and cracking fields were monitored with the help of a digital recording camera. Following the analysis of the results, a comparison is made between the performances of the different strengthening solutions considered in terms of ductility, strength and mode of failure.

Keywords: fibre-reinforced polymers, joints, reinforced concrete, beam-columns

Procedia PDF Downloads 502
1524 Hardware Implementation on Field Programmable Gate Array of Two-Stage Algorithm for Rough Set Reduct Generation

Authors: Tomasz Grzes, Maciej Kopczynski, Jaroslaw Stepaniuk

Abstract:

The rough sets theory developed by Prof. Z. Pawlak is one of the tools that can be used in intelligent systems for data analysis and processing. Banking, medicine, image recognition and security are among the possible fields of application. In all these fields, the amount of collected data is increasing quickly, and with this increase the computation speed becomes the critical factor. Data reduction is one of the solutions to this problem, and in rough sets the redundancy can be removed with the reduct. Many algorithms for generating the reduct have been developed, but most of them are only software implementations and therefore have many limitations: a microprocessor uses a fixed word length and consumes a lot of time both fetching and processing instructions and data, so software-based implementations are relatively slow. Hardware systems do not have these limitations and can process the data faster than software. A reduct is a subset of the condition attributes that provides the discernibility of the objects; for a given decision table, there can be more than one reduct. The core is the set of all indispensable condition attributes: none of its elements can be removed without affecting the classification power of all condition attributes, and every reduct contains all the attributes from the core. In this paper, a hardware implementation of a two-stage greedy algorithm to find one reduct is presented. The decision table is used as input, and the output of the algorithm is a superreduct, i.e., a reduct with some additional removable attributes. The first stage of the algorithm calculates the core using the discernibility matrix. The second stage generates the superreduct by enriching the core with the most common attributes, i.e., attributes that are more frequent in the decision table. The algorithm described above has two disadvantages: i) it generates a superreduct instead of a reduct, and ii) the additional first stage may be unnecessary if the core is empty. For systems focused on fast computation of the reduct, however, the first disadvantage is not a key problem. The core calculation can be achieved with a combinational logic block and thus adds relatively little time to the whole process. The algorithm presented in this paper was implemented on a Field Programmable Gate Array (FPGA) as a digital device consisting of blocks that process the data in a single step. The core is calculated by comparators connected to a block called a 'singleton detector', which detects whether the input word contains only a single 'one'. The number of occurrences of each attribute is calculated in a combinational block made up of a cascade of adders. The superreduct generation process is iterative and thus needs a sequential circuit for controlling the calculations. For research purposes, the algorithm was also implemented in the C language and run on a PC, and the execution times of the reduct calculation in hardware and software were compared. The results show an increase in the speed of data processing.
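
A software sketch of the two-stage idea (core from singleton discernibility entries, then greedy enrichment) is shown below in Python; attribute frequency is counted over the still-uncovered discernibility sets, which is one plausible reading of the "most common attributes" rule.

```python
from collections import Counter

def discernibility_sets(rows, decision_idx=-1):
    """Attribute sets distinguishing each pair of objects with different decisions."""
    sets = []
    n_cond = len(rows[0]) - 1
    for i in range(len(rows)):
        for j in range(i + 1, len(rows)):
            if rows[i][decision_idx] != rows[j][decision_idx]:
                diff = frozenset(a for a in range(n_cond) if rows[i][a] != rows[j][a])
                if diff:
                    sets.append(diff)
    return sets

def two_stage_superreduct(rows):
    sets = discernibility_sets(rows)
    # Stage 1: the core is every attribute that appears as a singleton entry.
    chosen = {next(iter(s)) for s in sets if len(s) == 1}
    # Stage 2: greedily add the most frequent attribute among still-uncovered sets.
    uncovered = [s for s in sets if not s & chosen]
    while uncovered:
        counts = Counter(a for s in uncovered for a in s)
        chosen.add(counts.most_common(1)[0][0])
        uncovered = [s for s in uncovered if not s & chosen]
    return chosen

# Tiny decision table: two condition attributes and a decision in the last column.
table = [("sunny", "hot", "no"), ("sunny", "mild", "yes"), ("rain", "hot", "yes")]
print(two_stage_superreduct(table))
```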

Keywords: data reduction, digital systems design, field programmable gate array (FPGA), reduct, rough set

Procedia PDF Downloads 220
1523 Deployment of Matrix Transpose in Digital Image Encryption

Authors: Okike Benjamin, Garba E J. D.

Abstract:

Encryption is used to conceal information from prying eyes. Presently, information and data encryption are common due to the volume of data and information in transit across the globe on a daily basis. Image encryption has yet to receive the attention from researchers that it deserves; in other words, video and multimedia documents are exposed to unauthorized access. The authors propose image encryption using matrix transposition, and an algorithm that performs this image encryption is developed. In the proposed technique, the image to be encrypted is split into parts based on the image size, and each part is encrypted separately using matrix transposition. The actual encryption operates on the picture elements (pixels) that make up the image. After encrypting each part of the image, the positions of the encrypted parts are swapped before the image is transmitted. Swapping the positions of the parts makes the encrypted image more robust against decryption by a cryptanalyst.
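
A toy sketch of the described block-transpose-and-swap idea for a grayscale image is given below; the block size and the pseudo-random permutation seed stand in for key material, and this is an illustration of the scrambling idea rather than a cryptographically secure cipher.

```python
import numpy as np

def scramble(img, block=64, seed=7):
    """Split a grayscale (2-D) image into square blocks, transpose each block,
    then permute the block positions. Inverting the permutation and transposing
    again recovers the original image."""
    h, w = img.shape
    assert h % block == 0 and w % block == 0, "pad the image to a multiple of block"
    blocks = [np.transpose(img[r:r + block, c:c + block])
              for r in range(0, h, block) for c in range(0, w, block)]
    rng = np.random.default_rng(seed)          # the seed stands in for a shared secret key
    order = rng.permutation(len(blocks))
    out = np.zeros_like(img)
    for k, idx in enumerate(order):
        r, c = divmod(k, w // block)
        out[r * block:(r + 1) * block, c * block:(c + 1) * block] = blocks[idx]
    return out, order                          # 'order' is needed for decryption
```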

Keywords: image encryption, matrices, pixel, matrix transpose

Procedia PDF Downloads 423
1522 Catalyzing Agricultural Technology Adoption: The Role of Socioeconomic and Institutional Factors in Developing Economies

Authors: Faiza Manzoor

Abstract:

The main purpose of this study is to examine the socioeconomic and institutional determinants that affect farmers' probability of adopting agricultural technology. Primary survey data were gathered from 350 small household farmers in developing economies, and the empirical analysis was carried out using logit and probit models. The outcomes of this research emphasize that socioeconomic factors such as farmers' age, education level and farm size shape farmers' behavior in adopting digital farming technology. The results also show that cooperative membership and institutional characteristics such as access to credit and extension services affect small farmers' use and adoption of agricultural technologies. Digitalization further helps farmers by increasing their understanding and improving their decision-making abilities. Some policy implications and future directions are discussed.
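
A sketch of the adoption model, with synthetic survey-style variables standing in for the actual 350 observations, could be estimated with statsmodels as follows:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic survey-style data: adoption driven (for illustration) by education and credit access.
rng = np.random.default_rng(4)
n = 350
df = pd.DataFrame({
    "age": rng.integers(20, 70, n),
    "education_years": rng.integers(0, 16, n),
    "farm_size_ha": rng.uniform(0.5, 10, n),
    "credit_access": rng.integers(0, 2, n),
    "extension_contact": rng.integers(0, 2, n),
})
index = -2 + 0.1 * df["education_years"] + 0.8 * df["credit_access"]
df["adopted"] = (rng.random(n) < 1 / (1 + np.exp(-index))).astype(int)

X = sm.add_constant(df[["age", "education_years", "farm_size_ha",
                        "credit_access", "extension_contact"]])
logit_res = sm.Logit(df["adopted"], X).fit(disp=0)
probit_res = sm.Probit(df["adopted"], X).fit(disp=0)
print(logit_res.params)
```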

Keywords: agricultural technology adoption, socioeconomic factors, institutional factors

Procedia PDF Downloads 4
1521 Investigating the Suitability of Utilizing Lyophilized Gels to Improve the Stability of Ufasomes

Authors: Mona Hassan Aburahma, Alaa Hamed Salama

Abstract:

Ufasomes (“unsaturated fatty acid liposomes”) are unique nano-sized self-assembled bilayered vesicles that can be easily created from readily available unsaturated fatty acids. Ufasomes form through the weak associative interaction of fully ionized and un-ionized fatty acids into bilayer structures. In these constructs, the fatty acid molecules are oriented with their hydrocarbon tails directed toward the membrane interior while the carboxyl groups are in contact with water. Although ufasomes can be employed as a safe vesicular carrier for drugs, the extreme instability of their aqueous dispersions hinders their effective use in the drug delivery field. Accordingly, in our study, lyophilized gels containing ufasomes were prepared, using a simple assembling technique, from readily available oleic acid in order to overcome the colloidal instability of ufasome dispersions and convert them into accurate unit dosage forms. The influence of changing the cholesterol percentage relative to oleic acid on the ufasome vesicles was investigated using a factorial design. The optimized oleic acid ufasomes comprised nanoscaled spherical vesicles. Scanning electron micrographs of the lyophilized gels revealed that the included ufasomes were intact, non-aggregating and preserved their spherical morphology. Rheological characterization (viscosity and shear stress versus shear rate) of the reconstituted ufasomal lyophilized gel confirmed its ease of application. The capability of the ufasomes included in the gel to penetrate deep through the mucosal layers was illustrated using ex-vivo confocal laser imaging, thereby highlighting the feasibility of stabilizing ufasomes using lyophilized gel platforms.

Keywords: ufasomes, lyophilized gel, confocal scanning microscopy, rheological characterization, oleic acid

Procedia PDF Downloads 409
1520 Digital Divide and Its Impact on the Students’ Performance

Authors: Aissa Hanifi

Abstract:

People across different societies use information and communication technology (ICT) for different purposes. Unfortunately, in contemporary societies, some people have little access to ICT and thus cannot participate in society as effectively as those with better access. The purpose of this study is to test the impact of ICTs on university life in general and students' performance in particular. The study relied on an online survey questionnaire administered to 30 undergraduate students at Chef University. The findings of the survey revealed that a significant number of students still do not have easy access to ICT. Such limited access to ICTs is attributed to various factors: some students live in rural areas where, due to poor internet coverage, they face difficulties competing with students who live in urban areas with better ICT access. The lack of ICT access has hindered the students' university performance in general, as well as their language skills and their exchange of information with teachers and classmates.

Keywords: access, communication, ICT, performance, technology

Procedia PDF Downloads 135
1519 An Integrated Framework for Engaging Stakeholders in the Circular Economy Processes Using Building Information Modeling and Virtual Reality

Authors: Erisasadat Sahebzamani, Núria Forcada, Francisco Lendinez

Abstract:

Global climate change has become increasingly problematic over the past few decades. The construction industry has contributed to greenhouse gas emissions in recent decades. Considering these issues and the high demand for materials in the construction industry, Circular Economy (CE) is considered necessary to keep materials in the loop and extend their useful lives. By providing tangible benefits, Construction 4.0 facilitates the adoption of CE by reducing waste, updating standard work, sharing knowledge, and increasing transparency and stability. This study aims to present a framework for integrating CE and digital tools like Building Information Modeling (BIM) and Virtual Reality (VR) to examine the impact on the construction industry based on stakeholders' perspectives.

Keywords: circular economy, building information modeling, virtual reality, stakeholder engagement

Procedia PDF Downloads 112
1518 Scintigraphic Image Coding of Region of Interest Based on SPIHT Algorithm Using Global Thresholding and Huffman Coding

Authors: A. Seddiki, M. Djebbouri, D. Guerchi

Abstract:

Medical imaging produces pictures of the human body in digital form. Since these imaging techniques produce prohibitive amounts of data, compression is necessary for storage and communication purposes. Many current compression schemes provide a very high compression rate but with considerable loss of quality. On the other hand, in some areas of medicine, it may be sufficient to maintain high image quality only in the region of interest (ROI). This paper discusses a contribution to lossless compression in the region of interest of scintigraphic images based on the SPIHT algorithm and global transform thresholding, using Huffman coding.
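
Of the pipeline above, only the final entropy-coding stage is easy to sketch briefly; the snippet below builds a Huffman code book from a stream of symbols (for example, quantized ROI coefficients), while the SPIHT and thresholding stages are not reproduced.

```python
import heapq
from collections import Counter

def huffman_codes(symbols):
    """Build a Huffman code book (symbol -> bit string) from a sequence of symbols."""
    freq = Counter(symbols)
    # heap entries: (frequency, unique tie-breaker, {symbol: partial code})
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                         # degenerate single-symbol case
        (_, _, table), = heap
        return {s: "0" for s in table}
    counter = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

codes = huffman_codes([3, 3, 3, 7, 7, 1])      # frequent symbols get the shortest codes
print(codes)
```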

Keywords: global thresholding transform, Huffman coding, region of interest, SPIHT coding, scintigraphic images

Procedia PDF Downloads 369
1517 Robotics Education Continuity from Diaper Age to Doctorate

Authors: Vesa Salminen, Esa Santakallio, Heikki Ruohomaa

Abstract:

Introduction: The city of Riihimäki has chosen robotics for well-being, services and industry as the main focus area of its ecosystem strategy. Robotics is going to be an important part of the everyday life of citizens and present in the working day of the average citizen and employee in the future. For that reason, the education system and education programmes at all levels, from diaper age to doctorate, have been directed to fulfil this ecosystem strategy. Goal: The objective of this activity has been to develop education continuity from diaper age to doctorate. The main target of the development activity is to create a unique robotics study entity that enables ongoing robotics studies from pre-primary education to university. The aim is also to attract students internationally and to supply the private sector with a skilled workforce capable of meeting the challenges of the future. Methodology: Educational institutions (high schools, upper secondary education, universities at all levels) across the Tavastia Province have gradually directed their education programmes to support this goal. In parallel, applied research projects have been created to run proof-of-concept phases in regional real-environment field labs, testing technology opportunities and the use of digitalization to change business processes by applying robotic solutions. Customer-oriented applied research projects offer students in robotics education learning environments in which to acquire new knowledge and content; they are also learning environments in which the education programmes themselves adapt and co-evolve. New content and problem-based learning are used in future education modules. Major findings: A joint robotics education entity is being developed in cooperation with the city of Riihimäki (primary education), Syria Education (secondary education) and HAMK (bachelor and master education). The education modules have been developed to enable smooth transitioning from one institute to another. This article introduces a case study of the change in well-being education driven by digitalization and robotics. Riihimäki's elderly citizens' service house, Riihikoti, has been working as a field lab for proof-of-concept phases testing technology opportunities, and following these successful case studies, education programmes at various levels have also been changing. Riihikoti has been developed as a physical learning environment for home care and robotics, investigating and developing a variety of digital devices and service opportunities and experimenting with and learning the use of the equipment. The environment enables the co-development of digital service capabilities in an authentic setting for all interested groups in transdisciplinary cooperation.

Keywords: ecosystem strategy, digitalization and robotics, education continuity, learning environment, transdisciplinary co-operation

Procedia PDF Downloads 179