Search results for: labeling complexity.
322 Simulation of Agri-Food Supply Chains
Authors: Sherine Beshara, Khaled S. El-Kilany, Noha M. Galal
Abstract:
Supply chain management has become more challenging with the emerging trends of globalization and sustainability. Lately, research related to perishable product supply chains, in particular agricultural food products, has emerged. This is attributed to the additional complexity of managing this type of supply chain, given the recently increased concern for public health, food quality, food safety, demand and price variability, and the limited lifetime of these products. Inventory management for agri-food supply chains is of vital importance due to product perishability and customers' demand for quality. This paper concentrates on developing a simulation model of a real-life case study of a two-echelon production-distribution system for agri-food products. The objective is to improve a set of performance measures by developing a simulation model that helps in evaluating and analysing the performance of these supply chains. Simulation results showed that the model can help improve overall system performance.
Keywords: Agri-food supply chains, inventory model, modelling and simulation, supply chain.
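To make the kind of model described concrete, the following is a minimal Python sketch of the perishable-inventory loop such a simulation is built on, assuming a FIFO issuing rule and a base-stock policy; the shelf life, order-up-to level and demand range are hypothetical placeholders, not the paper's actual two-echelon model.

```python
import random

SHELF_LIFE = 3          # days a unit stays sellable (hypothetical)
ORDER_UP_TO = 120       # order-up-to (base-stock) level (hypothetical)
DAYS = 365

random.seed(42)
stock = []              # remaining shelf life per unit, oldest first
lost_sales = spoiled = total_demand = 0

for day in range(DAYS):
    # Replenish up to the base-stock level; new units are freshest.
    stock.extend([SHELF_LIFE] * (ORDER_UP_TO - len(stock)))

    # Serve random daily demand, issuing the oldest (FIFO) units first.
    demand = random.randint(40, 140)
    total_demand += demand
    served = min(demand, len(stock))
    stock = stock[served:]
    lost_sales += demand - served

    # Age the remaining stock and discard expired units.
    stock = [life - 1 for life in stock]
    spoiled += sum(life <= 0 for life in stock)
    stock = [life for life in stock if life > 0]

print(f"spoiled: {spoiled}, fill rate: {1 - lost_sales / total_demand:.2%}")
```

Performance measures such as the fill rate and spoilage count printed here are the kind of outputs such a simulation uses to compare inventory policies.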
321 Phytoadaptation in Desert Soil Prediction Using Fuzzy Logic Modeling
Authors: S. Bouharati, F. Allag, M. Belmahdi, M. Bounechada
Abstract:
In terms of forecasting the ecological effects of desertification, the purpose of this study is to develop a predictive model of the growth and adaptation of species under arid environmental and bioclimatic conditions. The impact of climate change and the desertification phenomenon is the result of the combined effects of the magnitude and frequency of these phenomena. Since the data involved in the phytopathogenic process and bacterial growth in arid soil occur in an uncertain environment because of their complexity, a suitable methodology for the analysis of these variables becomes necessary, and the basic principles of fuzzy logic are perfectly suited to this process. As input variables, we consider the physical parameters, soil type, bacteria nature, and plant species concerned. The resulting output variable is the adaptability of the species, expressed by its growth rate or extinction. In conclusion, we present possible strategies for adaptation, with or without shifting plantation areas, and for selecting adequately adapted vegetation.
Keywords: Climate changes, dry soil, Phytopathogenicity, Predictive model, Fuzzy logic.
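As an illustration of the fuzzification step such a model rests on, here is a minimal Python sketch with triangular membership functions; the soil-moisture variable, breakpoints and reading are hypothetical, not values from the paper.

```python
import numpy as np

def triangular(x, a, b, c):
    # Degree of membership of x in a triangle with feet a, c and peak b.
    return float(np.clip(min((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0))

moisture = 18.0  # crisp soil-moisture input, in percent (hypothetical)
memberships = {
    "dry":      triangular(moisture, 0.0, 10.0, 25.0),
    "moderate": triangular(moisture, 15.0, 30.0, 45.0),
    "wet":      triangular(moisture, 35.0, 50.0, 65.0),
}
print(memberships)  # the reading is partly "dry", partly "moderate"
```

A fuzzy rule base would then combine such membership degrees across soil type, bacteria nature and species to produce the adaptability output.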
320 The Estimation of Human Vital Signs Complexity
Authors: L. Bikulciene, E. Venskaityte, G. Jarusevicius
Abstract:
Nonstationary and nonlinear signals generated by living complex systems defy traditional mechanistic approaches, which are based on homeostasis. Our previous studies have shown that evaluating the interactions of physiological signals with dedicated analysis methods is suitable for observing physiological processes. We demonstrate the possibility of using a deep physiological model, based on interpreting changes in the human body's functional states, combined with an analytical method based on matrix theory for physiological signal analysis, applied to high-risk cardiac patients. It is shown that the evaluation of cardiac signal interactions reveals functional changes, peculiar to each individual, at the onset of the hemodynamic restoration procedure. Therefore, we suggest that the assessment of alterations in the body's functional state after patients undergo surgery can be complemented by data obtained from the proposed evaluation of the interactions between functional variables.
Keywords: Cardiac diseases, Complex systems theory, ECG analysis, matrix analysis.
319 Estimation of Skew Angle in Binary Document Images Using Hough Transform
Authors: Nandini N., Srikanta Murthy K., G. Hemantha Kumar
Abstract:
This paper includes two novel techniques for skew estimation of binary document images. These algorithms are based on connected component analysis and the Hough transform, and both focus on reducing the amount of input data provided to the Hough transform. In the first method, referred to as the word centroid approach, the centroids of selected words are used for skew detection. In the second method, referred to as the dilate-and-thin approach, the selected characters are blocked and dilated to get word blocks, and thinning is then applied. The final image fed to the Hough transform contains the thinned coordinates of the word blocks in the image. The methods have been successful in reducing the computational complexity of Hough-transform-based skew estimation algorithms. Promising experimental results are also provided to prove the effectiveness of the proposed methods.
Keywords: Dilation, document processing, Hough transform, optical character recognition, skew estimation, thinning.
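A minimal sketch of the word-centroid idea in Python with OpenCV follows; it is a plausible reading of the approach rather than the authors' code, and the file name and thresholds are hypothetical.

```python
import cv2
import numpy as np

# Reduce the binary page to one centroid pixel per connected component,
# then run the standard Hough transform on that sparse image. The word
# selection heuristics of the paper are not reproduced here.
img = cv2.imread("page.png", cv2.IMREAD_GRAYSCALE)   # hypothetical input
binary = (img < 128).astype(np.uint8)                # ink pixels = 1

n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
sparse = np.zeros_like(binary)
for cx, cy in centroids[1:]:                         # label 0 is background
    sparse[int(cy), int(cx)] = 255

# Hough voting on the centroid image; the strongest line follows the
# text-line orientation, which gives the skew angle.
lines = cv2.HoughLines(sparse, 1, np.pi / 1800, 20)
if lines is not None:
    theta = lines[0][0][1]
    print(f"estimated skew: {theta * 180 / np.pi - 90:.2f} degrees")
```

Feeding only one point per component, instead of every ink pixel, is what cuts the Hough voting cost.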
318 Curriculum Based Measurement and Precision Teaching in Writing Empowerment Enhancement: Results from an Italian Learning Center
Authors: I. Pelizzoni, C. Cavallini, I. Salvaderi, F. Cavallini
Abstract:
We present the improvement in writing skills obtained by 94 participants (aged between six and ten years) with special educational needs through a writing enhancement program based on fluency principles. The study was planned and conducted with a single-subject experimental design for each of the participants, in order to confirm the results in the literature. The results were obtained using the precision teaching (PT) methodology to increase the number of written graphemes per minute, measured in the pre- and post-test by curriculum-based measurement (CBM). Results indicated an increase in the number of written graphemes for all participants. The average overall duration of the intervention was 144 minutes over five months of treatment. These considerations have been analyzed taking into account the complexity of implementing measurement systems in real operational contexts (an Italian learning center) and important aspects of the replicability and cost-effectiveness of such interventions.
Keywords: Precision teaching, writing skills, CBM, Italian Learning Center.
317 A Block Cipher for Resource-Constrained IoT Devices
Authors: Muhammad Rana, Quazi Mamun, Rafiqul Islam
Abstract:
In the Internet of Things (IoT), many devices are connected and accumulate a vast amount of data. These Internet-driven raw data need to be transferred securely to the end-users via dependable networks. Consequently, the challenges of IoT security in various IoT domains are paramount. Cryptography is applied to secure networks for authentication, confidentiality, data integrity and access control. However, due to the resource-constrained properties of IoT devices, conventional ciphers may not be suitable in all IoT networks. This paper designs a robust and effective lightweight cipher to secure the IoT environment and meet the resource-constrained nature of IoT devices. We propose a symmetric, block-cipher-based lightweight cryptographic algorithm. The proposed algorithm increases the complexity of the block cipher while maintaining the lowest computational requirements possible. It efficiently constructs the key-register updating technique, reduces the number of encryption rounds, and adds a layer between the encryption and decryption processes.
Keywords: Internet of Things, IoT, cryptography, block cipher, s-box, key management, IoT security.
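The ingredients named above (few rounds, a simple round function, a key register updated between rounds) can be illustrated with a toy Feistel cipher in Python; this sketch is not the authors' algorithm and is not secure for real use.

```python
MASK16 = 0xFFFF

def round_fn(half, subkey):
    # Cheap lightweight-style mixing: XOR, 16-bit rotate, add a constant.
    x = (half ^ subkey) & MASK16
    x = ((x << 3) | (x >> 13)) & MASK16
    return (x + 0x9E37) & MASK16

def next_subkey(k):
    # Key-register update between rounds: rotate and XOR a round constant.
    return (((k << 5) | (k >> 11)) & MASK16) ^ 0x2545

def encrypt(block32, key16, rounds=8):
    left, right = block32 >> 16, block32 & MASK16
    k = key16
    for _ in range(rounds):
        left, right = right, left ^ round_fn(right, k)   # Feistel step
        k = next_subkey(k)
    return (left << 16) | right

def decrypt(block32, key16, rounds=8):
    subkeys, k = [], key16
    for _ in range(rounds):
        subkeys.append(k)
        k = next_subkey(k)
    left, right = block32 >> 16, block32 & MASK16
    for k in reversed(subkeys):                          # undo each round
        left, right = right ^ round_fn(left, k), left
    return (left << 16) | right

c = encrypt(0xDEADBEEF, 0x1234)
assert decrypt(c, 0x1234) == 0xDEADBEEF
```

The Feistel structure decrypts with the same round function run in reverse key order, which is one reason the pattern suits devices that cannot afford separate encrypt and decrypt circuitry.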
316 A New Quantile Based Fuzzy Time Series Forecasting Model
Authors: Tahseen A. Jilani, Aqil S. Burney, C. Ardil
Abstract:
Time series models have been used to make predictions about academic enrollments, weather, road accident casualties, stock prices, etc. Based on the concepts of quantile regression models, we have developed a simple time-variant, quantile-based fuzzy time series forecasting method. The proposed method bases the forecast on a prediction of the future trend of the data. In place of the actual quantiles of the data at each point, we convert the statistical concept into a fuzzy one by using fuzzy quantiles derived from an ensemble of fuzzy membership functions. We define a fuzzy metric to apply the trend forecast and calculate the future value. The proposed model is applied to TAIFEX forecasting. It is shown that the proposed method works best among the compared models with respect to model complexity and forecasting accuracy.
Keywords: Quantile regression, fuzzy time series, fuzzy logical relationship groups, heuristic trend prediction.
315 Unscented Transformation for Estimating the Lyapunov Exponents of Chaotic Time Series Corrupted by Random Noise
Authors: K. Kamalanand, P. Mannar Jawahar
Abstract:
Many systems in the natural world exhibit chaos or nonlinear behavior, the complexity of which is so great that they appear to be random. Identification of chaos in experimental data is essential for characterizing the system and for analyzing the predictability of the data under analysis. The Lyapunov exponents provide a quantitative measure of the sensitivity to initial conditions and are the most useful dynamical diagnostic for chaotic systems. However, it is difficult to accurately estimate the Lyapunov exponents of chaotic signals that are corrupted by random noise. In this work, a method for estimating Lyapunov exponents from noisy time series using the unscented transformation is proposed. The proposed methodology was validated using time series obtained from known chaotic maps. The objective of the work, the proposed methodology and the validation results are discussed in detail.
Keywords: Lyapunov exponents, unscented transformation, chaos theory, neural networks.
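For the known-map validation step, the noise-free reference value can be computed with the standard analytic estimate for the logistic map; this is the textbook baseline, not the paper's unscented-transform estimator.

```python
import math

# Largest Lyapunov exponent of the logistic map x' = r x (1 - x):
# average log|f'(x)| = log|r (1 - 2x)| along the orbit.
r, x, n, acc = 4.0, 0.3, 100_000, 0.0
for _ in range(n):
    x = r * x * (1 - x)
    acc += math.log(abs(r * (1 - 2 * x)))
print(acc / n)   # ~0.693 = ln 2, the known value for r = 4
```

An estimator applied to a noisy version of the same orbit can then be judged by how close it comes to this reference value.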
314 Approximate Bounded Knowledge Extraction Using Type-I Fuzzy Logic
Authors: Syed Muhammad Aqil Burney, Tahseen Ahmed Jilani, C. Ardil
Abstract:
Using a neural network, we model an unknown function f from given input-output data pairs. The connection strength of each neuron is updated through learning. Repeated simulations of the crisp neural network produce different values of the weight factors, which are directly affected by changes in the various parameters. We propose the idea that, for each neuron in the network, we can obtain quasi-fuzzy weight sets (QFWS) through repeated simulation of the crisp neural network. This type of fuzzy weight function may be applied where we have multivariate crisp input that needs to be adjusted after iterative learning, as in claim amount distribution analysis. As real data are subject to noise and uncertainty, QFWS may help simplify such complex problems. Secondly, these QFWS provide a good initial solution for training fuzzy neural networks with reduced computational complexity.
Keywords: Crisp neural networks, fuzzy systems, extraction of logical rules, quasi-fuzzy numbers.
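A minimal sketch of the QFWS idea in Python: retrain the same crisp model several times, collect each weight across runs, and summarize it as a triangular fuzzy number (min, mean, max). The tiny linear "network" and the noise level are hypothetical stand-ins, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (200, 2))
y = 0.7 * X[:, 0] - 0.4 * X[:, 1]                # the unknown function f

def train(seed, lr=0.1, epochs=300):
    r = np.random.default_rng(seed)
    w = r.normal(size=2)                          # random initial weights
    yn = y + r.normal(scale=0.05, size=len(y))    # each run sees noisy data
    for _ in range(epochs):
        w -= lr * 2 * X.T @ (X @ w - yn) / len(X)  # gradient descent step
    return w

weights = np.array([train(s) for s in range(30)])  # 30 repeated simulations
for j in range(2):
    col = weights[:, j]
    print(f"w{j}: triangular QFWS ~ ({col.min():.3f}, "
          f"{col.mean():.3f}, {col.max():.3f})")
```

The (min, mean, max) triple per weight is one simple way to turn the spread of crisp weights across runs into a fuzzy number usable as an initial solution.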
313 A Framework to Support Reuse in Object-Oriented Software Development
Authors: Fathi Taibi
Abstract:
Reusability is a desired quality attribute of software products. Generally, it can be achieved by adopting development methods that promote it and by achieving software qualities that have been linked with high reusability proneness. With the exponential growth in mobile application development, software reuse has become an integral part of a substantial number of projects. Similarly, software reuse has become widely practiced in start-up companies. However, this has led to new problems: firstly, the reused code does not meet the required quality, and secondly, the reuse intentions are dubious. This work proposes a framework to support reuse in Object-Oriented (OO) software development. The framework comprises a process that uses a proposed reusability assessment metric and a formal foundation to specify the elements of the reused code and the relationships between them. The framework is empirically evaluated using a wide range of open-source projects and mobile applications. The results are analyzed to help understand the reusability proneness of OO software and the possible means to improve it.
Keywords: Software reusability, software metrics, object-oriented software, modularity, low complexity, understandability.
312 Active Power Flow Control Using A TCSC Based Backstepping Controller in Multimachine Power System
Authors: Naimi Abdelhamid, Othmane Abdelkhalek
Abstract:
With the current rise in the demand for electrical energy, present-day power systems, which are already large and complex, will continue to grow in both size and complexity. Flexible AC Transmission System (FACTS) controllers provide new facilities, both in steady-state power flow control and in dynamic stability control. The Thyristor Controlled Series Capacitor (TCSC) is a FACTS device used to control the flow of active power in electric power systems and to increase the capacity of transmission lines. In this paper, a Backstepping Power Flow Controller (BPFC) for a TCSC in a multimachine power system is developed and tested. The simulation results show that the proposed TCSC controller is capable of controlling the transmitted active power and improving transient stability when compared with a conventional PI Power Flow Controller (PIPFC).
Keywords: FACTS, Thyristor Controlled Series Capacitor (TCSC), Backstepping, BPFC, PIPFC.
311 A Calibration Approach towards Reducing ASM2d Parameter Subsets in Phosphorus Removal Processes
Authors: N. Boontian
Abstract:
A novel calibration approach that aims to reduce ASM2d parameter subsets and decrease the model complexity is presented. This approach does not impose a high computational demand, and it reduces the number of modeling parameters required to achieve ASM calibration by employing a sensitivity and iteration methodology. Parameter sensitivity is a crucial factor, and the iteration methodology enables refinement of the simulated parameter values. During the iteration process, parameter values are determined in descending order of their sensitivities, so the number of iterations required equals the number of model parameters in the parameter significance ranking. This approach was successfully applied to the ASM2d model to evaluate EBPR phosphorus removal. The simulation yielded calibrated values for YPAO, YPO4, YPHA, qPHA, qPP, μPAO, bPAO, bPP, bPHA, KPS, YA, μAUT, bAUT, KO2 AUT, and KNH4 AUT, and these values corresponded to the available experimental data.
Keywords: ASM2d, calibration approach, iteration methodology, sensitivity, phosphorus removal.
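A generic sketch of the sensitivity-and-iteration loop in Python, with a toy two-parameter model standing in for the ASM2d equations (all names and values are hypothetical):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def model(p, t):                     # toy stand-in for the ASM2d equations
    return p["rate"] * t * np.exp(-p["decay"] * t)

t = np.linspace(0, 10, 50)
observed = model({"rate": 2.0, "decay": 0.3}, t)     # synthetic "data"
params = {"rate": 1.0, "decay": 0.1}                 # initial guesses

def sse(p):                          # goodness-of-fit objective
    return float(np.sum((model(p, t) - observed) ** 2))

def sensitivity(name, p, eps=1e-3):  # finite-difference sensitivity
    q = dict(p)
    q[name] *= 1 + eps
    return abs(sse(q) - sse(p)) / eps

# Calibrate parameters one at a time, in descending order of sensitivity.
for name in sorted(params, key=lambda n: sensitivity(n, params), reverse=True):
    res = minimize_scalar(lambda v: sse({**params, name: v}),
                          bounds=(0.01, 10.0), method="bounded")
    params[name] = res.x

print(params)   # one pass per parameter; repeating the loop refines further
```

Fitting the most sensitive parameter first means the parameters with the greatest leverage on the objective are pinned down before the weakly identified ones.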
310 Comparison of Different Methods to Produce Fuzzy Tolerance Relations for Rainfall Data Classification in the Region of Central Greece
Authors: N. Samarinas, C. Evangelides, C. Vrekos
Abstract:
The aim of this paper is the comparison of three different methods of producing fuzzy tolerance relations for rainfall data classification: the correlation coefficient, cosine amplitude and max-min methods. The data were obtained from seven rainfall stations in the region of central Greece and refer to 20-year time series of average monthly rainfall heights. Each method expresses these data as a fuzzy tolerance relation, which is then transformed into an equivalence relation by max-min composition. From the equivalence relation, the rainfall stations were categorized and classified according to the degree of confidence. The classification shows the similarities among the rainfall stations; stations with high similarity can be used interchangeably in water resource management scenarios, or to augment data from one station with data from another. Due to the complexity of the calculations, it is important to find out which of the methods is computationally simpler and needs fewer compositions in order to give reliable results.
Keywords: Classification, fuzzy logic, tolerance relations, rainfall data.
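One of the three methods, cosine amplitude followed by max-min composition to transitivity, can be sketched as follows; the 4-station, 6-month matrix is toy data, not the Greek station records.

```python
import numpy as np

rain = np.array([[55, 60, 30, 10,  5, 40],    # station A, monthly means
                 [50, 58, 33, 12,  8, 37],    # station B
                 [90, 95, 70, 40, 30, 80],    # station C
                 [20, 25, 10,  2,  1, 15]])   # station D
n = len(rain)

# Cosine-amplitude tolerance relation: r_ij = |x_i . x_j| / (|x_i| |x_j|).
R = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        R[i, j] = abs(rain[i] @ rain[j]) / np.sqrt(
            (rain[i] @ rain[i]) * (rain[j] @ rain[j]))

def maxmin(A, B):
    # (A o B)[i, k] = max_j min(A[i, j], B[j, k])
    return np.max(np.minimum(A[:, :, None], B[None, :, :]), axis=1)

closure = R
while True:                           # compose until transitive (equivalence)
    nxt = maxmin(closure, closure)
    if np.allclose(nxt, closure):
        break
    closure = nxt

print((closure >= 0.99).astype(int))  # lambda-cut classification at 0.99
```

The number of compositions needed before the while-loop stabilizes is exactly the cost the abstract says should be compared across the three methods.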
309 Effect of Plasticizer Additives on the Mechanical Properties of Cement Composite – A Molecular Dynamics Analysis
Authors: R. Mohan, V. Jadhav, A. Ahmed, J. Rivas, A. Kelkar
Abstract:
Cementitious materials are an excellent example of a composite material with complex hierarchical and random features that range from the nanometer (nm) to the millimeter (mm) scale. Multi-scale modeling of such complex material systems requires starting from fundamental building blocks to capture the scale-relevant features through associated computational models. In this paper, molecular dynamics (MD) modeling is employed to predict the effect of a plasticizer additive on the mechanical properties of the key hydrated cement constituent, calcium-silicate-hydrate (CSH), at the molecular, nanometer-scale level. Because the exact molecular configuration of CSH is complex and still unknown, a widely accepted representative configuration based on the mineral jennite is employed. The effectiveness of molecular dynamics modeling in predicting the influence of material chemistry changes from molecular/nanoscale models is demonstrated.
Keywords: Cement composite, Mechanical Properties, Molecular Dynamics, Plasticizer additives.
308 Metaheuristic Algorithms for Decoding Binary Linear Codes
Authors: Hassan Berbia, Faissal Elbouanani, Rahal Romadi, Mostafa Belkasmi
Abstract:
This paper introduces two decoders for binary linear codes based on metaheuristics. The first uses a genetic algorithm, and the second is based on a combination of a genetic algorithm and a feed-forward neural network. The decoder based on genetic algorithms (DAG), applied to BCH and convolutional codes, gives good performance compared to the Chase-2 and Viterbi algorithms, respectively, and reaches the performance of OSD-3 for some Quadratic Residue (QR) codes. This algorithm is less complex for linear block codes of large block length; furthermore, its performance can be improved by tuning the decoder's parameters, in particular the number of individuals per population and the number of generations. In the second algorithm, the search space, in contrast to that of the DAG, which was limited to the codeword space, covers the whole binary vector space. It avoids a great number of coding operations by using a neural network, which greatly reduces the complexity of the decoder while maintaining comparable performance.
Keywords: Block code, decoding, metaheuristic, genetic algorithm, neural network.
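The genetic-decoding idea can be illustrated on the small (7,4) Hamming code: individuals are information words, and fitness is the Hamming distance between their codeword and the received vector. This is a minimal hard-decision sketch, not the paper's soft-decision DAG, and the population size and rates are arbitrary.

```python
import numpy as np

G = np.array([[1, 0, 0, 0, 1, 1, 0],          # generator matrix of the
              [0, 1, 0, 0, 1, 0, 1],          # (7,4) Hamming code
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

rng = np.random.default_rng(1)
sent = (np.array([1, 0, 1, 1]) @ G) % 2
received = sent.copy()
received[2] ^= 1                               # one channel error

def fitness(info):                             # distance to received vector
    return int(np.sum(((info @ G) % 2) != received))

pop = rng.integers(0, 2, (20, 4))
for _ in range(40):                            # generations
    pop = pop[np.argsort([fitness(ind) for ind in pop])]
    parents = pop[:10]                         # truncation selection
    cut = int(rng.integers(1, 4))              # one-point crossover
    kids = np.array([np.r_[parents[i, :cut], parents[(i + 1) % 10, cut:]]
                     for i in range(10)])
    kids = kids ^ (rng.random(kids.shape) < 0.05)   # bit-flip mutation
    pop = np.vstack([parents, kids])

best = min(pop, key=fitness)
print("decoded info bits:", best, "distance:", fitness(best))
```

With minimum distance 3, the transmitted information word is the unique individual at distance 1 from the received vector, so the search should recover [1, 0, 1, 1].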
307 High Performance in Parallel Data Integration: An Empirical Evaluation of the Ratio Between Processing Time and Number of Physical Nodes
Authors: Caspar von Seckendorff, Eldar Sultanow
Abstract:
Many studies have shown that parallelization decreases efficiency [1], [2]. There are many reasons for these decrements. This paper investigates those that appear in the context of parallel data integration. Integration processes generally cannot be allocated in packages of identical size (i.e., tasks of identical complexity); the reason for this is unknown, heterogeneous input data, which result in variable task lengths. Process delay is defined by the slowest processing node and has a detrimental effect on the total processing time. With a real-world example, this study will show that while process delay does initially increase with the introduction of more nodes, it ultimately decreases again after a certain point. The example makes use of the cloud computing platform Hadoop and is run inside Amazon's EC2 compute cloud. A stochastic model is set up which can explain this effect.
Keywords: Process delay, speedup, efficiency, parallel computing, data integration, E-Commerce, Amazon Elastic Compute Cloud (EC2), Hadoop, Nutch.
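The mechanism being measured can be made concrete with a small simulation: heterogeneous task lengths are dealt round-robin to n nodes, and the total time is set by the slowest node. The distribution and node counts are hypothetical, and this sketch only defines the delay quantity; it does not reproduce the paper's empirical curve.

```python
import numpy as np

rng = np.random.default_rng(7)
tasks = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)  # skewed task sizes

for n in (1, 2, 4, 8, 16, 32, 64):
    loads = np.array([tasks[i::n].sum() for i in range(n)])  # round-robin
    makespan = loads.max()            # total time = slowest node
    delay = makespan - loads.mean()   # time lost waiting on the slowest
    print(f"{n:3d} nodes: makespan {makespan:10.1f}, delay {delay:8.2f}")
```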
306 Toward an Open Network Business Approach
Authors: Valentina Ndou, Laura Schina, Giuseppina Passiante, Pasquale Del Vecchio, Marco De Maggio
Abstract:
The aim of this paper is to propose a dynamic integrated approach, based on the modularity concept and on the business ecosystem approach, that exploits different eBusiness services for SMEs under an open business network platform. The adoption of this approach enables firms to collaborate locally to deliver the best product or service to customers, as well as globally, by accessing international markets, interacting directly with customers, creating relationships, and collaborating with worldwide actors. The paper is structured as follows: we start by offering an overview of the state of the art of eBusiness platforms among SMEs in the food and tourism sectors, and then discuss the main drawbacks that characterize them. The digital business ecosystem approach and the modularity concept are described as the theoretical ground in which our proposed integrated model is rooted. Finally, the proposed model is presented, along with a discussion of the main value-creation potential it might offer SMEs.
Keywords: Complexity, digital business ecosystem, eBusiness platforms, modularity, networks.
305 Enhanced Shell Sorting Algorithm
Authors: Basit Shahzad, Muhammad Tanvir Afzal
Abstract:
Many algorithms are available for sorting unordered elements; among the most important are bubble sort, heap sort, insertion sort and Shell sort. These algorithms have their own pros and cons. Shell sort, which is an enhanced version of insertion sort, reduces the number of swaps of the elements being sorted to minimize complexity and time as compared to insertion sort. Shell sort improves the efficiency of insertion sort by quickly shifting values to their destination. Average sort time is O(n^1.25), while worst-case time is O(n^1.5). The algorithm performs a number of iterations; in each iteration it swaps some elements of the array in such a way that in the last iteration, when the value of h is one, the number of swaps is reduced. Donald L. Shell invented a formula to calculate the value of 'h'. This work focuses on identifying an improvement to the conventional Shell sort algorithm: the 'Enhanced Shell Sort' algorithm improves the way the value of 'h' is calculated. It has been observed that, by applying this algorithm, the number of swaps can be reduced by up to 60 percent compared to the existing algorithm. In some other cases this enhancement was found to be faster than the existing algorithms.
Keywords: Algorithm, computation, Shell, sorting.
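For reference, here is a baseline Shell sort in Python with the classic halving gap sequence; the paper's enhancement concerns the formula on the `gap` update line, which this sketch leaves in its conventional form.

```python
def shell_sort(a):
    a = list(a)
    gap = len(a) // 2
    while gap > 0:
        for i in range(gap, len(a)):          # gapped insertion sort
            val, j = a[i], i
            while j >= gap and a[j - gap] > val:
                a[j] = a[j - gap]             # shift element one gap right
                j -= gap
            a[j] = val
        gap //= 2                             # the 'h' formula the paper revisits
    return a

print(shell_sort([23, 5, 78, 1, 42, 9, 0, 64]))
```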
304 Comparison of Seismic Retrofitting Methods for Existing Foundations in Seismological Active Regions
Authors: Peyman Amini Motlagh, Ali Pak
Abstract:
Seismic retrofitting of important structures is essential in seismologically active zones. The importance is doubled when it comes to buildings such as schools, hospitals and bridges, because they are required to remain serviceable even after a major earthquake. Generally, seismic retrofitting codes have paid little attention to the retrofitting of foundations due to its construction complexity. In this paper, different methods for the seismic retrofitting of tall buildings' foundations are discussed and evaluated. Foundations are considered in three categories: first, foundations that are in danger of liquefaction of their underlying soil; second, foundations located on slopes in seismologically active regions; third, foundations designed according to former design codes, which may show structural defects under earthquake loads. After describing the methods used in different countries for retrofitting existing foundations in seismologically active regions, a comprehensive comparison between these methods with regard to the above-mentioned categories is carried out. This paper gives some guidelines for choosing the best method for the seismic retrofitting of tall buildings' foundations in retrofitting projects.
Keywords: Existing foundation, landslide, liquefaction, seismic retrofitting.
303 A Query Optimization Strategy for Autonomous Distributed Database Systems
Authors: Dina K. Badawy, Dina M. Ibrahim, Alsayed A. Sallam
Abstract:
A distributed database is a collection of logically related databases that cooperate in a transparent manner. Query processing, which uses a communication network for transmitting data between sites, is one of the challenges in the database world. The development of sophisticated query optimization technology is the reason for the commercial success of database systems, whose complexity and cost increase with the number of relations in the query. Mariposa, query trading, and query trading with processing task-trading are strategies developed for autonomous distributed database systems, but they cause a high optimization cost because all nodes are involved in generating an optimal plan. In this paper, we propose a modification of the autonomous strategy K-QTPT in which the seller nodes with the lowest cost gradually receive higher priorities, in order to reduce the optimization time. We implement the proposed strategy and present and analyze the resulting performance.
Keywords: Autonomous strategies, distributed database systems, high priority, query optimization.
302 Improved Processing Speed for Text Watermarking Algorithm in Color Images
Authors: Hamza A. Al-Sewadi, Akram N. A. Aldakari
Abstract:
Copyright protection and ownership proof of digital multimedia are achieved nowadays by digital watermarking techniques. A text watermarking algorithm for protecting the property rights and ownership judgment of color images is proposed in this paper. Embedding is achieved by inserting text elements randomly into the color image as noise. The YIQ image processing model is found to be faster than other image processing methods and is hence adopted for the embedding process. An optional choice of encrypting the text watermark before embedding is also suggested (in case required by some applications), where the text can be encrypted using any enciphering technique, adding more difficulty for attackers. Experiments resulted in an embedding speed of more than double that of other considered systems (such as the least-significant-bit method and separate-color-code methods), and a fairly acceptable level of peak signal-to-noise ratio (PSNR) with low mean square error values for watermarking purposes.
Keywords: Steganography, watermarking, private keys, time complexity measurements.
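A minimal sketch of the RGB-to-YIQ step the scheme builds on, plus a toy embedding that perturbs the Q channel at key-selected positions with the watermark bytes; the scaling and insertion rule are simplified stand-ins, not the paper's exact method.

```python
import numpy as np

RGB2YIQ = np.array([[0.299,  0.587,  0.114],
                    [0.596, -0.274, -0.322],
                    [0.211, -0.523,  0.312]])

rng = np.random.default_rng(99)                # stands in for the private key
img = rng.uniform(0, 1, (64, 64, 3))           # hypothetical RGB image
yiq = img @ RGB2YIQ.T                          # per-pixel YIQ conversion

text = "copyright 2024"
positions = rng.choice(64 * 64, size=len(text), replace=False)
flat_q = yiq[:, :, 2].reshape(-1)              # note: reshape copies the slice
flat_q[positions] += np.frombuffer(text.encode(), np.uint8) / 4096.0
yiq[:, :, 2] = flat_q.reshape(64, 64)          # watermark rides as tiny noise

watermarked = yiq @ np.linalg.inv(RGB2YIQ).T   # back to RGB
print(np.abs(watermarked - img).max())         # distortion stays small
```

Because the conversion is a single 3x3 matrix multiply per pixel, the YIQ route keeps the embedding cost low, which is the speed advantage the abstract reports.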
301 Magnetic Field Analysis for a Distribution Transformer with Unbalanced Load Conditions by using 3-D Finite Element Method
Authors: P. Meesuk, T. Kulworawanichpong, P. Pao-la-or
Abstract:
This paper proposes a quasi-static mathematical model of the magnetic fields caused by the high-voltage conductors of a distribution transformer, using a set of second-order partial differential equations. Modifications for complex magnetic field analysis and time-harmonic simulation are also utilized. In this research, transformers were studied under both balanced and unbalanced loading conditions. Computer-based simulation utilizing the three-dimensional finite element method (3-D FEM) is exploited as a tool for visualizing the magnetic field distribution throughout the volume of a distribution transformer. The finite element method (FEM) is a popular numerical method that is able to handle problem complexity in various forms, and at present it is widely applied in most engineering fields. Even for problems of magnetic field distribution, the FEM is able to estimate solutions of Maxwell's equations governing power transmission systems. The computer simulation based on the FEM has been developed in the MATLAB programming environment.
Keywords: Distribution transformer, magnetic field, load unbalance, 3-D Finite Element Method (3-D FEM).
300 Communication Design in Newspapers: A Comparative Study of Graphic Resources in Portuguese and Spanish Publications
Authors: Fátima Gonçalves, Joaquim Brigas, Jorge Gonçalves
Abstract:
As a way of managing the increasing volume and complexity of information in circulation today, graphical representations are increasingly used; through efficient communication design, they add meaning to the information presented in communication media. Visual culture itself, driven by technological evolution, has been redefining the forms of communication, so that contemporary visual communication has a major impact on society. This article presents the results, and a comparative analysis, of four publications in the Iberian press, focusing on the formal aspects of the newspapers and the space they dedicate to the various communication elements. Two Portuguese newspapers and two Spanish newspapers were selected for this purpose. The findings indicate that the newspapers show similarity in their use of graphic solutions, which corroborates a visual trend in communication design. The results also reveal that the Spanish newspapers are more meticulous about graphic consistency. This study is intended to contribute to improving knowledge of the Iberian generalist press.
Keywords: Communication design, graphic resources, Iberian Press, visual journalism.
299 SVID: Structured Vulnerability Intelligence for Building Deliberated Vulnerable Environment
Authors: Wenqing Fan, Yixuan Cheng, Wei Huang
Abstract:
The diversity and complexity of modern IT systems make it almost impossible for internal teams to find vulnerabilities in all software before the software is officially released. The emergence of threat intelligence and vulnerability reporting policies has greatly reduced the burden on software vendors and organizations of finding vulnerabilities. However, to prove the existence of a reported vulnerability, a security incident response team must build a Deliberated Vulnerable Environment (DVE) from a vulnerability report with limited and incomplete information, which is necessary but difficult. This paper presents a structured, standardized, machine-oriented vulnerability intelligence format that can be used to automate the orchestration of a DVE. The paper highlights the important role of software configuration and proof-of-vulnerability specifications in vulnerability intelligence, and proposes a triad model, called DIR (Dependency Configuration, Installation Configuration, Runtime Configuration), to define software configuration. Finally, a prototype system has been implemented to demonstrate that the orchestration of a DVE can be automated with this intelligence.
Keywords: DIR Triad Model, DVE, vulnerability intelligence, vulnerability recurrence.
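A hypothetical rendering of a DIR-structured SVID record follows; the field names are illustrative guesses at what such a machine-oriented entry could carry, not the paper's actual schema.

```python
# Hypothetical DIR-triad record; every field name is an illustrative guess.
svid_record = {
    "vulnerability_id": "CVE-XXXX-YYYY",        # placeholder identifier
    "software": {"name": "exampled", "version": "2.4.1"},
    "dir": {
        "dependency_configuration": {           # D: what must be present
            "os": "ubuntu:20.04",
            "packages": ["libexample >= 1.2"],
        },
        "installation_configuration": {         # I: how it is built/installed
            "build_flags": ["--enable-legacy-auth"],
            "config_files": {"/etc/exampled.conf": "auth_mode = legacy"},
        },
        "runtime_configuration": {              # R: how it must be running
            "listen_port": 8443,
            "env": {"EXAMPLED_DEBUG": "1"},
        },
    },
    "proof_of_vulnerability": {                 # what the orchestrator checks
        "trigger": "POST /login with oversized header",
        "expected_observation": "worker process crash in auth module",
    },
}
```

An orchestrator could consume such a record mechanically: satisfy D, apply I, start the target under R, then fire the trigger and compare against the expected observation.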
298 Modeling Engagement with Multimodal Multisensor Data: The Continuous Performance Test as an Objective Tool to Track Flow
Authors: Mohammad H. Taheri, David J. Brown, Nasser Sherkat
Abstract:
Engagement is one of the most important factors in determining successful outcomes and deep learning in students. Existing approaches to detecting student engagement involve periodic human observations that are subject to inter-rater reliability problems. Our solution uses real-time multimodal multisensor data, labeled by objective performance outcomes, to infer the engagement of students. The study involved four students with a combined diagnosis of cerebral palsy and a learning disability, who took part in a 3-month trial over 59 sessions. Multimodal multisensor data were collected while they participated in a continuous performance test. Eye gaze, electroencephalogram, body pose, and interaction data were used to create a model of student engagement through objective labeling from the continuous performance test outcomes. In order to achieve this, a new type of continuous performance test, the Seek-X type, is introduced. Nine features were extracted, including high-level handpicked compound features. Using leave-one-out cross-validation, a series of different machine learning approaches were evaluated. Overall, the random forest classification approach achieved the best results: 93.3% classification accuracy for engagement and 42.9% accuracy for disengagement. We compared these results to outcomes from different models: AdaBoost, decision tree, k-Nearest Neighbor, naïve Bayes, neural network, and support vector machine. We showed that using a multisensor approach achieved higher accuracy than using features from any reduced set of sensors, and we found that using high-level handpicked features can improve the classification accuracy in every sensor mode. Our approach is robust to both sensor fallout and occlusions. The single most important sensor feature for the classification of engagement and distraction was shown to be eye gaze. It has been shown that we can accurately predict the level of engagement of students with learning disabilities with a real-time approach that does not rely on human observation, inter-rater agreement, or a single mode of sensor input. This will help teachers design interventions for a heterogeneous group of students, where teachers cannot possibly attend to each of their individual needs. Our approach can be used to identify those with the greatest learning challenges, so that all students are supported to reach their full potential.
Keywords: Affective computing in education, affect detection, continuous performance test, engagement, flow, HCI, interaction, learning disabilities, machine learning, multimodal, multisensor, physiological sensors, Signal Detection Theory, student engagement.
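The evaluation pipeline described maps directly onto a few lines of scikit-learn; the synthetic nine-feature data below merely stand in for the gaze/EEG/pose features of the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(3)
X = rng.normal(size=(120, 9))                 # 9 features, as in the study
y = ((X[:, 0] + 0.5 * X[:, 3]                 # synthetic engagement signal
      + rng.normal(scale=0.5, size=120)) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())   # leave-one-out CV
print(f"LOOCV accuracy: {scores.mean():.3f}")

# Feature importances indicate the dominant input (cf. eye gaze being the
# single most informative feature in the study).
clf.fit(X, y)
print(np.argsort(clf.feature_importances_)[::-1])
```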
297 Improving the Security of Internet of Things Using Encryption Algorithms
Authors: Amirhossein Safi
Abstract:
The Internet of Things (IoT) is a kind of advanced information technology that has drawn society's attention. Sensors and stimulators are usually recognized as the smart devices of our environment. At the same time, IoT security brings up new issues. Internet connectivity and the possibility of interaction with smart devices cause those devices to become more involved in human life. Therefore, security is a fundamental requirement in designing the IoT. The IoT has three remarkable features: overall perception, reliable transmission, and intelligent processing. Because of the IoT's span, the security of the conveyed data is an essential factor for system security. Hybrid encryption is a model that can be used in the IoT; this type of encryption provides strong security with low computation. In this paper, we propose a hybrid encryption algorithm designed to reduce security risks, increase encryption speed, and lower computational complexity. The purpose of this hybrid algorithm is to provide information integrity, confidentiality, and non-repudiation in data exchange for the IoT. Finally, the suggested encryption algorithm was simulated in MATLAB, and its speed and security efficiency were evaluated in comparison with a conventional encryption algorithm.
Keywords: Internet of things, security, hybrid algorithm, privacy.
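A minimal sketch of the hybrid pattern using stock primitives from Python's cryptography package (RSA-OAEP wrapping a symmetric Fernet key); this illustrates the structure, not the authors' algorithm.

```python
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.fernet import Fernet

# Hybrid scheme: a fast symmetric cipher for the payload, an asymmetric
# cipher for the session key.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Sender: encrypt the sensor payload symmetrically, then wrap the key.
session_key = Fernet.generate_key()
ciphertext = Fernet(session_key).encrypt(b"temperature=21.7;door=closed")
wrapped_key = public_key.encrypt(session_key, oaep)

# Receiver: unwrap the session key, then decrypt the payload.
recovered = Fernet(private_key.decrypt(wrapped_key, oaep)).decrypt(ciphertext)
assert recovered == b"temperature=21.7;door=closed"
```

Only the short session key passes through the expensive asymmetric operation, which is why the hybrid structure suits computation-limited IoT nodes.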
296 A Fast Sensor Relocation Algorithm in Wireless Sensor Networks
Authors: Yu-Chen Kuo, Shih-Chieh Lin
Abstract:
Sensor relocation repairs coverage holes caused by node failures. One way to repair coverage holes is to find redundant nodes to replace faulty nodes. Most studies have taken a long time to find redundant nodes, since redundant nodes are randomly scattered around the sensing field. To record the precise positions of sensor nodes, most studies have assumed that GPS is installed in the sensor nodes; however, the high cost and power consumption of GPS are heavy burdens for sensor nodes. Thus, we propose a fast sensor relocation algorithm that arranges redundant nodes to form redundant walls without GPS. Redundant walls are constructed at the position where the average distance to each sensor node is shortest, and they guide sensor nodes to find redundant nodes in the minimum time. Simulation results show that our algorithm can find the proper redundant node in the minimum time and reduce the relocation time with low message complexity.
Keywords: Coverage, distributed algorithm, sensor relocation, wireless sensor networks.
295 Design of Cooperative Processes of Innovation
Authors: Suzanne Yaganeh, Janni Nielsen, Leif Bloch Rasmussen
Abstract:
This paper invites dialogue and reflection on innovation and entrepreneurship by presenting concepts of innovation leading to the introduction of a complex theoretical framework: Cooperative Innovation (CO-IN). CO-IN is a didactic model, drawing on a Scandinavian tradition, that enhances and scaffolds the cooperative processes that create innovation. It is based on a cross-sectorial and multidisciplinary approach. We introduce the concept of complementarity to help capture the validity of diversity, and we suggest the concept of "the space in between" to understand the creation of identity as a collective mind. We see dialogue and the use of multimodal techniques as essential tools for conceptualization, offering the possibility of clarifying complexity and diversity and leading to decision-making based on knowledge as commons. We introduce the didactic design and present our empirical findings from an innovation workshop in Argentina. In a final section, we reflect on the design as a support for the development of common ground, a collective mind and collective action, and for the creation of knowledge as commons, to facilitate innovation and entrepreneurship.
Keywords: CO-operative innovation, didactic design, dialogue, ICT.
294 Two New Low Power High Performance Full Adders with Minimum Gates
Authors: M. Hosseinghadiry, H. Mohammadi, M. Nadisenejani
Abstract:
With increasing circuit complexity and the demand for portable devices, power consumption is one of the most important parameters these days. Full adders are the basic building block of many circuits, so reducing power consumption in full adders is very important in low-power circuits. One of the most power-consuming modules in a full adder is the XOR/XNOR circuit. This paper presents two new full adders based on two new logic approaches, each of which uses a single XOR or XNOR gate to implement a full adder cell, thereby decreasing delay and power. Using the two new approaches and the two XOR and XNOR gates, two new full adders have been implemented in this paper. Simulations are carried out with HSPICE in 0.18 μm bulk technology with a 1.8 V supply voltage. The results show that the proposed ten-transistor full adder has 12% less power consumption and is 5% faster in comparison to the MB12T full adder. The 9T design is more area-efficient and is 24% better than a similar 10T full adder in terms of power consumption. The main drawback of the proposed circuits is the output threshold loss problem.
Keywords: Full adder, XNOR, low power, high performance, very large scale integrated circuit.
293 Investigating the Application of Social Sustainability: A Case Study in the Egyptian Retailing Sector
Authors: Lobna Hafez, Eman Elakkad
Abstract:
Sustainability is no longer a choice for firms. To achieve a sustainable supply chain, all three dimensions of sustainability should be considered. Unlike the economic and environmental aspects, social sustainability has rarely been given attention. The problem surrounding social sustainability and employees' welfare in Egypt is complex and remains unsolved. The aim of this study is to qualitatively assess the current level of application of social sustainability in the retailing sector in Egypt, using the social sustainability indicators identified in the literature. The purpose of this investigation is to gain knowledge about the complexity of the system involved. A case study was conducted on one of the largest retailers in Egypt. Data were collected through semi-structured interviews with managers and employees to determine the level of application and to identify the major obstacles affecting social sustainability in the retailing context. The work gives insight into the details and complexities of applying social sustainability in developing countries from the retailing perspective. The outcomes of this study will help managers understand the enablers of social sustainability and direct them to methods of sound implementation.
Keywords: Egypt, retailing sector, social sustainability, sustainability.