Search results for: large airplane
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6835

6685 Bubble Scrum: How to Run in Organizations That Only Know How to Walk

Authors: Zaheer A. Ali, George Szabo

Abstract:

SCRUM has roots in software and web development and works very well in that space. However, any technical person who has watched a typical waterfall-managed project spiral out of control or into an abyss has thought: "there must be a better way". I will discuss how that thought leads naturally to adopting Agile principles and SCRUM, as well as how Agile and SCRUM can be implemented in large institutions with long histories via a method I developed: Bubble Scrum. We will also see how SCRUM can be implemented in interesting places outside of the technical sphere and discuss where and how to subtly bring Agility and SCRUM into large, rigid institutions.

Keywords: agile, enterprise-agile, agile at scale, agile transition, project management, scrum

Procedia PDF Downloads 129
6684 Power Generation through Water Vapour: An Approach of Using Sea/River/Lake Water as Renewable Energy Source

Authors: Riad

Abstract:

As the present world needs more and more energy at low cost, the optimal way of power generation has to be found. In the sense of low cost, renewable energy is one of the greatest sources of power generation. Water vapour from sea/river/lake water can be used for power generation by using the greenhouse effect in a large, flat water chamber floating on the water surface. The water chamber will always be kept half filled. When water evaporates under sunlight, the high-pressure gaseous water will be stored in the chamber. By passing it through a pipe and using aerodynamics, it can be used for power generation. The water level of the chamber is controlled by some means. As a large amount of water evaporates, an estimate can be highlighted: approximately 3 to 4 thousand gallons of water evaporates per acre of surface (this amount will be greater under the greenhouse effect). This large amount of gaseous water can be utilized for power generation by passing it through a pipe. This method can be a source of power generation.
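Purely as an illustrative back-of-envelope check of the scale implied by the quoted evaporation figure (and not the authors' pressure-and-aerodynamics conversion method), the sketch below converts 3 to 4 thousand US gallons per acre into an evaporated mass and the latent heat it represents; the abstract does not state the time period over which this evaporation occurs, so no power figure is derived.

```python
# Back-of-envelope scale check for the quoted evaporation figure.
# Illustrative only: converts gallons per acre into mass and latent heat energy.

GALLON_L = 3.785          # litres per US gallon
WATER_DENSITY = 1.0       # kg per litre (approx.)
LATENT_HEAT = 2.26e6      # J per kg, latent heat of vaporization of water

def evaporation_energy(gallons_per_acre: float):
    """Return (evaporated mass in kg, equivalent latent heat in J) per acre."""
    mass_kg = gallons_per_acre * GALLON_L * WATER_DENSITY
    return mass_kg, mass_kg * LATENT_HEAT

for gallons in (3000, 4000):
    mass, energy = evaporation_energy(gallons)
    print(f"{gallons} gal/acre -> {mass:.0f} kg of vapour, "
          f"~{energy / 1e9:.1f} GJ of latent heat")
```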

Keywords: renewable energy, greenhouse effect, water chamber, water vapour

Procedia PDF Downloads 322
6683 Stomach Perforation due to Chronic External Pressure

Authors: Angelis P. Barlampas

Abstract:

PURPOSE: The purpose of this paper is to demonstrate the important role of taking an appropriate and detailed history in order to reach the best possible diagnostic conclusion. MATERIAL: A patient presented to the emergency department due to the sudden onset of continuous abdominal pain during the last hour, with the clinical symptoms of an acute abdomen. During the clinical examination, signs of peritoneal irritation and diffuse abdominal tenderness were found. The rest of the clinical and laboratory tests did not reveal anything important. From the reported medical history, nothing of note was found, except for the report of a large liver cyst, for which he had been advised not to take any further action apart from regular ultrasound examination. METHOD: A computed tomography examination was performed after per os administration of gastrografin, which revealed a hyperdense ascitic effusion, similar in density to that of the gastrografin within the intestinal tract. The presence of a large cyst of the left hepatic lobe was confirmed, contacting and pushing against the stomach. In the area of contact between the liver cyst and the pylorus, there were extraluminal air bubbles and local opacity of the peritoneal fat, with a small hyperdense effusion. RESULT: The above, as well as the absence of a history of stomach ulcer, recent trauma, or other pathology, argue in favor of acute pyloric perforation due to mural necrosis, in response to chronic external pressure from the pre-existing large liver cyst.

Keywords: perforation, stomach, large liver cyst, CT abdomen, acute abdominal pain, intraperitoneal leakage, contrast leakage

Procedia PDF Downloads 52
6682 Developing NAND Flash-Memory SSD-Based File System Design

Authors: Jaechun No

Abstract:

This paper focuses on I/O optimizations of N-hybrid (New-Form of hybrid), which provides a hybrid file system space constructed on SSD and HDD. Although the promising potential of SSDs, such as the absence of mechanical moving overhead and high random I/O throughput, has drawn a lot of attention from IT enterprises, their high cost-to-capacity ratio makes it less desirable to build a large-scale data storage subsystem composed of only SSDs. In this paper, we present N-hybrid, which attempts to integrate the strengths of SSD and HDD to offer a single, large hybrid file system space. Several experiments were conducted to verify the performance of N-hybrid.
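The abstract does not detail how N-hybrid decides which data reside on SSD versus HDD, so the sketch below only illustrates the general idea behind such hybrid placement: send small or frequently accessed (hot) files to the SSD tier and large, cold files to the HDD tier. The tier paths, thresholds, and the `place_file` helper are hypothetical, not part of N-hybrid.

```python
import os
import shutil

# Hypothetical tier mount points and placement thresholds (illustrative only).
SSD_DIR = "/mnt/ssd_tier"
HDD_DIR = "/mnt/hdd_tier"
HOT_SIZE_LIMIT = 4 * 1024 * 1024   # files under 4 MiB go to the SSD tier
HOT_ACCESS_COUNT = 100             # or files accessed at least this often

def place_file(path: str, access_count: int) -> str:
    """Copy a file into the SSD or HDD tier based on size and access frequency."""
    size = os.path.getsize(path)
    hot = size < HOT_SIZE_LIMIT or access_count >= HOT_ACCESS_COUNT
    target_dir = SSD_DIR if hot else HDD_DIR
    target = os.path.join(target_dir, os.path.basename(path))
    shutil.copy2(path, target)
    return target
```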

Keywords: SSD, data section, I/O optimizations, hybrid system

Procedia PDF Downloads 388
6681 Pressure Gradient Prediction of Oil-Water Two Phase Flow through Horizontal Pipe

Authors: Ahmed I. Raheem

Abstract:

In this thesis, stratified and stratified wavy flow regimes have been investigated numerically for oil (1.57 mPa·s viscosity and 780 kg/m³ density) and water two-phase flow in small and large horizontal steel pipes with diameters between 0.0254 and 0.508 m, using the ANSYS Fluent software. The Volume of Fluid (VOF) approach is used for the two-phase flow, together with two-equation family turbulence models (Realizable k-ε).
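The study itself is a CFD (VOF) investigation; purely for orientation, the sketch below uses a much simpler homogeneous-mixture correlation (volume-weighted properties with a Blasius friction factor) to estimate a frictional pressure gradient for an oil-water mixture in a horizontal pipe, reusing the oil properties quoted above. The water properties, the water-cut weighting, and the correlation itself are illustrative assumptions, not the paper's model.

```python
# Rough frictional pressure gradient for an oil-water mixture (homogeneous model).
# Illustrative only; the paper itself uses ANSYS Fluent VOF simulations.

RHO_OIL, MU_OIL = 780.0, 1.57e-3       # kg/m^3, Pa.s (from the abstract)
RHO_WATER, MU_WATER = 998.0, 1.0e-3    # typical water properties (assumed)

def pressure_gradient(velocity: float, diameter: float, water_cut: float) -> float:
    """Return dP/dx in Pa/m using volume-weighted properties and Blasius friction."""
    rho = water_cut * RHO_WATER + (1 - water_cut) * RHO_OIL
    mu = water_cut * MU_WATER + (1 - water_cut) * MU_OIL
    re = rho * velocity * diameter / mu
    f = 0.316 * re ** -0.25 if re > 4000 else 64.0 / re   # Darcy friction factor
    return f * rho * velocity ** 2 / (2 * diameter)

print(f"{pressure_gradient(1.0, 0.0254, 0.5):.0f} Pa/m in the 0.0254 m pipe")
```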

Keywords: CFD, two-phase flow, pressure gradient, volume of fluid, large diameter, horizontal pipe, oil-water stratified and stratified wavy flow

Procedia PDF Downloads 403
6680 SC-LSH: An Efficient Indexing Method for Approximate Similarity Search in High Dimensional Space

Authors: Sanaa Chafik, Imane Daoudi, Mounim A. El Yacoubi, Hamid El Ouardi

Abstract:

Locality Sensitive Hashing (LSH) is one of the most promising techniques for solving the nearest neighbour search problem in high dimensional space. Euclidean LSH is the most popular variation of LSH and has been successfully applied in many multimedia applications. However, Euclidean LSH presents limitations that affect structure and query performance. Its main limitation is large memory consumption: in order to achieve good accuracy, a large number of hash tables is required. In this paper, we propose a new hashing algorithm to overcome the storage space problem and improve query time, while keeping accuracy similar to that achieved by the original Euclidean LSH. Experimental results on a real large-scale dataset show that the proposed approach achieves good performance and consumes less memory than Euclidean LSH.
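For readers unfamiliar with the baseline being improved upon, the sketch below shows the standard Euclidean (p-stable) LSH family, h(v) = floor((a·v + b)/w), and why many hash tables inflate memory: every indexed point is stored once per table. It illustrates plain Euclidean LSH, not the SC-LSH scheme proposed here, and the parameter values are arbitrary.

```python
import numpy as np
from collections import defaultdict

class EuclideanLSH:
    """Plain Euclidean (p-stable) LSH: h(v) = floor((a.v + b) / w)."""

    def __init__(self, dim, n_tables=10, n_bits=8, w=4.0, seed=0):
        rng = np.random.default_rng(seed)
        # One set of random projections per hash table.
        self.a = rng.normal(size=(n_tables, n_bits, dim))
        self.b = rng.uniform(0.0, w, size=(n_tables, n_bits))
        self.w = w
        self.tables = [defaultdict(list) for _ in range(n_tables)]

    def _keys(self, v):
        return [tuple(np.floor((A @ v + b) / self.w).astype(int))
                for A, b in zip(self.a, self.b)]

    def index(self, points):
        self.points = np.asarray(points)
        for i, v in enumerate(self.points):   # every point is stored in every table
            for table, key in zip(self.tables, self._keys(v)):
                table[key].append(i)

    def query(self, q, k=5):
        candidates = {i for table, key in zip(self.tables, self._keys(q))
                      for i in table.get(key, [])}
        if not candidates:
            return []
        ids = np.array(sorted(candidates))
        dists = np.linalg.norm(self.points[ids] - q, axis=1)
        return ids[np.argsort(dists)[:k]].tolist()
```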

Keywords: approximate nearest neighbor search, content based image retrieval (CBIR), curse of dimensionality, locality sensitive hashing, multidimensional indexing, scalability

Procedia PDF Downloads 302
6679 Error Probability of Multi-User Detection Techniques

Authors: Komal Babbar

Abstract:

Multiuser detection is the intelligent estimation/demodulation of transmitted bits in the presence of multiple access interference (MAI). The authors present the bit error rate (BER) achieved by linear multi-user detectors: the matched filter (which treats the MAI as AWGN), the decorrelating detector, and the MMSE detector. In this work, the authors investigate the bit error probability analysis for the matched filter, decorrelating, and MMSE detectors. This problem arises in several practical CDMA applications where the receiver may not have full knowledge of the number of active users and their signature sequences. In particular, the behavior of MAI at the output of the multi-user detectors (MUD) is examined under various asymptotic conditions, including large signal-to-noise ratio, large near-far ratios, and a large number of users. In the last section, the authors also show MATLAB simulation results for the multiuser detection techniques, i.e., matched filter, decorrelating, and MMSE, for 2 users and 10 users.
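As a minimal sketch of the three linear detectors compared in the paper (in a synchronous CDMA model y = S b + n, with S the signature matrix), the snippet below forms the matched-filter, decorrelating, and MMSE estimates; the spreading length, user count, and noise level are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(1)
N, K, sigma = 16, 4, 0.3                    # spreading length, users, noise std (illustrative)

S = rng.choice([-1.0, 1.0], size=(N, K)) / np.sqrt(N)   # signature waveforms
b = rng.choice([-1.0, 1.0], size=K)                      # transmitted BPSK bits
y = S @ b + sigma * rng.normal(size=N)                   # received chip vector

R = S.T @ S                                               # cross-correlation matrix
mf   = np.sign(S.T @ y)                                   # matched filter (treats MAI as noise)
dec  = np.sign(np.linalg.solve(R, S.T @ y))               # decorrelating detector
mmse = np.sign(np.linalg.solve(R + sigma**2 * np.eye(K), S.T @ y))  # MMSE detector

print("sent   :", b.astype(int))
print("MF     :", mf.astype(int))
print("Decorr :", dec.astype(int))
print("MMSE   :", mmse.astype(int))
```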

Keywords: code division multiple access, decorrelating, matched filter, minimum mean square detection (MMSE) detection, multiple access interference (MAI), multiuser detection (MUD)

Procedia PDF Downloads 495
6678 A Comparison of Bias Among Relaxed Divisor Methods Using 3 Bias Measurements

Authors: Sumachaya Harnsukworapanich, Tetsuo Ichimori

Abstract:

The apportionment method is used by many countries to calculate the distribution of seats in political bodies. For example, this method is used in the United States (U.S.) to distribute House seats proportionally based on the population of the electoral district. Famous apportionment methods include the divisor methods called the Adams method, Dean method, Hill method, Jefferson method, and Webster method. Sometimes the results from the implementation of these divisor methods are unfair and include errors. Therefore, it is important to examine the optimization of this method by using a bias measurement to obtain precise and fair results. In this research, we investigate the bias of divisor methods in the U.S. House of Representatives toward large and small states by applying the Stolarsky mean method. We compare the bias of the apportionment methods by using two famous bias measurements: the Balinski and Young measurement and the Ernst measurement. Both measurements have a formula for large and small states. The third measurement, however, which was created by the researchers, did not factor the element of large and small states into the formula. All three measurements are compared, and the results show that our measurement produces results similar to the other two famous measurements.
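For context, a divisor method assigns seats one at a time to the state with the highest priority value population / d(s), where s is the number of seats already assigned and d(s) is the method's divisor sequence (e.g., Jefferson d(s) = s + 1, Webster d(s) = s + 0.5, Hill d(s) = sqrt(s(s + 1))). The sketch below is a generic implementation of that rule with made-up populations; it is not the bias-measurement procedure of the paper.

```python
import heapq
import math

DIVISORS = {
    "adams":     lambda s: s if s > 0 else 1e-9,                          # d(s) = s; every state gets a first seat
    "dean":      lambda s: s * (s + 1) / (s + 0.5) if s > 0 else 1e-9,    # harmonic mean of s and s+1
    "hill":      lambda s: math.sqrt(s * (s + 1)) if s > 0 else 1e-9,     # geometric mean of s and s+1
    "webster":   lambda s: s + 0.5,
    "jefferson": lambda s: s + 1,
}

def apportion(populations, total_seats, method="webster"):
    """Assign seats one at a time to the state with the highest priority pop / d(s)."""
    d = DIVISORS[method]
    seats = {state: 0 for state in populations}
    heap = [(-pop / d(0), state) for state, pop in populations.items()]
    heapq.heapify(heap)
    for _ in range(total_seats):
        _, state = heapq.heappop(heap)
        seats[state] += 1
        heapq.heappush(heap, (-populations[state] / d(seats[state]), state))
    return seats

# Hypothetical populations, purely for illustration.
print(apportion({"A": 5_300_000, "B": 2_100_000, "C": 640_000}, 10, "webster"))
```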

Keywords: apportionment, bias, divisor, fair, measurement

Procedia PDF Downloads 336
6677 Cloud Design for Storing Large Amount of Data

Authors: M. Strémy, P. Závacký, P. Cuninka, M. Juhás

Abstract:

The main goal of this paper is to introduce our design of a private cloud for storing large amounts of data, especially pictures, and to provide a good technological backend for data analysis based on parallel processing and business intelligence. We have tested hypervisors, cloud management tools, storage for all the data, and Hadoop to provide analysis of unstructured data. Providing high availability, virtual network management, logical separation of projects, and rapid deployment of physical servers to our environment was also needed.

Keywords: cloud, glusterfs, hadoop, juju, kvm, maas, openstack, virtualization

Procedia PDF Downloads 328
6676 The Threshold Values of Soil Water Index for Landslides on Country Road No.89

Authors: Ji-Yuan Lin, Yu-Ming Liou, Yi-Ting Chen, Chen-Syuan Lin

Abstract:

The soil water index obtained from a tank model is now commonly used in soil and sand disaster alarm systems in Japan. Compared with the rainfall-triggering indices used in Taiwan, the tank model makes it easier to predict the slope water content for large-scale landslides. Therefore, this study aims to estimate the threshold values for large-scale landslides using the soil water index. Sixteen typhoon and heavy rainfall events were selected to establish the relationship between landslide events and the soil water index. Finally, the proposed threshold values for landslides on country road No. 89 are presented in this study. The results show that 95% of landslide cases occurred when the soil water index exceeded 125 mm, and 30% of the more serious slope failures occurred when the soil water index was greater than 250 mm. Besides, this study speculates that when the soil water index exceeds 250 mm and the difference between the second and third tanks is less than -25 mm, a large-scale landslide becomes more probable.
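The tank model referred to here is a series of storages in which rainfall fills the top tank, each tank drains through side outlets and infiltrates into the tank below, and the soil water index is the sum of the storages. The sketch below is a generic three-tank implementation; the outlet heights and coefficients are commonly cited values used here only for illustration, not the calibration of this study.

```python
# Generic three-tank model; the soil water index (SWI) is the sum of the storages.
# Coefficients and outlet heights are illustrative, not the study's calibration.

PARAMS = [
    # ({outlet height in mm: outflow coefficient}, infiltration coefficient)
    ({15.0: 0.10, 60.0: 0.15}, 0.12),   # first (top) tank
    ({15.0: 0.05}, 0.05),               # second tank
    ({15.0: 0.01}, 0.01),               # third (bottom) tank
]

def soil_water_index(hourly_rain_mm):
    """Run the tank model over an hourly rainfall series; return the SWI per hour."""
    storage = [0.0, 0.0, 0.0]
    swi_series = []
    for rain in hourly_rain_mm:
        storage[0] += rain
        for i, (outlets, infil_coef) in enumerate(PARAMS):
            runoff = sum(coef * max(storage[i] - height, 0.0)
                         for height, coef in outlets.items())
            infiltration = infil_coef * storage[i]
            storage[i] -= runoff + infiltration
            if i + 1 < len(storage):
                storage[i + 1] += infiltration
        swi_series.append(sum(storage))
    return swi_series

print(soil_water_index([10, 30, 50, 20, 0, 0])[-1])
```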

Keywords: soil water index, tank model, landslide, threshold values

Procedia PDF Downloads 355
6675 3D Stereoscopic Measurements from AR Drone Squadron

Authors: R. Schurig, T. Désesquelles, A. Dumont, E. Lefranc, A. Lux

Abstract:

A cost-efficient alternative is proposed to the use of a single drone carrying multiple cameras in order to take stereoscopic images and videos during its flight. Such a drone has to be large enough to take off with its equipment and stable enough to make valid measurements. Corresponding performance for a single aircraft usually comes at a large cost. The proposed solution consists of using multiple smaller and cheaper aircraft, each carrying one camera, instead of a single expensive one. As a proof of concept, AR drones, quad-rotor UAVs from Parrot Inc., are used experimentally.

Keywords: drone squadron, flight control, rotorcraft, Unmanned Aerial Vehicle (UAV), AR drone, stereoscopic vision

Procedia PDF Downloads 439
6674 Mechanical Properties and Microstructural Analyses of Epoxy Resins Reinforced with Satin Tissue

Authors: Băilă Diana Irinel, Păcurar Răzvan, Păcurar Ancuța

Abstract:

Although the volume of fibre-reinforced polymer composites (FRPs) used for aircraft applications is a relatively small percentage of total use, these materials often find their most sophisticated applications in this industry. In aerospace, the performance criteria placed upon materials can be far greater than in other areas; key aspects are light weight, high strength, high stiffness, and good fatigue resistance. Composites were first used by the military before the technology was applied to commercial planes. Nowadays, composites are widely used, and this has been the result of a gradual direct substitution of metal components followed by the development of integrated composite designs as confidence in FRPs has increased. The airplane uses a range of components made from composites, including the fin and tailplane. In recent years, composite materials have been increasingly used in automotive applications due to the improvement of material properties. In the aerospace and automotive sector, fuel consumption is proportional to the weight of the body of the vehicle. A minimum of 20% of the cost can be saved if polymer composites are used in place of metal structures, and the operating and maintenance costs are also very low. Glass fiber-epoxy composites are widely used in the making of aircraft and automobile body parts and are not limited to these fields but are also used in shipbuilding, structural applications in civil engineering, pipes for the transport of liquids, and electrical insulators in reactors. This article establishes the high performance of a glass-epoxy composite material used in the automotive and aeronautic domains by means of tensile and flexural tests and SEM analyses.

Keywords: glass-epoxy composite, traction and flexion tests, SEM analysis, acoustic emission (AE) signals

Procedia PDF Downloads 73
6673 A Comparative Assessment of the Food Supply Vulnerability to Large-Scale Disasters in OECD Countries

Authors: Karolin Bauer, Anna Brinkmann

Abstract:

Vulnerabilities in critical infrastructure can cause significant difficulties for the affected population during crises. Securing the food supply, as part of the critical infrastructure, in crisis situations is an essential part of public services and a cornerstone of a successful concept of civil protection. In most industrialized countries, there are currently no comparative studies regarding the food supply of the population during crisis and disaster events. In order to mitigate the potential impact of major disasters in Germany, it is absolutely necessary to investigate how the food supply can be secured. The research project aims to provide in-depth research on the experiences gathered during past large-scale disasters in the 34 OECD member countries in order to discover alternatives for an updated civil protection system in Germany. The basic research question is: "Which international approaches and structures of civil protection have been proven and would be useful for modernizing German civil protection with regard to critical infrastructure and the food supply?" Research findings are to be extracted from an extensive literature review covering the entire research period as well as from personal and online-based interviews with experts and responsible persons from the institutions involved. The strength of the research project lies in the deliberate choice to investigate previous large-scale disasters in order to formulate important and practical approaches to modernizing civil protection in Germany.

Keywords: food supply, vulnerability, critical infrastructure, large-scale disaster

Procedia PDF Downloads 312
6672 An Automatic Large Classroom Attendance Conceptual Model Using Face Counting

Authors: Sirajdin Olagoke Adeshina, Haidi Ibrahim, Akeem Salawu

Abstract:

Large lecture theatres cannot be covered by a single camera but rather require a multi-camera setup because of their size, shape, and seating arrangements, although an ordinary classroom can be captured with a single camera. Therefore, the design and implementation of a multi-camera setup for a large lecture hall were considered. Researchers have emphasized the impact of class attendance on the academic performance of students. However, the traditional method of taking attendance falls short, especially for large lecture theatres, because of the student population, the time required, the sophistication and exhaustiveness involved, and the possibility of manipulation. An automated large-classroom attendance system is, therefore, imperative. The common approach in such systems is face detection and recognition, where known student faces are captured and stored for recognition purposes. This approach requires constant face database updates due to constant changes in facial features. Alternatively, face counting can be performed by cropping the localized faces in the video or image into a folder and then counting them. This research aims to develop a face-localization-based approach to detect student faces in classroom images captured using a multi-camera setup. A selected Haar-like feature cascade face detector, trained with an asymmetric goal to minimize the False Rejection Rate (FRR) relative to the False Acceptance Rate (FAR), was applied on a Raspberry Pi 4B. A relationship between the two factors (FRR and FAR) was established using a constant (λ) as a trade-off between them for automatic adjustment during training. An evaluation of the proposed approach and the conventional AdaBoost on classroom datasets shows an improvement of 8% in TPR (the result of a low FRR) and a 7% reduction of the FRR. The average learning speed of the proposed approach was improved, with an execution time of 1.19 s per image compared to 2.38 s for the improved AdaBoost. Consequently, the proposed approach achieved 97% TPR with an overhead constraint time of 22.9 s, compared to 46.7 s for the improved AdaBoost, when evaluated on images obtained from a large lecture hall (DK5) at USM.
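A minimal sketch of the face-counting idea is shown below using OpenCV's stock frontal-face Haar cascade; it simply counts detections per camera image and sums them across the multi-camera setup. It is not the asymmetrically trained cascade described in the abstract, and the image paths and detection parameters are illustrative.

```python
import cv2

# Stock OpenCV frontal-face Haar cascade (not the asymmetrically trained one in the paper).
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def count_faces(image_path: str) -> int:
    """Detect and count faces in one classroom image."""
    gray = cv2.cvtColor(cv2.imread(image_path), cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5,
                                     minSize=(24, 24))
    return len(faces)

# Hypothetical images from a multi-camera setup covering one lecture hall.
camera_images = ["cam_front.jpg", "cam_mid.jpg", "cam_back.jpg"]
attendance = sum(count_faces(p) for p in camera_images)
print(f"Estimated attendance: {attendance}")
```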

Keywords: automatic attendance, face detection, haar-like cascade, manual attendance

Procedia PDF Downloads 48
6671 Social Media and the Future of Veganism Influence on Gender Norms

Authors: Athena Johnson

Abstract:

Veganism has seen a rapid increase in followers over recent years. Understanding the mechanisms of social change associated with these dietary practices in relation to gender is significant: these groups may seem small, but they have a large impact, as they influence many people and change the food market. The methodology of this research article is primarily an in-depth literature review combined with empirical research. The findings show that the popularity of veganism is growing, in large part due to the extensive use of social media, which dispels longstanding gendered connotations of food, such as the association between meat and masculinity.

Keywords: diversity, gender roles, social media, veganism

Procedia PDF Downloads 85
6670 EduEasy: Smart Learning Assistant System

Authors: A. Karunasena, P. Bandara, J. A. T. P. Jayasuriya, P. D. Gallage, J. M. S. D. Jayasundara, L. A. P. Y. P. Nuwanjaya

Abstract:

The use of smart learning concepts as better teaching and learning methods has increased rapidly all over the world recently. Most educational institutes, such as universities, are experimenting with these concepts with their students. Smart learning concepts are especially useful for students learning in large classes. In large classes, the lecture method is the most popular method of teaching: the lecturer presents the content mostly using lecture slides, and the students make their own notes based on the content presented. However, some students may find difficulties with the above method due to various issues such as the speed of delivery. The purpose of this research is to assist students in large classes in following the lecture content. The research proposes a solution with four components, namely a note-taker, a slide matcher, a reference finder, and a question presenter, which help students obtain a summarized version of the lecture notes, easily navigate to the content and find resources, and revise the content using questions.
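The note-taker component relies on extractive summarization with sentence scoring and term weights (per the keywords). The snippet below is a minimal, generic term-frequency sentence scorer of the kind such a component might use; it is not the project's actual implementation.

```python
import re
from collections import Counter

def summarize(text: str, max_sentences: int = 3) -> str:
    """Score sentences by the frequency of the words they contain (simple term weights)."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    weights = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence: str) -> float:
        tokens = re.findall(r"[a-z']+", sentence.lower())
        return sum(weights[t] for t in tokens) / (len(tokens) or 1)

    ranked = set(sorted(sentences, key=score, reverse=True)[:max_sentences])
    # Keep the chosen sentences in their original order.
    return " ".join(s for s in sentences if s in ranked)

lecture = ("Transcribed lecture speech goes here. "
           "The summarizer keeps the highest scoring sentences. "
           "Scores come from simple term weights.")
print(summarize(lecture, max_sentences=2))
```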

Keywords: automatic summarization, extractive text summarization, speech recognition library, sentence extraction, automatic web search, automatic question generator, sentence scoring, the term weight

Procedia PDF Downloads 119
6669 Energy Budget Equation of Superfluid HVBK Model: LES Simulation

Authors: M. Bakhtaoui, L. Merahi

Abstract:

The reliability of the filtered HVBK model is investigated via large eddy simulations of freely decaying isotropic superfluid turbulence. For homogeneous turbulence at very high Reynolds numbers, comparison of the terms in the spectral kinetic energy budget equation indicates that, in the energy-containing range, the production and energy transfer effects become significant, whereas dissipation does not. In the inertial range, where the two fluids are perfectly locked, the mutual friction may be neglected with respect to the other terms. The LES results for the other terms of the energy balance are also presented.

Keywords: superfluid turbulence, HVBK, energy budget, Large Eddy Simulation

Procedia PDF Downloads 347
6668 Real-Time Big-Data Warehouse: A Next-Generation Enterprise Data Warehouse and Analysis Framework

Authors: Abbas Raza Ali

Abstract:

Big Data technology is gradually becoming a dire need of large enterprises. These enterprises generate massive amounts of off-line and streaming data in both structured and unstructured formats on a daily basis. It is a challenging task to effectively extract useful insights from such large-scale datasets; sometimes it even becomes a technology constraint to manage a transactional data history of more than a few months. This paper presents a framework to efficiently manage massive and complex datasets. The framework has been tested on a communication service provider producing massive, complex streaming data in binary format. The communication industry is bound by regulators to keep the history of their subscribers' call records, where every action of a subscriber generates a record. Managing and analyzing transactional data also allows service providers to better understand their customers' behavior; for example, deep packet inspection requires transactional internet usage data to explain the internet usage behaviour of the subscribers. However, current relational database systems limit service providers to maintaining history only at a semantic level, aggregated at the subscriber level. The framework addresses these challenges by leveraging Big Data technology, which optimally manages and allows deep analysis of complex datasets. The framework has been applied to offload the existing Intelligent Network Mediation and relational Data Warehouse of the service provider onto Big Data. The service provider has a 50+ million subscriber base with a yearly growth of 7-10%. The end-to-end process takes no more than 10 minutes and involves binary-to-ASCII decoding of call detail records, stitching of all the interrogations of a call (transformations), and aggregation of all the call records of a subscriber.
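To make the transformation step concrete, the sketch below shows, in plain Python, the general shape of the stitching and aggregation stage: call detail records already decoded to text are grouped by call ID to stitch interrogations together, then rolled up per subscriber. The field names and record layout are hypothetical; the actual framework runs this logic at scale on Big Data infrastructure.

```python
from collections import defaultdict

# Hypothetical decoded call detail records (one dict per interrogation).
records = [
    {"call_id": "c1", "subscriber": "9661", "duration_s": 40, "charge": 0.10},
    {"call_id": "c1", "subscriber": "9661", "duration_s": 25, "charge": 0.06},
    {"call_id": "c2", "subscriber": "9662", "duration_s": 90, "charge": 0.22},
]

# Stitch: merge all interrogations belonging to the same call.
calls = defaultdict(lambda: {"subscriber": None, "duration_s": 0, "charge": 0.0})
for rec in records:
    call = calls[rec["call_id"]]
    call["subscriber"] = rec["subscriber"]
    call["duration_s"] += rec["duration_s"]
    call["charge"] += rec["charge"]

# Aggregate: roll the stitched calls up to one row per subscriber.
per_subscriber = defaultdict(lambda: {"calls": 0, "duration_s": 0, "charge": 0.0})
for call in calls.values():
    agg = per_subscriber[call["subscriber"]]
    agg["calls"] += 1
    agg["duration_s"] += call["duration_s"]
    agg["charge"] += call["charge"]

print(dict(per_subscriber))
```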

Keywords: big data, communication service providers, enterprise data warehouse, stream computing, Telco IN Mediation

Procedia PDF Downloads 147
6667 The Search of Anomalous Higgs Boson Couplings at the Large Hadron Electron Collider and Future Circular Electron Hadron Collider

Authors: Ilkay Turk Cakir, Murat Altinli, Zekeriya Uysal, Abdulkadir Senol, Olcay Bolukbasi Yalcinkaya, Ali Yilmaz

Abstract:

The Higgs boson was discovered by the ATLAS and CMS experimental groups in 2012 at the Large Hadron Collider (LHC). Production and decay properties of the Higgs boson, Standard Model (SM) couplings, and limits on the effective scale of the Higgs boson's couplings with other bosons are investigated at particle colliders. Deviations from SM estimates are parametrized by effective Lagrangian terms to investigate Higgs couplings. This is a model-independent method for describing new physics. In this study, sensitivity to anomalous couplings of the Higgs boson to neutral gauge bosons is investigated using the parameters of the Large Hadron electron Collider (LHeC) and the Future Circular electron-hadron Collider (FCC-eh) with a model-independent approach. By using the MadGraph5_aMC@NLO multi-purpose event generator with the parameters of the LHeC and FCC-eh, the bounds on the anomalous Hγγ, HγZ, and HZZ couplings in the e− p → e− q H process are obtained. Detector simulations are also taken into account in the calculations.

Keywords: anomalous couplings, FCC-eh, Higgs, Z boson

Procedia PDF Downloads 185
6666 Shape Management Method of Large Structure Based on Octree Space Partitioning

Authors: Gichun Cha, Changgil Lee, Seunghee Park

Abstract:

The objective of this study is to construct a shape management method contributing to the safety of large structures. In Korea, research on shape management is scarce because the technology has only recently been attempted. Terrestrial Laser Scanning (TLS) is used for measurements of large structures. TLS provides an efficient way to actively acquire accurate point clouds of object surfaces or environments. The point clouds provide a basis for rapid modeling in industrial automation, architecture, construction, or maintenance of civil infrastructure. TLS produces a huge amount of point clouds, and registration, extraction, and visualization of the data require the processing of a massive amount of scan data. The octree can be applied to the shape management of large structures because it reduces the size of the scan data while the data attributes are maintained. Octree space partitioning generates voxels of 3D space, and each voxel is recursively subdivided into eight sub-voxels. The point cloud of the scan data was converted to voxels and sampled. The experimental site is located at Sungkyunkwan University. The scanned structure is a steel-frame bridge. The TLS used is a Leica ScanStation C10/C5. The scan data was condensed by 92%, and the octree model was constructed with a resolution of 2 millimeters. This study presents octree space partitioning for handling point clouds, forming a basis for shape management of large structures such as double-deck tunnels, buildings, and bridges. The research is expected to improve the efficiency of structural health monitoring and maintenance. This work is financially supported by the 'U-City Master and Doctor Course Grant Program' and the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIP) (NRF-2015R1D1A1A01059291).
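A minimal sketch of octree space partitioning for condensing a point cloud is shown below: each node subdivides into eight children when it holds too many points, and the condensed cloud keeps one centroid per occupied leaf. The capacity, the minimum voxel size, and the random points are illustrative assumptions, not the study's Leica scan data or parameters.

```python
from itertools import product
import numpy as np

class OctreeNode:
    """Octree node: subdivides into eight children when it holds too many points."""

    def __init__(self, center, half_size, capacity=16, min_half_size=0.001):
        self.center = np.asarray(center, dtype=float)
        self.half_size = half_size            # half the edge length of the cubic voxel
        self.capacity = capacity
        self.min_half_size = min_half_size    # e.g. 0.001 m half-size -> 2 mm voxels
        self.points = []
        self.children = None

    def insert(self, p):
        if self.children is not None:
            self._child_for(p).insert(p)
            return
        self.points.append(p)
        if len(self.points) > self.capacity and self.half_size > self.min_half_size:
            self._subdivide()

    def _subdivide(self):
        h = self.half_size / 2.0
        self.children = [OctreeNode(self.center + h * np.array([dx, dy, dz]),
                                    h, self.capacity, self.min_half_size)
                         for dz, dy, dx in product((-1, 1), repeat=3)]
        points, self.points = self.points, []
        for p in points:
            self._child_for(p).insert(p)

    def _child_for(self, p):
        # Child index matches the (dz, dy, dx) creation order above.
        ix = int(p[0] > self.center[0])
        iy = int(p[1] > self.center[1])
        iz = int(p[2] > self.center[2])
        return self.children[iz * 4 + iy * 2 + ix]

    def sampled(self):
        """Condensed cloud: one centroid per leaf voxel that contains points."""
        if self.children is None:
            return [np.mean(self.points, axis=0)] if self.points else []
        return [c for child in self.children for c in child.sampled()]

# Made-up scan points inside a 1 m cube centred at the origin.
rng = np.random.default_rng(0)
root = OctreeNode(center=(0, 0, 0), half_size=0.5)
for p in rng.uniform(-0.5, 0.5, size=(5000, 3)):
    root.insert(p)
print(f"5000 points condensed to {len(root.sampled())} leaf centroids")
```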

Keywords: 3D scan data, octree space partitioning, shape management, structural health monitoring, terrestrial laser scanning

Procedia PDF Downloads 275
6665 Imputation of Incomplete Large-Scale Monitoring Count Data via Penalized Estimation

Authors: Mohamed Dakki, Genevieve Robin, Marie Suet, Abdeljebbar Qninba, Mohamed A. El Agbani, Asmâa Ouassou, Rhimou El Hamoumi, Hichem Azafzaf, Sami Rebah, Claudia Feltrup-Azafzaf, Nafouel Hamouda, Wed a.L. Ibrahim, Hosni H. Asran, Amr A. Elhady, Haitham Ibrahim, Khaled Etayeb, Essam Bouras, Almokhtar Saied, Ashrof Glidan, Bakar M. Habib, Mohamed S. Sayoud, Nadjiba Bendjedda, Laura Dami, Clemence Deschamps, Elie Gaget, Jean-Yves Mondain-Monval, Pierre Defos Du Rau

Abstract:

In biodiversity monitoring, large datasets are becoming more and more widely available and are increasingly used globally to estimate species trends and conservation status. These large-scale datasets challenge existing statistical analysis methods, many of which are not adapted to their size, incompleteness and heterogeneity. The development of scalable methods to impute missing data in incomplete large-scale monitoring datasets is crucial to balance sampling in time or space and thus better inform conservation policies. We developed a new method based on penalized Poisson models to impute and analyse incomplete monitoring data in a large-scale framework. The method allows parameterization of (a) space and time factors, (b) the main effects of predictor covariates, as well as (c) space–time interactions. It also benefits from robust statistical and computational capability in large-scale settings. The method was tested extensively on both simulated and real-life waterbird data, with the findings revealing that it outperforms six existing methods in terms of missing data imputation errors. Applying the method to 16 waterbird species, we estimated their long-term trends for the first time at the entire North African scale, a region where monitoring data suffer from many gaps in space and time series. This new approach opens promising perspectives to increase the accuracy of species-abundance trend estimations. We made it freely available in the r package ‘lori’ (https://CRAN.R-project.org/package=lori) and recommend its use for large-scale count data, particularly in citizen science monitoring programmes.

Keywords: biodiversity monitoring, high-dimensional statistics, incomplete count data, missing data imputation, waterbird trends in North-Africa

Procedia PDF Downloads 118
6664 Using the Semantic Web Technologies to Bring Adaptability in E-Learning Systems

Authors: Fatima Faiza Ahmed, Syed Farrukh Hussain

Abstract:

The last few decades have seen a large proportion of our population turning towards e-learning technologies, from learning tools used in primary and elementary schools to competency-based e-learning systems specifically designed for applications like finance and marketing. The huge diversity in this crowd brings about a large number of challenges for the designers of these e-learning systems, one of which is the adaptability of such systems. This paper focuses on adaptability of the learning material in an e-learning course and on how artificial intelligence and the semantic web can be used as effective tools for this purpose. The study proved that the semantic web, still a hot topic in the area of computer science, can prove to be a powerful tool in designing and implementing adaptable e-learning systems.

Keywords: adaptable e-learning, HTMLParser, information extraction, semantic web

Procedia PDF Downloads 288
6663 Exploring the Sources of Innovation in Food Processing SMEs of Kerala

Authors: Bhumika Gupta, Jeayaram Subramanian, Hardik Vachhrajani, Avinash Shivdas

Abstract:

The Indian food processing industry is one of the largest in the world in terms of production, consumption, exports, and growth opportunities, and SMEs play a crucial role within it. Innovation studies in India are largely dominated by large manufacturing firms, yet the innovation sources used by SMEs are often different from those of large firms. This paper focuses on exploring the various sources of innovation adopted by food processing SMEs in Kerala, South India. The outcomes suggest that SMEs use various sources, such as suppliers, competitors, employees, government/research institutions, and customers, to get new ideas.

Keywords: food processing, innovation, SMEs, sources of innovation

Procedia PDF Downloads 385
6662 Hyperspectral Image Classification Using Tree Search Algorithm

Authors: Shreya Pare, Parvin Akhter

Abstract:

Remotely sensed image classification becomes a very challenging task owing to the high dimensionality of hyperspectral images. Pixel-wise classification methods fail to take the spatial structure information of an image into account. Therefore, to improve the performance of classification, spatial information can be integrated into the classification process. In this paper, a multilevel thresholding algorithm based on a modified fuzzy entropy (MFE) function is used to perform the segmentation of hyperspectral images. The fuzzy parameters of the MFE function have been optimized by using a new meta-heuristic algorithm based on the tree search algorithm. The segmented image is classified by a large distribution machine (LDM) classifier. Experimental results are shown on a hyperspectral image dataset. The experimental outputs indicate that the proposed technique (MFE-TSA-LDM) achieves much higher classification accuracy for hyperspectral images when compared to state-of-the-art classification techniques. The proposed algorithm provides accurate segmentation and classification maps, thus becoming more suitable for the classification of images with large spatial structures.

Keywords: classification, hyperspectral images, large distribution margin, modified fuzzy entropy function, multilevel thresholding, tree search algorithm, hyperspectral image classification using tree search algorithm

Procedia PDF Downloads 138
6661 Steepest Descent Method with New Step Sizes

Authors: Bib Paruhum Silalahi, Djihad Wungguli, Sugi Guritman

Abstract:

The steepest descent method is a simple gradient method for optimization. The method converges slowly toward the optimal solution because of the zigzag form of its steps. Barzilai and Borwein modified the algorithm so that it performs well for problems with large dimensions. The Barzilai and Borwein results have sparked a lot of research on steepest descent methods, including the alternate minimization gradient method and the Yuan method. Inspired by previous works, we modified the step size of the steepest descent method. We then compare the modified method against the Barzilai and Borwein method, the alternate minimization gradient method, and the Yuan method on quadratic function cases in terms of the number of iterations and the running time. The average results indicate that the steepest descent method with the new step sizes provides good results for small dimensions and is able to compete with the Barzilai and Borwein method and the alternate minimization gradient method for large dimensions. The new step sizes have faster convergence compared to the other methods, especially for cases with large dimensions.
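For reference, the baseline being compared against is steepest descent with the Barzilai-Borwein step size alpha_k = (s^T s)/(s^T y), where s = x_k - x_{k-1} and y = g_k - g_{k-1}. The sketch below applies it to a randomly generated convex quadratic f(x) = 0.5 x^T A x - b^T x; the paper's own new step sizes are not reproduced here.

```python
import numpy as np

def bb_steepest_descent(A, b, x0, tol=1e-8, max_iter=10_000):
    """Minimize f(x) = 0.5 x^T A x - b^T x with Barzilai-Borwein step sizes."""
    x = x0.copy()
    g = A @ x - b                      # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(g)    # conservative first step
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            return x, k
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        alpha = (s @ s) / (s @ y)      # BB1 step size
        x, g = x_new, g_new
    return x, max_iter

rng = np.random.default_rng(0)
n = 200
M = rng.normal(size=(n, n))
A = M @ M.T + n * np.eye(n)            # symmetric positive definite test matrix
b = rng.normal(size=n)
x, iters = bb_steepest_descent(A, b, np.zeros(n))
print(f"converged in {iters} iterations, residual {np.linalg.norm(A @ x - b):.2e}")
```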

Keywords: steepest descent, line search, iteration, running time, unconstrained optimization, convergence

Procedia PDF Downloads 519
6660 Factors Influencing Milk Yield, Quality, and Revenue of Dairy Farms in Southern Vietnam

Authors: Ngoc-Hieu Vu

Abstract:

Dairy production in Vietnam is a relatively new agricultural activity, and milk production has increased remarkably in recent years. Smallholders are still the main drivers of this development, especially in the southern part of the country. However, information on farming practices is very limited. Therefore, this study aimed to determine factors influencing the milk yield, quality (milk fat, total solids, solids-not-fat, total number of bacteria, and somatic cell count), and revenue of dairy farms in Southern Vietnam. Data were collected at the farm level; individual animal records were unavailable. The 539 studied farms were located in the provinces of Lam Dong (N=111 farms), Binh Duong (N=69 farms), Long An (N=174 farms), and Ho Chi Minh City (N=185 farms). The dataset included 9221 monthly test-day records of the farms from January 2013 to May 2015. Seasons were defined as rainy and dry. Farm sizes were classified as small (< 10 milking cows), medium (10 to 19 milking cows), and large (≥ 20 milking cows). The model for each trait contained year-season and farm region-farm size as subclass fixed effects, and individual farm and residual as random effects. Results showed that year-season, region, and farm size were determining sources of variation affecting all studied traits. Milk yield was higher in dry than in rainy seasons (P < 0.05), while it tended to increase from 2013 to 2015. Large farms had higher yields (445.6 kg/cow) than small (396.7 kg/cow) and medium (428.0 kg/cow) farms (P < 0.05). Small farms, in contrast, were superior to large farms in terms of milk fat, total solids, solids-not-fat, total number of bacteria, and somatic cell count (P < 0.05). Revenue per cow was higher in large than in medium and small farms. In conclusion, large farms achieved higher milk yields and revenues per cow, while small farms were superior in milk quality. Overall, milk yields were low, and better training, financial support, and marketing opportunities for farmers are needed to improve dairy production and increase farm revenues in Southern Vietnam.

Keywords: farm size, milk yield and quality, season, Southern Vietnam

Procedia PDF Downloads 333
6659 An Analysis of Privacy and Security for Internet of Things Applications

Authors: Dhananjay Singh, M. Abdullah-Al-Wadud

Abstract:

The Internet of Things is a concept of a large-scale ecosystem of wireless actuators. The actuators are defined as things in the IoT, those which contribute or produce some data to the ecosystem. However, ubiquitous data collection, data security, privacy preservation, large-volume data processing, and intelligent analytics are some of the key challenges in IoT technologies. In order to address the security requirements, challenges, and threats in the IoT, we discuss a message authentication mechanism for IoT applications. Finally, we discuss a data encryption mechanism for message authentication before propagation into IoT networks.
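The abstract does not specify which message authentication scheme is used, so the snippet below is only a generic illustration of the concept: an HMAC tag computed over the message payload with a pre-shared device key, verified on the receiving side. The key handling and field layout are illustrative.

```python
import hashlib
import hmac
import json

DEVICE_KEY = b"pre-shared-device-key"   # illustrative; real deployments need key management

def authenticate(payload: dict) -> dict:
    """Attach an HMAC-SHA256 tag so the receiver can verify origin and integrity."""
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(DEVICE_KEY, body, hashlib.sha256).hexdigest()
    return {"payload": payload, "tag": tag}

def verify(message: dict) -> bool:
    body = json.dumps(message["payload"], sort_keys=True).encode()
    expected = hmac.new(DEVICE_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["tag"])

msg = authenticate({"sensor": "temp-01", "value": 21.7})
print(verify(msg))            # True
msg["payload"]["value"] = 99  # tampering is detected
print(verify(msg))            # False
```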

Keywords: Internet of Things (IoT), message authentication, privacy, security

Procedia PDF Downloads 343
6658 Production and Distribution Network Planning Optimization: A Case Study of Large Cement Company

Authors: Lokendra Kumar Devangan, Ajay Mishra

Abstract:

This paper describes the implementation of a large-scale SAS/OR model with significant pre-processing, scenario analysis, and post-processing work done using SAS. A large cement manufacturer with ten geographically distributed manufacturing plants for two variants of cement, around 400 warehouses serving as transshipment points, and several thousand distributor locations generating demand needed to optimize this multi-echelon, multi-modal transport supply chain separately for planning and allocation purposes. For monthly planning as well as daily allocation, the demand is deterministic. Rail and road networks connect any two points in this supply chain, creating tens of thousands of such connections. Constraints include the plant’s production capacity, transportation capacity, and rail wagon batch size constraints. Each demand point has a minimum and maximum for shipments received. Price varies at demand locations due to local factors. A large mixed integer programming model built using proc OPTMODEL decides production at plants, demand fulfilled at each location, and the shipment route to demand locations to maximize the profit contribution. Using base SAS, we did significant pre-processing of data and created inputs for the optimization. Using outputs generated by OPTMODEL and other processing completed using base SAS, we generated several reports that went into their enterprise system and created tables for easy consumption of the optimization results by operations.
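A schematic of the kind of profit-maximizing, multi-echelon network formulation described above, written in generic notation (this is not the authors' proc OPTMODEL code; the index sets, symbols, and the way batch sizes are handled are assumptions):

```latex
\begin{align}
\max \;& \sum_{p,w,d,m} (r_d - c_{pwdm})\, x_{pwdm} \\
\text{s.t. } & \sum_{w,d,m} x_{pwdm} \le \mathrm{Cap}^{\text{plant}}_{p} && \forall p \\
& \sum_{p,d} x_{pwdm} \le \mathrm{Cap}^{\text{transport}}_{wm} && \forall w, m \\
& D^{\min}_{d} \le \sum_{p,w,m} x_{pwdm} \le D^{\max}_{d} && \forall d \\
& x_{pwd,\text{rail}} = B\, z_{pwd}, \quad z_{pwd} \in \mathbb{Z}_{\ge 0} && \forall p, w, d \\
& x_{pwdm} \ge 0
\end{align}
```

Here x is the quantity shipped from plant p via warehouse w to demand point d by mode m, r_d is the local price, c the unit cost of production plus transport, and B a rail wagon batch size; the integer variables z enforce rail shipments in whole batches.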

Keywords: production planning, mixed integer optimization, network model, network optimization

Procedia PDF Downloads 33
6657 Analysis of 3 dB Directional Coupler Based On Silicon-On-Insulator (SOI) Large Cross-Section Rib Waveguide

Authors: Nurdiani Zamhari, Abang Annuar Ehsan

Abstract:

The 3 dB directional coupler is designed using a silicon-on-insulator (SOI) large cross-section rib waveguide and simulated by the Beam Propagation Method at the communication wavelengths of 1.55 µm and 1.48 µm. The geometry has a rib height (H) of 6 µm and is varied in step factor (r), which is 0.5, 0.6, 0.7, and 0.8. The waveguide spacing is fixed at 5 µm, and the slab width is symmetrical. In general, the 3 dB coupling lengths for the four different cross-sections are several millimetres long. The wavelength of 1.48 µm gives a longer coupling length than 1.55 µm at the same step factor (r). Besides, low-loss propagation is achieved, with less than 2% propagation loss.
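For context (standard coupled-mode theory for a symmetric coupler, not a result stated in the abstract), the 3 dB length follows from the propagation constants of the even and odd supermodes, which a BPM simulation effectively captures:

```latex
P_{\text{cross}}(L) = \sin^2\!\left(\frac{(\beta_e - \beta_o)\,L}{2}\right),
\qquad
L_{3\,\mathrm{dB}} = \frac{\pi}{2\,(\beta_e - \beta_o)}
```

A small difference between the even and odd propagation constants, as expected for a 5 µm gap in a large cross-section rib waveguide, therefore yields millimetre-scale 3 dB lengths, consistent with the values reported here.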

Keywords: 3 dB directional couplers, silicon-on-insulator, symmetrical rib waveguide, OptiBPM 9

Procedia PDF Downloads 486
6656 Numerical Modeling of Large Scale Dam Break Flows

Authors: Amanbek Jainakov, Abdikerim Kurbanaliev

Abstract:

The work presents the results of mathematical modeling of large-scale flows in areas with complex topographic relief. The Reynolds-averaged Navier-Stokes equations constitute the basis of the three-dimensional unsteady modeling. The well-known Volume of Fluid method, implemented in the interFoam solver of the open-source package OpenFOAM 2.3, is used to track the free-boundary location. The adequacy of the mathematical model is checked by comparison with experimental data. The efficiency of the applied technology is illustrated by the example of modeling the breakthrough of the dams of the Andijan (Uzbekistan) and Papan (near the town of Osh, Kyrgyzstan) reservoirs.

Keywords: three-dimensional modeling, free boundary, the volume-of-fluid method, dam break, flood, OpenFOAM

Procedia PDF Downloads 373