Search results for: Software Architecture.
1844 Using Artificial Neural Networks for Optical Imaging of Fluorescent Biomarkers
Authors: K. A. Laptinskiy, S. A. Burikov, A. M. Vervald, S. A. Dolenko, T. A. Dolenko
Abstract:
The article presents the results of applying artificial neural networks to separate the fluorescent contribution of nanodiamonds, used as biomarkers, adsorbents and drug carriers in biomedicine, from the fluorescent background of intrinsic biological fluorophores. The possibility of solving this problem in principle is demonstrated. The use of a neural network architecture made it possible to detect the fluorescence of nanodiamonds against the background autofluorescence of egg white with high accuracy, better than 3 µg/ml.
Keywords: Artificial neural networks, fluorescence, data aggregation.
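As an illustration of the kind of regression network described in this abstract, the sketch below trains a small feed-forward model to map fluorescence spectra to nanodiamond concentration. It is a minimal sketch only: the spectra, concentrations and network size are synthetic placeholders, not the authors' data or architecture.
```python
# Illustrative sketch: a small feed-forward network mapping measured fluorescence
# spectra to nanodiamond concentration. All data below are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_channels = 500, 256                      # hypothetical 256-channel spectra
spectra = rng.random((n_samples, n_channels))
concentration = rng.uniform(0.0, 50.0, n_samples)     # placeholder targets, ug/ml

X_train, X_test, y_train, y_test = train_test_split(
    spectra, concentration, test_size=0.2, random_state=0)

net = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0)
net.fit(X_train, y_train)

# Mean absolute error gives a rough analogue of the "better than 3 ug/ml" figure.
mae = np.mean(np.abs(net.predict(X_test) - y_test))
print(f"MAE on held-out spectra: {mae:.2f} ug/ml")
```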
1843 Flood Predicting in Karkheh River Basin Using Stochastic ARIMA Model
Authors: Karim Hamidi Machekposhti, Hossein Sedghi, Abdolrasoul Telvari, Hossein Babazadeh
Abstract:
Floods have a huge environmental and economic impact, so flood prediction receives a great deal of attention. This study analysed the annual maximum streamflow (discharge) (AMS or AMD) of the Karkheh River in the Karkheh River Basin for flood prediction using an ARIMA model. For this purpose, the Box-Jenkins approach was used, which comprises a four-stage method: model identification, parameter estimation, diagnostic checking and forecasting (prediction). The main tools used in the ARIMA modelling were the SAS and SPSS software packages. Model identification was done by visual inspection of the ACF and PACF. The SAS software computed the model parameters using the ML, CLS and ULS methods. The diagnostic checking tests, the AIC criterion and the RACF and RPACF graphs, were used to verify the selected model. In this study, the best ARIMA model for the Annual Maximum Discharge (AMD) time series was (4,1,1), with an AIC value of 88.87. The RACF and RPACF showed the residuals' independence. Forecasting AMD for 10 future years demonstrated the ability of the model to predict floods of the river under study in the Karkheh River Basin. Model accuracy was checked by comparing the predicted and observed series using the coefficient of determination (R2).
Keywords: Time series modelling, stochastic processes, ARIMA model, Karkheh River.
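A minimal sketch of the Box-Jenkins workflow summarized above, using Python's statsmodels instead of SAS/SPSS; the ARIMA order (4,1,1) and the 10-year forecast horizon come from the abstract, while the discharge series is a synthetic placeholder, not the Karkheh record.
```python
# Fit an ARIMA(4,1,1) to an annual-maximum-discharge series, report the AIC
# and forecast 10 future years, mirroring the steps described in the abstract.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
# Placeholder 50-year annual maximum discharge series (m3/s).
amd = 800 + 0.2 * rng.normal(0, 120, 50).cumsum() + rng.normal(0, 100, 50)

model = ARIMA(amd, order=(4, 1, 1))     # (p, d, q) as selected in the study
result = model.fit()
print("AIC:", round(result.aic, 2))

forecast = result.forecast(steps=10)    # 10-year-ahead prediction
print(forecast)
```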
1842 Domain Driven Design vs Soft Domain Driven Design Frameworks
Authors: Mohammed Salahat, Steve Wade
Abstract:
This paper presents and compares the SSDDD “Systematic Soft Domain Driven Design Framework” with the DDD “Domain Driven Design Framework” as a soft systems approach to information systems development. The framework uses SSM as a guiding methodology within which a sequence of design tasks based on the UML is embedded, leading to the implementation of a software system using the Naked Objects framework. This framework has been used in action research projects that involved the investigation and modelling of business processes using object-oriented domain models and the implementation of software systems based on those domain models. Within this framework, Soft Systems Methodology (SSM) is used as a guiding methodology to explore the problem situation and to develop the domain model using UML for the given business domain. The framework was proposed and evaluated in our previous works; a comparison between SSDDD and DDD is presented in this paper to show how SSDDD improves on DDD as an approach to modelling and implementing business domain perspectives for information systems development. The comparison process, the results, and the improvements are presented in the following sections of this paper.
Keywords: SSM, UML, domain-driven design, soft domain-driven design, naked objects, soft language, information retrieval, multimethodology.
1841 Neural Networks: From Black Box towards Transparent Box Application to Evapotranspiration Modeling
Authors: A. Johannet, B. Vayssade, D. Bertin
Abstract:
Neural networks are well known for their ability to model nonlinear functions but, as statistical methods usually do, they follow a nonparametric approach; consequently, neither a priori nor a posteriori knowledge is easily taken into account. To deal with this problem, an original way of encoding knowledge inside the network architecture is proposed. The method is applied to the problem of evapotranspiration inside a karstic aquifer, a problem of great practical value for managing water resources.
Keywords: Neural networks, hydrology, evapotranspiration, hidden function modeling.
1840 A Case Study of Applying Virtual Prototyping in Construction
Authors: Stephen C. W. Kong
Abstract:
The use of 3D computer-aided design (CAD) models to support construction project planning has been increasing in recent years. 3D CAD models reveal more planning ideas by visually showing the construction site environment at different stages of the construction process. Using 3D CAD models together with scheduling software to prepare a construction plan can identify errors in process sequence and spatial arrangement, which is vital to the success of a construction project. A number of 4D (3D plus time) CAD tools have been developed and utilized in different construction projects due to the awareness of their importance. Virtual prototyping extends the idea of 4D CAD by integrating more features for simulating the real construction process. Virtual prototyping originates from the manufacturing industry, where products such as cars and airplanes are virtually simulated on a computer before they are built in the factory. Virtual prototyping integrates 3D CAD, a simulation engine, analysis tools (such as structural analysis and collision detection), and a knowledge base to streamline the whole product design and production process. In this paper, we present the application of a virtual prototyping software package that has been used in a few construction projects in Hong Kong to support construction project planning. Specifically, the paper presents an implementation of virtual prototyping in a residential building project in Hong Kong. The applicability, difficulties and benefits of construction virtual prototyping are examined based on this project.
Keywords: Construction project planning, prefabrication, simulation, virtual prototyping.
1839 A Software Tool Design for Cerebral Infarction of MR Images
Authors: Kyoung-Jong Park, Woong-Gi Jeon, Hee-Cheol Kim, Dong-Eog Kim, Heung-Kook Choi
Abstract:
A brain MR imaging-based clinical research and analysis system was built, targeting the development of a large-scale dataset, and the generally available clinical data were used to build it. Image registration together with a region-growing algorithm was used for the selection of the lesion ROI, and the mesh-warp algorithm was implemented for matching; the matching errors were corrected individually. In addition, large ROI research datasets can be accumulated using the compression method we developed, and in this way sound decision criteria for the research results were suggested. The experimental groupings were age, sex, MR type, patient ID and smoking, which can easily be queried. The resulting data were visualized as overlapped images using a colour table and analysed with a statistical package. The utilization of this system for chronic ischemic damage was evaluated on patients with acute cerebral infarction; the damage associated with the neurologic disability index was located in the central region facing the lateral ventricle, where the corona radiata is found. Finally, system reliability was measured through both inter-user and intra-user registration correlation.
Keywords: Software tool design, Cerebral infarction, Brain MR image, Registration
1838 Investigating the Role of Community in Heritage Conservation through the Ladder of Citizen Participation Approach: Case Study, Port Said, Egypt
Authors: Sara S. Fouad, Omneya Messallam
Abstract:
Egypt has countless prestigious buildings and a diversity of cultural heritage located in many cities. Most researchers, archaeologists, stakeholders and governmental bodies pay more attention to the big cities such as Cairo and Alexandria, due to the country's centralized nature. However, there are other historic cities that are grossly neglected and in need of emergency conservation. For instance, Port Said is a former colonial city, established in the nineteenth century at the edge of the northeast Egyptian coast between the Mediterranean Sea and the Suez Canal. This city was chosen because it presents one of the important Egyptian archaeological sites that archive Egyptian architecture of the 19th and 20th centuries. The historic urban fabric is divided into three main districts: the Arab district, the European district (Al-Afrang), and Port Fouad. The European district was selected as the research case study as it has cultural diversity and significant buildings, and includes the largest number of listed heritage buildings in Port Said. Based on questionnaires and interviews, since 2003 several initiatives have been taken by Alliance Francaise, the National Organization for Urban Harmony (NOUH), some Non-Governmental Organizations (NGOs), and a small number of community residents to highlight the city's important legacy and protect it from being demolished. Unfortunately, the limitation of their participation in decision-making policies is considered a crucial threat facing sustainable heritage conservation. Therefore, encouraging the local community to participate in conserving its architectural heritage would create a self-confident community, capable of making decisions for the city's future development. This paper aims to investigate the role of the local inhabitants in protecting their built heritage by assessing the community's level of participation at two points in time (2012 and 2018), based on the ladder of citizen participation approach. It also aims to encourage community participation in order to promote the city's architectural conservation, heritage management, and sustainable development. The methodology followed in this empirical research involves several data assembly methods such as structural observations, questionnaires, interviews, and mental mapping. The questionnaire was distributed among 92 local inhabitants aged 18-60 years. At the outset of this research, the majority of local inhabitants showed a negative attitude, low motivation, and little confidence in their role in safeguarding their architectural heritage. Over time, there was a change in these negative attitudes. Therefore, public awareness should be raised and community participation encouraged by providing residents with a real opportunity to take part in decision-making. This may lead to a positive relationship between the community residents and the built heritage, which is essential for promoting its preservation and sustainable development.
Keywords: Al-Afrang/Port Said, community participation, heritage conservation, ladder of citizen participation, NGOs.
1837 Design and Implementation of a Software Platform Based on Artificial Intelligence for Product Recommendation
Authors: G. Settanni, A. Panarese, R. Vaira, A. Galiano
Abstract:
Nowadays, artificial intelligence is used successfully in the field of e-commerce thanks to its ability to learn from large amounts of data. In this research study, a prototype software platform was designed and implemented to suggest to users the products most suitable for their needs. The platform includes a recommender system based on artificial intelligence algorithms that provides suggestions and decision support to the customer. Specifically, support vector machine algorithms have been implemented, combined with natural language processing techniques that allow the user to interact with the system, express their requests and receive suggestions. The interested user can access the web platform on the internet using a computer, tablet or mobile phone, register, provide the necessary information and view the products that the system deems most appropriate for them. The platform also integrates a dashboard that allows the various functions the platform is equipped with to be used in an intuitive and simple way. In addition, Long Short-Term Memory algorithms have been implemented and trained on historical data in order to predict customer scores for the different items. Items with the highest scores are recommended to customers.
Keywords: Deep Learning, Long Short-Term Memory, Machine Learning, Recommender Systems, Support Vector Machine.
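A minimal sketch of the text-request-to-product-category step that the SVM-plus-NLP component above implies, using scikit-learn; the training phrases, categories and pipeline settings are illustrative assumptions, not the platform's actual data or code.
```python
# Map a free-text customer request to a product category with TF-IDF + a linear SVM,
# in the spirit of the SVM + NLP component described in the abstract.
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

# Hypothetical labelled requests; a real platform would use its own catalogue data.
requests = [
    "lightweight laptop for travel and long battery life",
    "gaming laptop with a powerful graphics card",
    "running shoes for marathon training",
    "comfortable walking shoes for everyday use",
    "smartphone with a good camera for photos",
    "budget phone with a large screen",
]
categories = ["laptops", "laptops", "shoes", "shoes", "phones", "phones"]

classifier = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),
    ("svm", LinearSVC()),
])
classifier.fit(requests, categories)

print(classifier.predict(["I need a phone that takes great pictures"]))
```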
1836 Paddy/Rice Singulation for Determination of Husking Efficiency and Damage Using Machine Vision
Authors: M. Shaker, S. Minaei, M. H. Khoshtaghaza, A. Banakar, A. Jafari
Abstract:
In this study, a machine vision and singulation system was developed to separate paddy from rice and determine the paddy husking and rice breakage percentages. The machine vision system consists of three main components: an imaging chamber, a digital camera, and a computer equipped with image processing software. The singulation device consists of a kernel holding surface, a motor with a vacuum fan, and a dimmer. For separation of paddy from rice (in the image), it was necessary to set a threshold. Therefore, some images of paddy and rice were sampled, the RGB values of the images were extracted using MATLAB software, and the mean and standard deviation of the data were determined. An image processing algorithm was developed in MATLAB to determine paddy/rice separation and the rice breakage and paddy husking percentages, using the blue-to-red ratio. Tests showed that a threshold of 0.75 is suitable for separating paddy from rice kernels. Results from the evaluation of the image processing algorithm showed that the accuracies obtained were 98.36% and 91.81% for the paddy husking and rice breakage percentages, respectively. Analysis also showed that a suction of 45 mmHg to 50 mmHg, yielding 81.3% separation efficiency, is appropriate for operation of the kernel singulation system.
Keywords: Computer vision, rice kernel, husking, breakage.
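A minimal sketch of the blue-to-red ratio thresholding described above, written in Python/NumPy rather than MATLAB. The 0.75 threshold comes from the abstract; which class lies above the threshold, and the tiny synthetic image, are assumptions made only for illustration.
```python
# Classify kernel pixels as paddy or rice from the blue-to-red ratio of an RGB image.
import numpy as np

def classify_kernels(rgb_image, threshold=0.75):
    """rgb_image: H x W x 3 array with channels ordered R, G, B."""
    r = rgb_image[..., 0].astype(float)
    b = rgb_image[..., 2].astype(float)
    ratio = b / (r + 1e-6)            # avoid division by zero
    paddy_mask = ratio > threshold    # assumption: paddy pixels lie above the threshold
    rice_mask = ~paddy_mask
    return paddy_mask, rice_mask

# Synthetic 4x4 image for demonstration only.
demo = np.random.default_rng(2).integers(0, 256, size=(4, 4, 3))
paddy, rice = classify_kernels(demo)
print("paddy pixels:", int(paddy.sum()), "rice pixels:", int(rice.sum()))
```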
1835 Evaluation of Numerical Modeling of Jet Grouting Design Using in situ Loading Test
Authors: Reza Ziaie Moayed, Ehsan Azini
Abstract:
Jet grouting (JG) is one of the methods of improving and increasing the strength and bearing capacity of soil, in which high-pressure water or grout is injected through nozzles into the soil. During this process, part of the soil and grout particles comes out of the drill borehole, and the other part is mixed with the grout in place; as a result, a mass of modified soil is created. The purpose of this method is to change the soil into a mixture of soil and cement, commonly known as "soil-cement". In this paper, first the principles of high-pressure injection and then the effective parameters in the JG method are described. Then, the tests on samples taken from the columns formed by excavation around the soil-cement columns, as well as the static loading test on the created column, are discussed. In the other part of this paper, the soil behaviour models for numerical modelling in the PLAXIS software are mentioned. The purpose of this paper is to evaluate the results of numerical modelling based on in-situ static loading tests. The results indicate an acceptable agreement between the results of the mentioned tests and the modelling results. Moreover, modelling with this software is shown to be an appropriate option for assessing the technical feasibility of soil improvement using JG.
Keywords: Jet grouting column, Soil improvement, Numerical modeling, In-situ loading test.
1834 New Curriculum Approach in Teaching Network Security Subjects for ICT Courses in Malaysia
Authors: Mohd Fairuz Iskandar Othman, Nazrulazhar Bahaman, Zulkiflee Muslim, Faizal Abdollah
Abstract:
This paper discusses a curriculum approach that gives emphasis to the practical portions of teaching network security subjects in information and communication technology courses. As we are well aware, the need for a practice- and application-oriented approach in education is paramount. Research on active learning and cooperative groups has shown that students grasp more and have a greater tendency towards obtaining and realizing soft skills like leadership, communication and teamwork, as opposed to the more traditional theory- and exam-based teaching and learning. While this teaching and learning paradigm is relatively new in Malaysia, it has been practiced widely in the West. This paper examines an approach whereby students learning wireless security are divided into and work in small and manageable groups, organized as two teams: a black hat team and a white hat team. The former tries to find and expose vulnerabilities in a wireless network, while the latter tries its best to prevent such attacks on its wireless networks using hardware, software, design and the enforcement of security policy, among other measures. This paper tries to show that the approach taken, together with the use of relevant and up-to-date software and hardware and a suitable environment setting, will hopefully lead students to a more fruitful outcome in terms of understanding of concepts and theories and their motivation to learn.
Keywords: Curriculum approach, wireless networks, wireless security.
1833 3.5-bit Stage of the CMOS Pipeline ADC
Authors: Gao Wei, Xu Minglu, Xu Yan, Zhang Xiaotong, Wang Xinghua
Abstract:
A 3.5-bit stage of a CMOS pipelined ADC is proposed. In this report, the main parts of the 3.5-bit stage are introduced, and how the MDAC, comparator and encoder work and were designed is shown in detail. In addition, an OTA used in the fully differential pipelined ADC is described. Using a gain-boost architecture with a differential amplifier, this OTA achieves high gain and high speed. The design uses a 0.18 um CMOS process and was simulated in Cadence. The simulation results show that the OTA has a gain of up to 80 dB and a unity-gain bandwidth of about 1.138 GHz with a 2 pF load.
Keywords: pipelined ADC, MDAC, operational amplifier.
1832 Extending BDI Multiagent Systems with Agent Norms
Authors: Francisco José Plácido da Cunha, Tassio Ferenzini Martins Sirqueira, Marx Leles Viana, Carlos José Pereira de Lucena
Abstract:
Open Multiagent Systems (MASs) are societies in which heterogeneous and independently designed entities (agents) work towards similar or different ends. Software agents are autonomous, and the diversity of interests among the different members living in the same society is a fact. In order to deal with this autonomy, these open systems use mechanisms of social control (norms) to ensure a desirable social order. This paper considers the following types of norms: (i) obligation — agents must accomplish a specific outcome; (ii) permission — agents may act in a particular way; and (iii) prohibition — agents must not act in a specific way. All of these mechanisms are meant to encourage the fulfilment of norms through rewards and to discourage norm violation by pointing out the punishments. Since a software agent decides that its priority is the satisfaction of its own desires and goals, each agent must evaluate the effects associated with the fulfilment of one or more norms before choosing which one should be fulfilled; the same applies when agents decide to violate a norm. This paper also introduces a framework for the development of MASs that provides support mechanisms for the agent's decision-making, using norm-based reasoning. The applicability and validation of this approach are demonstrated using a traffic intersection scenario.
Keywords: BDI agent, BDI4JADE framework, multiagent system, normative agents.
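A minimal sketch of the norm-aware decision step described above, assuming a simple utility model in which an agent weighs the reward for fulfilling a norm against the punishment for violating it. The class, function and numbers are illustrative only and do not reproduce the paper's framework or the BDI4JADE API.
```python
# Toy norm-based reasoning: an agent decides whether fulfilling a norm is
# worthwhile given its own goal utility, the norm's reward and its punishment.
from dataclasses import dataclass

@dataclass
class Norm:
    name: str
    kind: str          # "obligation", "permission" or "prohibition"
    reward: float      # benefit of fulfilling the norm
    punishment: float  # cost incurred if the norm is violated

def decide(norm: Norm, goal_gain_if_violated: float) -> str:
    """Return 'fulfil' or 'violate' based on the net utility of each choice."""
    utility_fulfil = norm.reward
    utility_violate = goal_gain_if_violated - norm.punishment
    return "fulfil" if utility_fulfil >= utility_violate else "violate"

stop_at_red = Norm("stop-at-red-light", "obligation", reward=2.0, punishment=10.0)
print(decide(stop_at_red, goal_gain_if_violated=3.0))   # -> fulfil
print(decide(stop_at_red, goal_gain_if_violated=20.0))  # -> violate
```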
1831 Aging Evaluation of Ammonium Perchlorate/Hydroxyl Terminated Polybutadiene-Based Solid Rocket Engine by Reactive Molecular Dynamics Simulation and Thermal Analysis
Authors: R. F. B. Gonçalves, E. N. Iwama, J. A. F. F. Rocco, K. Iha
Abstract:
Propellants based on Hydroxyl-Terminated Polybutadiene/Ammonium Perchlorate (HTPB/AP) are the most commonly used in the rocket engines employed by the Brazilian Armed Forces. This work examined the possibility of extending their useful life (currently 10 years) by performing kinetic-chemical analyses of the energetic material via Differential Scanning Calorimetry (DSC) and by computer simulation of the aging process using the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) software. Thermal analysis via DSC was performed in triplicate at three heating rates (5, 10, and 15 ºC/min) on a rocket motor with an 11-year shelf life, using the Arrhenius equation to obtain its activation energy via the Ozawa and Kissinger kinetic methods, allowing comparison with data from the manufacturing period (standard motor). In addition, the kinetic parameters of the internal pressure of the combustion chamber in 8 rocket engines with 11 years of shelf life were also acquired, for comparison with the engine start-up data.
Keywords: Shelf-life, thermal analysis, Ozawa method, Kissinger method, LAMMPS software, thrust.
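A minimal sketch of the Kissinger analysis mentioned above: the slope of ln(β/Tp²) against 1/Tp gives −Ea/R. The three heating rates come from the abstract; the DSC peak temperatures below are hypothetical placeholders, not the measured data.
```python
# Estimate activation energy from DSC peak temperatures at several heating rates
# using the Kissinger method: ln(beta / Tp^2) = const - Ea / (R * Tp).
import numpy as np

R = 8.314                                       # gas constant, J/(mol K)
beta = np.array([5.0, 10.0, 15.0])              # heating rates, K/min (from the abstract)
Tp_celsius = np.array([332.0, 341.0, 347.0])    # hypothetical DSC peak temperatures, C
Tp = Tp_celsius + 273.15                        # convert to kelvin

y = np.log(beta / Tp**2)
x = 1.0 / Tp
slope, intercept = np.polyfit(x, y, 1)          # linear fit of the Kissinger plot
Ea = -slope * R                                 # activation energy, J/mol

print(f"Activation energy ~ {Ea / 1000:.1f} kJ/mol")
```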
1830 Efficient Program Slicing Algorithms for Measuring Functional Cohesion and Parallelism
Authors: Jehad Al Dallal
Abstract:
Program slicing is the task of finding all statements in a program that directly or indirectly influence the value of a variable occurrence. The set of statements that can affect the value of a variable at some point in a program is called a program slice. In several software engineering applications, such as program debugging and measuring program cohesion and parallelism, several slices are computed at different program points. In this paper, algorithms are introduced to compute all backward and forward static slices of a computer program by traversing the program representation graph once. The program representation graph used in this paper is the Program Dependence Graph (PDG). We conducted an experimental comparison study using 25 software modules to show the effectiveness of the introduced algorithm for computing all backward static slices over single-point slicing approaches when computing the parallelism and functional cohesion of program modules. The effectiveness of the algorithm is measured in terms of execution time and the number of traversed PDG edges. The comparison study results indicate that using the introduced algorithm considerably reduces the slicing time and effort required to measure module parallelism and functional cohesion.
Keywords: Backward slicing, cohesion measure, forward slicing, parallelism measure, program dependence graph, program slicing, static slicing.
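A minimal sketch of single-point backward slicing on a program dependence graph, assuming the PDG is given as an adjacency map of dependence edges; the paper's single-traversal, all-slices algorithm itself is not reproduced here.
```python
# Backward slice = the criterion node plus every PDG node it transitively
# depends on, found by a breadth-first traversal of the dependence edges.
from collections import deque

def backward_slice(pdg_deps, criterion):
    """pdg_deps: dict mapping each node to the nodes it depends on."""
    visited = {criterion}
    queue = deque([criterion])
    while queue:
        node = queue.popleft()
        for dep in pdg_deps.get(node, ()):
            if dep not in visited:
                visited.add(dep)
                queue.append(dep)
    return visited

# Tiny example PDG: statement 4 depends on 2 and 3; statements 2 and 3 depend on 1.
pdg = {4: [2, 3], 3: [1], 2: [1], 1: []}
print(sorted(backward_slice(pdg, 4)))   # -> [1, 2, 3, 4]
```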
1829 Integration of Image and Patient Data, Software and International Coding Systems for Use in a Mammography Research Project
Authors: V. Balanica, W. I. D. Rae, M. Caramihai, S. Acho, C. P. Herbst
Abstract:
Mammographic image and data analysis to facilitate modelling or computer-aided diagnostic (CAD) software development is best done using a common database that can handle various mammographic image file formats and relate these to other patient information. This would optimize the use of the data, as both primary reporting and enhanced information extraction of research data could be performed from a single dataset. One desired improvement is the integration of DICOM file header information into the database, as an efficient and reliable source of supplementary patient information intrinsically available in the images. The purpose of this paper was to design a suitable database to link and integrate different types of image files and gather common information that can be further used for research purposes. An interface was developed for accessing, adding, updating, modifying and extracting data from the common database, enhancing the future possible application of the data in CAD processing. Technically, future developments envisaged include the creation of an advanced search function to select image files based on descriptor combinations. Results can be further used for specific CAD processing and other research. A user-friendly configuration utility for importing the required fields from the DICOM files must still be designed.
Keywords: Database Integration, Mammogram Classification, Tumour Classification, Computer Aided Diagnosis.
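A minimal sketch of the DICOM-header-to-database step discussed above. The paper does not name its toolset, so the pydicom library, the SQLite schema and the file path are assumptions introduced only for illustration.
```python
# Read selected DICOM header fields from a mammogram file and store them in a
# relational table so they can be linked to other patient data.
import sqlite3
import pydicom

conn = sqlite3.connect("mammo_research.db")
conn.execute("""CREATE TABLE IF NOT EXISTS images (
    sop_uid TEXT PRIMARY KEY, patient_id TEXT, study_date TEXT,
    modality TEXT, laterality TEXT, file_path TEXT)""")

def import_dicom(path):
    ds = pydicom.dcmread(path, stop_before_pixels=True)   # header only, no pixel data
    row = (
        str(ds.get("SOPInstanceUID", "")),
        str(ds.get("PatientID", "")),
        str(ds.get("StudyDate", "")),
        str(ds.get("Modality", "")),
        str(ds.get("ImageLaterality", "")),
        path,
    )
    conn.execute("INSERT OR REPLACE INTO images VALUES (?,?,?,?,?,?)", row)
    conn.commit()

import_dicom("example_mammogram.dcm")   # placeholder path
```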
1828 SAĞLIK-NET Project in Turkey and HL7 v3 Implementation
Authors: K. Turhan, B. Kurt, E. Uzun
Abstract:
This paper describes the Clinical Document Architecture Release Two (CDA R2) standard and a client application for messaging with the SAĞLIK-NET project developed by the Ministry of Health of Turkey. CDA R2 was developed by the Health Level 7 (HL7) organization and approved by the American National Standards Institute (ANSI) in 2004 to standardize medical information so that it can be shared semantically and syntactically. In this study, a client application compatible with HL7 V3 was developed for SAĞLIK-NET, a project aimed at building a National Health Information System for Turkey. Moreover, the CDA conformance of this application is also evaluated.
Keywords: HL7 V3, CDA, Interoperability, Web Service.
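As a rough sketch of the kind of document the CDA R2 standard describes, the snippet below builds a bare-bones ClinicalDocument header with Python's standard library. Only the root element, the urn:hl7-org:v3 namespace and a few common header elements are shown; the identifiers and values are placeholders and do not reflect SAĞLIK-NET's actual profile.
```python
# Build a bare-bones HL7 CDA R2 ClinicalDocument header as XML.
import xml.etree.ElementTree as ET

NS = "urn:hl7-org:v3"                       # CDA R2 namespace
ET.register_namespace("", NS)

doc = ET.Element(f"{{{NS}}}ClinicalDocument")
ET.SubElement(doc, f"{{{NS}}}typeId",
              root="2.16.840.1.113883.1.3", extension="POCD_HD000040")
ET.SubElement(doc, f"{{{NS}}}id", root="1.2.3.4", extension="DOC-0001")   # placeholder IDs
title = ET.SubElement(doc, f"{{{NS}}}title")
title.text = "Sample Discharge Summary"
ET.SubElement(doc, f"{{{NS}}}effectiveTime", value="20240101120000")

print(ET.tostring(doc, encoding="unicode"))
```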
1827 Accurate Position Electromagnetic Sensor Using Data Acquisition System
Authors: Z. Ezzouine, A. Nakheli
Abstract:
This paper presents a highly accurate position electromagnetic sensor system (HPESS) that is applicable to moving object detection. The authors have developed a high-performance position sensor prototype dedicated to student laboratory use. The challenge was to obtain a highly accurate, real-time sensor that is able to calculate position, length or displacement. An electromagnetic solution based on a two-coil induction principle was adopted. The HPESS converts mechanical motion to electric energy with direct contact, and the output signal can then be fed to an electronic circuit. The voltage output change from the sensor is captured by a data acquisition system using LabVIEW software, and the displacement of the moving object is determined. The measured data are transmitted to a PC in real time via a DAQ (NI USB-6281). This paper also describes the data acquisition analysis and the conditioning card developed specially for sensor signal monitoring. The data are then recorded and viewed using a user interface written with National Instruments LabVIEW software. On-line displays of the time and voltage of the sensor signal provide a user-friendly data acquisition interface. The sensor provides an uncomplicated, accurate, reliable, inexpensive transducer for highly sophisticated control systems.
Keywords: Electromagnetic sensor, data acquisition, accurately, position measurement.
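The paper's acquisition chain is built in LabVIEW; as a rough textual equivalent, the sketch below reads a block of voltage samples from an NI DAQ device using NI's nidaqmx Python package. The device/channel name, sampling rate and sample count are assumptions, and actual hardware is required for the code to run.
```python
# Acquire a finite block of voltage samples from the sensor channel of an NI DAQ
# device (e.g. a USB-6281); the samples can then be converted to displacement.
import nidaqmx
from nidaqmx.constants import AcquisitionType

SAMPLE_RATE = 1000      # samples per second (assumed)
N_SAMPLES = 500         # samples per read (assumed)

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")           # assumed channel name
    task.timing.cfg_samp_clk_timing(SAMPLE_RATE,
                                    sample_mode=AcquisitionType.FINITE,
                                    samps_per_chan=N_SAMPLES)
    voltages = task.read(number_of_samples_per_channel=N_SAMPLES)

print(f"read {len(voltages)} samples, first value: {voltages[0]:.4f} V")
```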
1826 Formex Algebra Adaptation into Parametric Design Tools: Dome Structures
Authors: Réka Sárközi, Péter Iványi, Attila B. Széll
Abstract:
The aim of this paper is to present the adaptation of the formex algebra dome construction tool to the parametric design software Grasshopper. Formex algebra is a mathematical system, primarily used for planning structural systems such as truss-grid domes and vaults, together with the programming language Formian. The goal of the research is to allow architects to plan truss-grid structures easily with parametric design tools based on the versatile formex algebra mathematical system. To produce regular structures, coordinate system transformations are used, and the dome structures are defined in a spherical coordinate system. Owing to the abilities of the parametric design software, it is possible to apply further modifications to the structures and obtain special forms. The paper covers the basic dome types, as well as additional dome-based structures using special coordinate-system solutions based on spherical coordinate systems. It also contains additional structural possibilities such as making double-layer grids in all geometric forms. The adaptation of formex algebra and the parametric workflow of Grasshopper together give the possibility of quick and easy design and optimization of special truss-grid domes.
Keywords: Parametric design, structural morphology, space structures, spherical coordinate system.
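A minimal sketch of the spherical-coordinate definition of a dome grid described above, generating node coordinates in Python rather than in Formian or Grasshopper; the radius, sweep angle and grid density are arbitrary illustration values.
```python
# Generate the node coordinates of a single-layer truss-grid dome by sampling a
# sphere in spherical coordinates and converting to Cartesian coordinates.
import numpy as np

def dome_nodes(radius=10.0, n_rings=6, n_meridians=16, max_polar_deg=80.0):
    nodes = [(0.0, 0.0, radius)]                       # apex of the dome
    polar = np.radians(np.linspace(max_polar_deg / n_rings, max_polar_deg, n_rings))
    azimuth = np.radians(np.linspace(0.0, 360.0, n_meridians, endpoint=False))
    for phi in polar:                                  # rings from the apex downwards
        for theta in azimuth:
            x = radius * np.sin(phi) * np.cos(theta)
            y = radius * np.sin(phi) * np.sin(theta)
            z = radius * np.cos(phi)
            nodes.append((x, y, z))
    return np.array(nodes)

nodes = dome_nodes()
print(nodes.shape)        # (1 + 6*16, 3) = (97, 3)
```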
1825 Definition of Cognitive Infocommunications and an Architectural Implementation of Cognitive Infocommunications Systems
Authors: Peter Baranyi, Gyorgy Persa, Adam Csapo
Abstract:
Cognitive Infocommunications (CogInfoCom) is a new research direction which has emerged as the synergic convergence of infocommunications and the cognitive sciences. In this paper, we provide the definition of CogInfoCom, and propose an architectural framework for the interaction-oriented design of CogInfoCom systems. We provide the outlines of an application example of the interaction-oriented architecture, and briefly discuss its main characteristics.
Keywords: Cognitive infocommunications, CogInfoCom, Cognitive Infocommunication Channels, CogInfoCom channels.
1824 Automation of the Maritime UAV Command, Control, Navigation Operations, Simulated in Real-Time Using Kinect Sensor: A Feasibility Study
Authors: Regius Asiimwe, Amir Anvar
Abstract:
This paper describes the process used in the automation of maritime UAV commands using the Kinect sensor. The AR Drone is a quadrocopter manufactured by Parrot [1] designed to be controlled using Apple operating systems such as iPhones and iPads. However, this project uses the Microsoft Kinect SDK and Microsoft Visual Studio C# (C sharp) software, which are compatible with the Windows operating system, for the automation of the navigation and control of the AR Drone. The navigation and control software for the quadrocopter runs on a Windows 7 computer. The project is divided into two sections: the quadrocopter control system and the Kinect sensor control system. The Kinect sensor is connected to the computer using a USB cable, through which commands can be sent to and from the Kinect sensor. The AR Drone has Wi-Fi capabilities through which it can be connected to the computer to enable the transfer of commands to and from the quadrocopter. The project was implemented in C#, a programming language that is commonly used in automation systems. The language was chosen because more libraries are already established in C# for both the AR Drone and the Kinect sensor. The study will contribute towards research in the automation of systems using the quadrocopter and the Kinect sensor for navigation involving a human operator in the loop. The prototype created has numerous applications, including the inspection of vessels such as ships and airplanes and areas that are not accessible by human operators.
Keywords: UAV, AR Drone, Kinect sensors, automation, real time, C sharp, Microsoft Kinect SDK.
1823 Application of Systems Engineering Tools and Methods to Improve Healthcare Delivery Inside the Emergency Department of a Mid-Size Hospital
Authors: Mohamed Elshal, Hazim El-Mounayri, Omar El-Mounayri
Abstract:
The emergency department (ED) can be considered a complex system of interacting entities: patients, human resources, software and hardware systems, interfaces, and other systems. This paper presents research on implementing a detailed Systems Engineering (SE) approach in a mid-size hospital in central Indiana. The methodology will be applied by "The Initiative for Product Lifecycle Innovation (IPLI)" institution at Indiana University to study and solve the crowding problem, with the aim of increasing patient throughput and enhancing the treatment experience; therefore, the nature of the crowding problem needs to be investigated together with all the other problems that lead to it. The SE methods presented are workflow analysis and systems modeling, where SE tools such as Microsoft Visio are used to construct a group of system-level diagrams that demonstrate: the patients' workflow, documentation and communication flow, data systems, human resources workflow and requirements, the leadership involved, and the integration between the ED's different systems. Finally, the ultimate goal will be managing the process through the implementation of an executable model using commercial software tools, which will identify bottlenecks, improve documentation flow, and help make the process faster.
Keywords: Systems modeling, ED operation, workflow modeling, systems analysis.
1822 Impact of the Non-Energy Sectors Diversification on the Energy Dependency Mitigation: Visualization by the “IntelSymb” Software Application
Authors: Ilaha Rzayeva, Emin Alasgarov, Orkhan Karim-Zada
Abstract:
This study attempts to consider the linkage between management and computer sciences in order to develop software named "IntelSymb" as a demo application to support data analysis of non-energy fields' diversification, which positively influences the mitigation of countries' energy dependency. We then analyzed 18 years of economic development data across five sectors for 13 countries, identifying which patterns mostly prevailed and which may become dominant in the near future. To make our analysis solid and plausible, as future work we suggest developing a gateway or interface that will connect to all available online databases (WB, UN, OECD, U.S. EIA) for country-level analysis by field. The sample data consist of energy statistics (TPES and energy import indicators) and non-energy industry statistics (Main Science and Technology Indicators, Internet user index, and sales and production indicators) from 13 OECD countries over 18 years (1995-2012). Our results show that the diversification of non-energy industries can have a positive effect on decelerating energy sector dependency (energy consumption and import dependence on crude oil). These results can provide empirical and practical support for energy and non-energy industry diversification policies, such as promoting the efficiency and management of Information and Communication Technologies (ICTs), services and innovative technologies, in other OECD and non-OECD member states with similar energy utilization patterns and policies. Industries, including the ICT sector, generate around 4 percent of total GHG emissions, but this figure is much higher — around 14 percent — if indirect energy use is included. The ICT sector itself (excluding the broadcasting sector) contributes approximately 2 percent of global GHG emissions, at just under 1 gigatonne of carbon dioxide equivalent (GtCO2eq). Hence, this can be a good example and lesson for countries that are energy-dependent or energy-independent, and mainly for emerging oil-based economies, as well as motivation for non-energy industry diversification in order to be prepared for an energy crisis and able to face any economic crisis.
Keywords: Energy policy, energy diversification, "IntelSymb" software, renewable energy.
1821 Development of an Automatic Calibration Framework for Hydrologic Modelling Using Approximate Bayesian Computation
Authors: A. Chowdhury, P. Egodawatta, J. M. McGree, A. Goonetilleke
Abstract:
Hydrologic models are increasingly used as tools to predict stormwater quantity and quality from urban catchments. However, due to a range of practical issues, most models produce gross errors in simulating complex hydraulic and hydrologic systems, and the difficulty of finding a robust approach for model calibration is one of the main issues. Though automatic calibration techniques are available, they are rarely used in common commercial hydraulic and hydrologic modelling software, e.g. MIKE URBAN. This is partly due to the need for a large number of parameters and large datasets in the calibration process. To overcome this practical issue, a framework for the automatic calibration of a hydrologic model was developed on the R platform and is presented in this paper. The model was developed based on the time-area conceptualization. Four calibration parameters, namely initial loss, reduction factor, time of concentration and time-lag, were considered as the primary set of parameters. Using these parameters, automatic calibration was performed using Approximate Bayesian Computation (ABC). ABC is a simulation-based technique for performing Bayesian inference when the likelihood is intractable or computationally expensive to compute. To test its performance and usefulness, the technique was used to simulate three small catchments on the Gold Coast. For comparison, simulation outcomes for the same three catchments from the commercial modelling software MIKE URBAN were used. The graphical comparison shows strong agreement of the MIKE URBAN results with the upper and lower 95% credible intervals of the posterior predictions obtained via ABC. Statistical validation of the posterior runoff predictions using the coefficient of determination (CD), root mean square error (RMSE) and maximum error (ME) was found to be reasonable for the three study catchments. The main benefit of using ABC over MIKE URBAN is that ABC provides a posterior distribution for the runoff flow prediction, so the associated uncertainty in the predictions can be obtained; in contrast, MIKE URBAN provides only a point estimate. Based on the results of the analysis, it appears that the developed ABC framework performs well for automatic calibration.
Keywords: Automatic calibration framework, approximate Bayesian computation, hydrologic and hydraulic modelling, MIKE URBAN software, R platform.
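A minimal sketch of ABC rejection sampling for calibrating a toy runoff model, written in Python rather than on the R platform used in the study. The simulator below is a deliberately simplified stand-in for the time-area model (only two of the four calibration parameters appear), and the rainfall, "observations", priors and tolerance are illustrative assumptions.
```python
# ABC rejection sampling: draw parameters from a prior, simulate runoff, and keep
# the draws whose simulated hydrograph lies close enough to the observations.
import numpy as np

rng = np.random.default_rng(3)
rain = np.array([0.0, 5.0, 12.0, 8.0, 3.0, 0.0, 0.0])   # mm per time step (assumed)

def simulate_runoff(initial_loss, reduction_factor):
    """Very simplified runoff model: subtract an initial loss, scale the remainder."""
    effective = np.clip(rain - initial_loss, 0.0, None)
    return reduction_factor * effective

# Synthetic "observed" hydrograph generated with known parameters plus noise.
observed = simulate_runoff(2.0, 0.6) + rng.normal(0.0, 0.2, rain.size)

accepted = []
tolerance = 1.0                       # acceptance threshold on the RMSE distance
for _ in range(20000):
    il = rng.uniform(0.0, 5.0)        # prior on initial loss (mm)
    rf = rng.uniform(0.1, 1.0)        # prior on reduction factor (-)
    distance = np.sqrt(np.mean((simulate_runoff(il, rf) - observed) ** 2))
    if distance < tolerance:
        accepted.append((il, rf))

accepted = np.array(accepted)
print("accepted draws:", len(accepted))
print("posterior means (initial loss, reduction factor):", accepted.mean(axis=0))
```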
1820 Design and Application of NFC-Based Identity and Access Management in Cloud Services
Authors: Shin-Jer Yang, Kai-Tai Yang
Abstract:
In response to a changing world and the fast growth of the Internet, more and more enterprises are replacing web-based services with cloud-based ones. Multi-tenancy technology is becoming more important, especially with Software as a Service (SaaS). This in turn leads to a greater focus on the application of Identity and Access Management (IAM). Conventional Near-Field Communication (NFC) based verification relies on a computer browser and a card reader to access an NFC tag. This type of verification does not support mobile device login or user-based access management functions. This study designs an NFC-based third-party cloud identity and access management scheme (NFC-IAM) addressing this shortcoming. Data from simulation tests analyzed with Key Performance Indicators (KPIs) suggest that NFC-IAM not only takes less time for identity identification but also cuts the time required for two-factor authentication by 80% and improves verification accuracy to 99.9% or better. In the functional performance analyses, NFC-IAM performed better in scalability and portability. The NFC-IAM app (application software) and back-end system to be developed and deployed on mobile devices will support IAM features and also offer users a more user-friendly experience and stronger security protection. In the future, our NFC-IAM can be employed in different environments, including identification for mobile payment systems and permission management for remote equipment monitoring, among other applications.
Keywords: Cloud service, multi-tenancy, NFC, IAM, mobile device.
1819 Computer Software Applicable in Rehabilitation, Cardiology and Molecular Biology
Authors: P. Kowalska, P. Gabka, K. Kamieniarz, M. Kamieniarz, W. Stryla, P. Guzik, T. Krauze
Abstract:
We have developed a computer program consisting of 6 subtests assessing children's hand dexterity, applicable in rehabilitation medicine. We carried out a normative study on a representative sample of 285 children aged from 7 to 15 (mean age 11.3) and we have proposed clinical standards for three age groups (7-9, 9-11, 12-15 years). We have shown the statistical significance of differences among the corresponding mean values of task completion time. We have also found a strong correlation between task completion time and the age of the subjects, and we performed test-retest reliability checks in a sample of 84 children, giving high values of the Pearson coefficients for the dominant and non-dominant hand, in the range above 0.74.
Keywords: Biomedical database processing, computer software, hand dexterity, heart rate and blood pressure variability.
1818 Signalling Cost Analysis of PDE-NEMO
Authors: Kamarularifin Abd Jalil, John Dunlop
Abstract:
A Personal Distributed Environment (PDE) is an example of an IP-based system architecture designed for future mobile communications. In a single PDE, there exist several sub-networks hosting devices located across the infrastructure, which inter-work with one another through the coordination of a Device Management Entity (DME). Some of these sub-networks are fixed and some are mobile. In order to support mobile sub-network mobility in the PDE, the PDE-NEMO protocol was proposed. This paper discusses the signalling cost analysis of PDE-NEMO by use of a detailed simulation model. The paper starts with an introduction to the protocol, followed by the experiments and results, and then by the discussion.
Keywords: Mobile network, PDE-NEMO, signalling cost.
1817 Wireless Backhauling for 5G Small Cell Networks
Authors: Abdullah A. Al Orainy
Abstract:
Small cell backhaul solutions need to be cost-effective, scalable, and easy to install. This paper presents an overview of small cell backhaul technologies. Wireless solutions, including TV white space, satellite, sub-6 GHz radio wave, microwave and mmWave, are discussed together with their backhaul characteristics. Recent research on issues like beamforming, backhaul architecture, precoding and large antenna arrays, and energy efficiency for dense small cell backhaul with mmWave communications is reviewed. Recent trials of 5G technologies are summarized.
Keywords: Backhaul, small cells, wireless, 5G.
1816 Grading Fourteen Zones of Isfahan in Terms of the Impact of Globalization on the Urban Fabric of the City, Using the TOPSIS Model
Authors: A. Zahedi Yeganeh, A. Khademolhosseini, R. Mokhtari Malekabadi
Abstract:
Undoubtedly one of the most far-reaching and controversial topics considered in the past few decades has been globalization, which lies at the essence of modern culture. It is a complex and rapidly expanding network of links and mutual interdependence that is an aspect of modern life, though some argue that this link has existed since the beginning of human history. If we consider globalization as a dynamic social process in which the geographical constraints governing political, economic, social and cultural relationships have been undermined, it might not be possible to simply describe its impact on the urban fabric. But since this phenomenon concerns the increase in communication between societies (while preserving their main cultural-regional characteristics) and the increased possibility of influencing other societies, the need for further studies is felt. The main objective of this study is to grade the city zones based on selected globalization factors affecting the urban fabric, applying the TOPSIS model. The research method is descriptive-analytical and survey-based. For data analysis, the TOPSIS model and SPSS software were used, and the results for the fourteen zones are shown on a map using GIS software. The results show that the process of the urban fabric being influenced by globalization was not similar across the fourteen zones of Isfahan, and there have been large differences in this respect between city zones; the most affected areas are zones 5, 6 and 9 of the municipality, and the least impact has been on zones 4, 3 and 2.
Keywords: Grading, Globalization, Urban fabric, 14 zones of Isfahan, TOPSIS model.
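A minimal sketch of the TOPSIS ranking used above, with a small invented decision matrix (three zones, three benefit-type criteria) standing in for the study's real indicators and weights.
```python
# TOPSIS: normalize the decision matrix, weight it, measure each alternative's
# distance to the ideal and anti-ideal solutions, and rank by relative closeness.
import numpy as np

X = np.array([[7.0, 0.55, 120.0],      # zone A (hypothetical indicator values)
              [5.0, 0.80, 200.0],      # zone B
              [9.0, 0.40,  90.0]])     # zone C
weights = np.array([0.5, 0.3, 0.2])    # assumed criterion weights (all benefit criteria)

norm = X / np.sqrt((X ** 2).sum(axis=0))        # vector normalization
V = norm * weights                              # weighted normalized matrix

ideal = V.max(axis=0)                           # best value per (benefit) criterion
anti_ideal = V.min(axis=0)                      # worst value per criterion
d_best = np.sqrt(((V - ideal) ** 2).sum(axis=1))
d_worst = np.sqrt(((V - anti_ideal) ** 2).sum(axis=1))

closeness = d_worst / (d_best + d_worst)        # 1 = closest to the ideal solution
ranking = np.argsort(-closeness)
print("closeness:", np.round(closeness, 3), "ranking (best first):", ranking)
```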
1815 Comparison of different Channel Modeling Techniques used in the BPLC Systems
Authors: Justinian Anatory, Nelson Theethayi
Abstract:
This paper compares different channel models used for modelling Broadband Power-Line Communication (BPLC) systems. The models compared are those of Zimmermann and Dostert, Philipps, Anatory et al., and the Anatory et al. generalized Transmission Line (TL) model. The validity of each model was compared in the time domain with the ATP-EMTP software, which uses a transmission line approach. It is found that, for a power-line network with a minimum number of branches, all the models give similar signal/pulse time responses compared with the ATP-EMTP software; however, the Zimmermann and Dostert model indicates the same amplitude but a different time delay. It is observed that when the number of branches is increased, only the generalized TL theory approach gives results comparable with the ATP-EMTP results. The Multi-Carrier Spread Spectrum (MC-SS) system was also applied to check the implication of such behaviour on modulation schemes. It is observed that using the Philipps model on an underground cable can predict the performance up to 25 dB better than the other channel models, which can misrepresent the actual performance of the system. Also, the modified Zimmermann and Dostert model under multipath can predict a performance about 5 dB better than that actually predicted by the generalized TL theory. It is therefore suggested that, for realistic BPLC system design and analysis, the model based on the generalized TL theory be used.
Keywords: Broadband power-line channel models, load impedance, branched network.
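For reference, the multipath Zimmermann and Dostert model mentioned above expresses the channel transfer function as a sum of weighted, attenuated, delayed paths, H(f) = Σ g_i · exp[−(a0 + a1·f^k)·d_i] · exp(−j2πf·d_i/v_p). The sketch below evaluates this expression for a hypothetical two-path network; the path weights, lengths and attenuation parameters are illustrative values, not data from the paper.
```python
# Evaluate the Zimmermann-Dostert multipath power-line channel transfer function
# over a frequency band for a hypothetical two-path network.
import numpy as np

f = np.linspace(1e6, 30e6, 1000)           # 1-30 MHz band
g = np.array([0.64, 0.38])                 # path weighting factors (illustrative)
d = np.array([200.0, 250.0])               # path lengths in metres (illustrative)
a0, a1, k = 0.0, 7.8e-10, 1.0              # attenuation parameters (illustrative)
v_p = 1.5e8                                # propagation velocity, m/s (illustrative)

H = np.zeros_like(f, dtype=complex)
for gi, di in zip(g, d):
    attenuation = np.exp(-(a0 + a1 * f**k) * di)
    delay = np.exp(-1j * 2 * np.pi * f * di / v_p)
    H += gi * attenuation * delay

print("|H| at 10 MHz:", float(np.abs(H[np.argmin(np.abs(f - 10e6))])))
```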