Search results for: Cross-Correlation Methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4011

2841 Integrated Subset Split for Balancing Network Utilization and Quality of Routing

Authors: S. V. Kasmir Raja, P. Herbert Raj

Abstract:

The overlay approach has been widely used by many service providers for Traffic Engineering (TE) in large Internet backbones. In the overlay approach, logical connections are set up between edge nodes to form a full-mesh virtual network on top of the physical topology, and IP routing is then run over the virtual network. Traffic engineering objectives are achieved by carefully routing logical connections over the physical links. Although the overlay approach has been implemented in many operational networks, it has a number of well-known scaling issues. This paper proposes a new approach to achieve traffic engineering without full-mesh overlaying, using an integrated approach together with an equal subset split method. Traffic engineering must determine the optimal routing of traffic over the existing network infrastructure by efficiently allocating resources in order to optimize traffic performance on an IP network. Although constraint-based routing [1] of Multi-Protocol Label Switching (MPLS) was developed to address this need, it is not yet widely tested or debugged, so Internet Service Providers (ISPs) resort to TE methods under Open Shortest Path First (OSPF), the most commonly used intra-domain routing protocol. Determining OSPF link weights for optimal network performance is an NP-hard problem. Since this problem cannot be solved exactly, we present a subset split method that improves efficiency and performance by minimizing the maximum link utilization in the network via a small number of link weight modifications. The results of this method are compared against those of the MPLS architecture [9] and other heuristic methods.
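
For illustration, below is a minimal local-search sketch in Python of the general idea behind OSPF weight tuning: perturb a few link weights, reroute shortest-path traffic, and keep a change only if the maximum link utilization drops. The topology, demands, parameters and helper names are illustrative assumptions; this is a sketch of the general technique, not the paper's integrated equal-subset-split procedure.

    # Minimal sketch: tune a small number of OSPF link weights to reduce the
    # maximum link utilization of shortest-path routed demands (illustrative only).
    import random
    import networkx as nx

    def max_utilization(G, demands):
        """Route each demand on its current shortest path and return max load/capacity."""
        load = {e: 0.0 for e in G.edges()}
        for (s, t), volume in demands.items():
            path = nx.shortest_path(G, s, t, weight="weight")
            for u, v in zip(path, path[1:]):
                e = (u, v) if (u, v) in load else (v, u)
                load[e] += volume
        return max(load[e] / G.edges[e]["capacity"] for e in load)

    def tune_weights(G, demands, max_changes=5, trials=200, seed=0):
        """Greedy local search: keep a weight change only if max utilization drops."""
        rng = random.Random(seed)
        best = max_utilization(G, demands)
        changes = 0
        for _ in range(trials):
            if changes >= max_changes:            # keep the number of modifications small
                break
            u, v = rng.choice(list(G.edges()))
            old = G.edges[u, v]["weight"]
            G.edges[u, v]["weight"] = max(1, old + rng.choice([-1, 1, 2]))
            new = max_utilization(G, demands)
            if new < best:
                best, changes = new, changes + 1
            else:
                G.edges[u, v]["weight"] = old     # revert non-improving change
        return best

    # Tiny illustrative example: a 4-node ring with unit capacities and two demands.
    G = nx.Graph()
    for u, v in [(0, 1), (1, 2), (2, 3), (3, 0)]:
        G.add_edge(u, v, weight=1, capacity=1.0)
    demands = {(0, 2): 0.7, (1, 3): 0.5}
    print("max utilization after tuning:", round(tune_weights(G, demands), 3))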

Keywords: Constraint-based routing, link utilization, subset split method, traffic engineering.

2840 Statistical Texture Analysis

Authors: G. N. Srinivasan, G. Shobha

Abstract:

This paper presents an overview of the methodologies and algorithms for statistical texture analysis of 2D images. Methods for digital-image texture analysis are reviewed based on available literature and research work either carried out or supervised by the authors.
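
As a small illustration of the statistical approaches the survey covers, the sketch below computes first-order (histogram-based) texture descriptors of a grayscale image in Python. The synthetic patch and feature names are illustrative; this is not a method specific to the paper.

    # Minimal sketch of first-order statistical texture descriptors from the
    # gray-level histogram of a 2D image (illustrative only).
    import numpy as np

    def first_order_texture_features(image, levels=256):
        """Return mean, variance, skewness and entropy of the gray-level histogram."""
        hist, _ = np.histogram(image, bins=levels, range=(0, levels))
        p = hist / hist.sum()                       # normalized histogram
        g = np.arange(levels)
        mean = np.sum(g * p)
        var = np.sum((g - mean) ** 2 * p)
        skew = np.sum((g - mean) ** 3 * p) / (var ** 1.5 + 1e-12)
        entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
        return {"mean": mean, "variance": var, "skewness": skew, "entropy": entropy}

    # Example on a synthetic 64x64 texture patch:
    patch = np.random.default_rng(0).integers(0, 256, size=(64, 64)).astype(np.uint8)
    print(first_order_texture_features(patch))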

Keywords: Image texture, texture analysis, statistical approaches, structural approaches, spectral approaches, morphological approaches, fractals, Fourier transforms, Gabor filters, wavelet transforms.

2839 A Hybrid Mesh Free Local RBF- Cartesian FD Scheme for Incompressible Flow around Solid Bodies

Authors: A. Javed, K. Djidjeli, J. T. Xing, S. J. Cox

Abstract:

A method for simulating flow around solid bodies is presented using hybrid meshfree and mesh-based schemes. The presented scheme optimizes the computational efficiency by combining the advantages of both meshfree and mesh-based methods. In this approach, a cloud of meshfree nodes is used in the domain around the solid body. These meshfree nodes have the ability to adapt efficiently to complex geometrical shapes. In the rest of the domain, beyond the meshfree cloud, a conventional Cartesian grid is used. Complex geometrical shapes can therefore be dealt with efficiently by using the meshfree nodal cloud, and computational efficiency is maintained through the use of a conventional mesh-based scheme on the Cartesian grid in the larger part of the domain. Spatial discretization of the meshfree nodes is achieved through local radial basis functions in finite difference mode (RBF-FD). A conventional finite difference scheme is used in the Cartesian ‘meshed’ domain. Accuracy tests of the hybrid scheme have been conducted to establish the order of accuracy. Numerical tests have been performed by simulating two-dimensional steady and unsteady incompressible flows around a cylindrical object. Steady flow cases have been run at Reynolds numbers of 10, 20 and 40, and unsteady flow problems have been studied at Reynolds numbers of 100 and 200. Flow parameters including lift, drag, vortex shedding, and vorticity contours are calculated. Numerical results have been found to be in good agreement with computational and experimental results available in the literature.
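
To illustrate the meshfree ingredient, the sketch below computes local RBF-FD differentiation weights for the 2D Laplacian at one node from its neighbour cloud. The multiquadric basis, shape parameter and node layout are assumed choices, without polynomial augmentation; it is a sketch of the general RBF-FD idea, not the authors' full hybrid solver.

    # Minimal sketch of local RBF-FD weights for the 2D Laplacian at a node,
    # using a multiquadric basis over its neighbours (illustrative only).
    import numpy as np

    def mq(r, c):                        # multiquadric RBF
        return np.sqrt(r**2 + c**2)

    def lap_mq(r, c):                    # Laplacian of the multiquadric in 2D
        return (r**2 + 2.0 * c**2) / (r**2 + c**2) ** 1.5

    def rbf_fd_laplacian_weights(center, neighbours, c=0.1):
        """Weights w so that sum_j w_j u(x_j) approximates Laplacian(u) at 'center'."""
        pts = np.vstack([center, neighbours])
        d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        A = mq(d, c)                                         # RBF interpolation matrix
        b = lap_mq(np.linalg.norm(pts - center, axis=1), c)  # exactness on each basis function
        return np.linalg.solve(A, b)

    # Example: symmetric 5-point cloud around the origin with spacing h
    h = 0.1
    cloud = np.array([[h, 0.0], [-h, 0.0], [0.0, h], [0.0, -h]])
    w = rbf_fd_laplacian_weights(np.array([0.0, 0.0]), cloud, c=0.5 * h)
    print(w)   # compare with the classical 5-point stencil [-4, 1, 1, 1, 1] / h**2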

Keywords: CFD, meshfree particle methods, hybrid grid, incompressible Navier-Stokes equations, RBF-FD.

2838 Competitors’ Influence Analysis of a Retailer by Using Customer Value and Huff’s Gravity Model

Authors: Yepeng Cheng, Yasuhiko Morimoto

Abstract:

Customer relationship analysis is vital for retail stores, especially for supermarkets. Point of sale (POS) systems make it possible to record the daily purchasing behavior of customers in an identification point of sale (ID-POS) database, which can be used to analyze the customer behavior of a supermarket. Customer value is an indicator based on the ID-POS database for detecting customer loyalty to a store. In general, there are many supermarkets in a city, and nearby competitor supermarkets significantly affect the customer value of a supermarket's customers. However, it is impossible to obtain detailed ID-POS databases of competitor supermarkets. This study first focused on customer value and the distance between a customer's home and the supermarkets in a city, and then constructed models based on logistic regression analysis to analyze correlations between distance and purchasing behavior using only the POS database of a supermarket chain. During the modeling process, three primary problems arose: the incomparability of customer values, multicollinearity between the customer value and distance data, and the number of valid partial regression coefficients. The improved customer value, Huff’s gravity model, and the inverse attractiveness frequency are considered to solve these problems. This paper presents three types of models based on these three methods for loyal customer classification and competitors’ influence analysis. In numerical experiments, all types of models are useful for loyal customer classification. The model that includes all three methods is the best one for evaluating the influence of other nearby supermarkets on customers' purchasing at a supermarket chain, from the viewpoint of valid partial regression coefficients and accuracy.
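
For reference, Huff's gravity model assigns each customer a shopping probability per store from store attractiveness and home-to-store distance. The sketch below shows the standard formula; the attractiveness values, distances and exponents are hypothetical, and the study's improved customer value and inverse attractiveness frequency are not reproduced here.

    # Minimal sketch of Huff's gravity model: the probability that customer i
    # shops at store j, from attractiveness A_j and distance d_ij (illustrative data).
    import numpy as np

    def huff_probabilities(attractiveness, distances, alpha=1.0, beta=2.0):
        """P(i -> j) = A_j**alpha / d_ij**beta, normalized over all stores j."""
        utility = (attractiveness[None, :] ** alpha) / (distances ** beta)
        return utility / utility.sum(axis=1, keepdims=True)

    # Three stores, two customers (distances in km):
    A = np.array([1500.0, 800.0, 1200.0])           # store attractiveness (e.g., floor area)
    d = np.array([[1.2, 0.8, 2.5],                   # customer 1 to each store
                  [3.0, 1.5, 0.6]])                  # customer 2 to each store
    print(huff_probabilities(A, d))                  # each row sums to 1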

Keywords: Customer value, Huff's Gravity Model, POS, retailer.

2837 Communication Engineering Curriculum (Past, Present and the Future)

Authors: Abdurazzag Ali Aburas, Indira Rustempasic, Indira Muhic, Busra Gheith Yildiz

Abstract:

At present, competition and unpredictable fluctuations have made communication engineering education difficult worldwide. Confronted with this new situation in the engineering education sector, communication engineering education has to be reformed and be ready to use more advanced technologies. We realized that one of the general problems of students' education is that, after graduating from their universities, they are not prepared to face real-life challenges and are not fully skilled to work in industry. They are prepared only to think like engineers and professionals, but they also need to possess other, non-technical skills. In today's environment, technical competence alone is not sufficient for career success. Employers want graduate engineers with good oral and written communication (soft) skills, as well as teamwork, business awareness, organization and management skills, responsibility, initiative, problem solving and IT competency. The proposed curriculum brings interactive, creative, interesting and effective learning methods, including online education, virtual labs, practical work, problem-based learning (PBL), and lectures given by industry experts. Through short assignments, presentations, reports, research papers and projects, students can significantly improve their non-technical skills. We also noted the importance of ICT technologies used by students and teachers in engineering education, and included them in the proposed teaching and learning methods. We added collaborative learning between students through team work, which builds their skills beyond the course materials. The prospect of this research is that we intend to update the communication engineering curriculum in order to produce well-rounded engineering graduates who are ready for real industry work.

Keywords: Communication engineering, curriculum education, ICT, industry.

2836 Predictive Semi-Empirical NOx Model for Diesel Engine

Authors: Saurabh Sharma, Yong Sun, Bruce Vernham

Abstract:

Accurate prediction of NOx emission is a continuous challenge in the field of diesel engine-out emission modeling. Performing experiments for every condition and scenario costs a significant amount of money and man-hours; therefore, a model-based development strategy has been implemented in order to solve that issue. NOx formation is highly dependent on the burned gas temperature and the O2 concentration inside the cylinder. Current empirical models are developed by calibrating parameters representing the engine operating conditions with respect to the measured NOx. This limits the prediction of purely empirical models to the region where they have been calibrated. An alternative solution is presented in this paper, which focuses on the utilization of in-cylinder combustion parameters to form a predictive semi-empirical NOx model. The result of this work is a fast and predictive NOx model built from physical parameters and empirical correlations. The model is developed based on steady-state data collected over the entire operating region of the engine and on a predictive combustion model developed in Gamma Technologies (GT)-Power using the Direct Injection (DI)-Pulse combustion object. In this approach, the temperature in both the burned and unburned zones is considered during the combustion period, i.e., from Intake Valve Closing (IVC) to Exhaust Valve Opening (EVO). The oxygen concentration consumed in the burned zone and the trapped fuel mass are also considered in the reported model. Several statistical methods are used to construct the model, including individual machine learning methods and ensemble machine learning methods. A detailed validation of the model on multiple diesel engines is reported in this work. A substantial number of cases are tested for different engine configurations over a large span of speed and load points. Different sweeps of operating conditions, such as Exhaust Gas Recirculation (EGR), injection timing and Variable Valve Timing (VVT), are also considered for the validation. The model shows very good predictability and robustness at both sea-level and altitude conditions with different ambient conditions. Its advantages, such as high accuracy and robustness at different operating conditions, low computational time and the low number of data points required for calibration, establish a platform where the model-based approach can be used for the engine calibration and development process. Moreover, the focus of this work is towards establishing a framework for future model development for other targets such as soot, Combustion Noise Level (CNL), NO2/NOx ratio, etc.
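
As a generic illustration of a semi-empirical NOx correlation driven by burned-zone temperature and O2 concentration, the sketch below fits an Arrhenius-type form by least squares in log space on synthetic data. The functional form, variable names and numbers are assumptions for illustration only, not the paper's model or its GT-Power inputs.

    # Minimal sketch: fit NOx ~ A * exp(-Ta / T_burn) * O2**b on synthetic data
    # by log-linear least squares (illustrative functional form only).
    import numpy as np

    rng = np.random.default_rng(1)
    T_burn = rng.uniform(1800.0, 2600.0, 200)       # burned-zone temperature [K]
    o2 = rng.uniform(0.05, 0.20, 200)               # in-cylinder O2 mole fraction
    true = 5.0e6 * np.exp(-20000.0 / T_burn) * o2**1.2
    nox = true * rng.lognormal(0.0, 0.05, 200)      # "measured" NOx with noise

    # log(NOx) = log(A) - Ta * (1/T_burn) + b * log(O2)
    X = np.column_stack([np.ones_like(T_burn), 1.0 / T_burn, np.log(o2)])
    coef, *_ = np.linalg.lstsq(X, np.log(nox), rcond=None)
    A, Ta, b = np.exp(coef[0]), -coef[1], coef[2]
    print(f"A={A:.3g}, Ta={Ta:.0f} K, b={b:.2f}")   # recovers roughly 5e6, 20000, 1.2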

Keywords: Diesel engine, machine learning, NOx emission, semi-empirical.

2835 Design, Manufacture and Test of a Solar Powered Audible Bird Scarer

Authors: Turhan Koyuncu, Fuat Lule

Abstract:

The most common domestic birds living in Turkey are crows (Corvus corone), pigeons (Columba livia), sparrows (Passer domesticus), starlings (Sturnus vulgaris) and blackbirds (Turdus merula). These birds damage agricultural areas and soil human living areas. In order to drive these birds away, different materials and methods such as chemicals, treatments, colored lights, flash and audible scarers are used. Many studies on chemical methods can be found in the literature; however, not enough work on audible bird scarers has been reported. Therefore, a solar powered bird scarer was designed, manufactured and tested in this experimental investigation. Firstly, to understand the sensitivity of these domestic birds to the audible scarer, several series of preliminary studies were conducted. These studies showed that crows are the most resistant to the audible bird scarer when compared with pigeons, sparrows, starlings and blackbirds. Therefore, the solar powered audible bird scarer was tested on crows. The scarer was tested for about one month during April-May 2007. 18 different sounds (voices or calls) of well-known predators of domestic birds, from Falcon (Falco eleonorae), Falcon (Buteo lagopus), Eagle (Aquila chrysaetos), Montagu's harrier (Circus pygargus) and Owl (Glaucidium passerinum), were selected for the test of the scarer. The results showed that the reaction of the birds changed depending on the predator sound type, the camouflage of the scarer, the sound quality and volume, and the loudspeaker play and pause periods in one application. In addition, the sound from Falcon (Buteo lagopus) was the most effective on crows, and the scarer was sufficiently efficient.

Keywords: Bird damage, audible scarer, solar powered scarer, predator sound.

2834 Robust Adaptive ELS-QR Algorithm for Linear Discrete Time Stochastic Systems Identification

Authors: Ginalber L. O. Serra

Abstract:

This work proposes a recursive weighted ELS algorithm for system identification by applying numerically robust orthogonal Householder transformations. The properties of the proposed algorithm show that it obtains acceptable results in a noisy environment: fast convergence and asymptotically unbiased estimates. A comparative analysis with other robust methods well known from the literature is also presented.
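
To illustrate the numerically robust building block involved, the sketch below solves a weighted least-squares identification problem through a QR (Householder) factorization instead of forming normal equations. The ARX example data and weights are hypothetical; this is not the full recursive weighted ELS-QR algorithm of the paper.

    # Minimal sketch: weighted least-squares parameter estimation via QR
    # (Householder) factorization, on a simulated first-order ARX system.
    import numpy as np

    def weighted_qr_estimate(Phi, y, weights):
        """Estimate theta minimizing || W^(1/2) (y - Phi @ theta) ||_2 via QR."""
        w = np.sqrt(weights)[:, None]
        Q, R = np.linalg.qr(w * Phi)               # Householder-based QR in LAPACK
        return np.linalg.solve(R, Q.T @ (w[:, 0] * y))

    # Identify y[k] = a*y[k-1] + b*u[k-1] + noise from simulated data:
    rng = np.random.default_rng(0)
    u = rng.standard_normal(500)
    y = np.zeros(500)
    for k in range(1, 500):
        y[k] = 0.8 * y[k - 1] + 0.5 * u[k - 1] + 0.05 * rng.standard_normal()

    Phi = np.column_stack([y[:-1], u[:-1]])        # regressor matrix
    theta = weighted_qr_estimate(Phi, y[1:], np.ones(len(y) - 1))
    print(theta)                                    # close to [0.8, 0.5]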

Keywords: Stochastic Systems, Robust Identification, Parameter Estimation, Systems Identification.

2833 Optimization by Hybrid Ant Colony for the Bin-Packing Problem

Authors: Ben Mohamed Ahemed Mohamed, Yassine Adnan

Abstract:

The problem of bin-packing in two dimensions (2BP) consists in placing a given set of rectangular items in a minimum number of rectangular and identical containers, called bins. This article treats the case of objects with a free orientation of 90°. We propose a resolution approach combining ant colony optimization (ACO) and the heuristic method IMA to solve this NP-hard problem.
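
For intuition about the ant-colony ingredient, the sketch below runs a pheromone-biased construction on a simplified one-dimensional bin-packing instance: each ant builds an item order influenced by pheromone, items are packed first-fit, and pheromone is reinforced on the best solution found. The item sizes and parameters are hypothetical, and the paper's 2D case and IMA heuristic are not reproduced here.

    # Minimal, illustrative ACO-style sketch on 1D bin packing (not the paper's 2D method).
    import random

    def first_fit(order, sizes, capacity=1.0):
        bins = []
        for i in order:
            for b in bins:
                if sum(sizes[j] for j in b) + sizes[i] <= capacity:
                    b.append(i)
                    break
            else:
                bins.append([i])
        return bins

    def aco_bin_packing(sizes, ants=20, iterations=50, rho=0.1, seed=0):
        rng = random.Random(seed)
        n = len(sizes)
        tau = [1.0] * n                               # pheromone: preference to pack an item early
        best_bins = None
        for _ in range(iterations):
            for _ in range(ants):
                # sample an order: larger pheromone * size -> packed earlier (randomized)
                order = sorted(range(n), key=lambda i: -tau[i] * sizes[i] * rng.uniform(0.5, 1.5))
                bins = first_fit(order, sizes)
                if best_bins is None or len(bins) < len(best_bins):
                    best_bins = bins
            tau = [(1 - rho) * t for t in tau]        # evaporation
            for rank, b in enumerate(best_bins):      # reinforce items that open early bins
                for i in b:
                    tau[i] += 1.0 / (1 + rank)
        return best_bins

    sizes = [0.42, 0.25, 0.27, 0.07, 0.72, 0.86, 0.09, 0.44, 0.50, 0.68]
    print(len(aco_bin_packing(sizes)), "bins")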

Keywords: Ant colony algorithm, bin-packing problem, heuristic methods.

2832 Blueprinting of a Normalized Supply Chain Processes: Results in Implementing Normalized Software Systems

Authors: Bassam Istanbouli

Abstract:

With technology evolving every day and with the increase in global competition, industries are always under pressure to be the best. They need to provide good quality products at competitive prices, when and how the customer wants them. In order to achieve this level of service, products and their respective supply chain processes need to be flexible and evolvable; otherwise, changes will be extremely expensive, slow and accompanied by many combinatorial effects. Those combinatorial effects impact the whole organizational structure from a management, financial, documentation, logistics and, especially, information system Enterprise Resource Planning (ERP) perspective. By applying the normalized systems concept/theory to segments of the supply chain, we believe these effects can be minimized, especially at the time of launching an organization-wide global software project. The purpose of this paper is to point out that if an organization wants to develop software from scratch or implement an existing ERP software for its business needs, and if its business processes are normalized and modular, then most probably this will yield a normalized and modular software system that can be easily modified when the business evolves. Another important goal of this paper is to increase awareness regarding the design of the business processes in a software implementation project. If the blueprints created are normalized, then the software developers and configurators will use those modular blueprints to map them into modular software. This paper only prepares the ground for further studies; the above concept will be supported by going through the steps of developing, configuring and/or implementing a software system for an organization using two methods: the Software Development Lifecycle (SDLC) method and the Accelerated SAP (ASAP) implementation method. Both methods start with the customer requirements, then blueprinting of the business processes, and finally mapping those processes into a software system. Since those requirements and processes are the starting point of the implementation process, normalizing them will result in normalized software.

Keywords: Blueprint, ERP, SDLC, Modular.

2831 Exploration of Influential Factors on First Year Architecture Students’ Productivity

Authors: Shima Nikanjam, Badiossadat Hassanpour, Adi Irfan Che Ani

Abstract:

The design process in architecture education is based upon the learning-by-doing method, which leads students to understand how to design by practicing rather than studying. First-year design studios, as the starting educational stage, provide integrated knowledge and skills of design for newly joined architecture students. Within the basic design studio environment, students are guided to transfer their abstract thoughts into concrete visual decisions under the supervision of design educators for the first time. Therefore, introductory design studios have a predominant impact on students’ operational thinking and designing. Architectural design thinking is quite different from students’ educational backgrounds and learning habits. This educational challenge at basic design studios creates a severe need to study the reality of design education in the foundation year and to define appropriate educational methods with convenient project types, with the intention of enhancing the quality of architecture education. Material for this study has been gathered through long-term direct observation of a first-year, second-semester design studio at the faculty of architecture at EMU (known as FARC 102) during the fall and spring semesters of the 2014-15 academic year. The other methodologies used in this research are a questionnaire distributed among the case study students, interviews with third- and fourth-year design studio students who passed through the same methods of education in the past two years, and interviews with instructors. The results of this study reveal a risk of a mismatch between the implemented teaching method, project type and scale at this particular level and students’ learning styles. Although the existence of such a risk could be expected to some extent due to the variety of students’ profiles, recommendations can support educators in reaching maximum compatibility.

Keywords: Architecture education, basic design studio, educational method, forms creation skill.

2830 The Necessity of Biomass Application for Developing Combined Heat and Power (CHP) with Biogas Fuel: Case Study

Authors: Farnaz Amin Salehi, David Edward Cotton, Mohammad Ali Abdoli, Kambiz Rezapour

Abstract:

The daily increase of organic waste materials resulting from different activities in the country is one of the main factors in the pollution of the environment. Today, given the low output of traditional methods, the high cost of waste disposal and environmental pollution, the use of modern methods such as anaerobic digestion for the production of biogas has become prevalent. The biogas collected from the anaerobic digestion process is usable as a renewable energy source similar to natural gas but with lower methane content and heating value. Today, with the help of filtration and proper preparation technologies, access to biogas with characteristics fully similar to natural gas has become possible. At present, biogas is one of the main sources for supplying electrical and thermal energy and also an appropriate option for use in four-stroke engines, diesel engines, Stirling engines, gas turbines, gas microturbines and fuel cells to produce electricity. The use of biogas in CHP plants for energy production has attracted attention around the world for socio-economic and environmental reasons. The production of biogas through anaerobic digestion technology and its application in CHP power plants in Iran can not only supply part of the energy demand in the country, but also support movement in line with sustainable development. In this article, the necessity of developing CHP plants with biogas fuel in the country is discussed based on studies performed from economic, environmental and social aspects. Also, to demonstrate the importance of establishing these kinds of power plants from an economic point of view, the necessary calculations have been done as a case study for a CHP power plant with biogas fuel.

Keywords: Anaerobic Digestion, Biogas, CHP, Organic Wastes

2829 Derivation of Monotone Likelihood Ratio Using Two Sided Uniformly Normal Distribution Techniques

Authors: D. A. Farinde

Abstract:

In this paper, two-sided uniformly normal distribution techniques were used in the derivation of the monotone likelihood ratio. The approach mainly employs the parameters of the distribution for the class of all tests of size α. The derivation technique is fast, direct and less burdensome when compared to some existing methods.
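
As a worked illustration (a standard textbook derivation for the one-parameter normal family with known variance, not the author's two-sided technique), the monotone likelihood ratio property can be read off the density ratio:

    \frac{f(x;\theta_2)}{f(x;\theta_1)}
      = \exp\!\left( \frac{(\theta_2-\theta_1)\,x}{\sigma^{2}}
        - \frac{\theta_2^{2}-\theta_1^{2}}{2\sigma^{2}} \right),
      \qquad \theta_2 > \theta_1 ,

which is increasing in x, so the family has a monotone likelihood ratio in T(x) = x; by the Karlin-Rubin argument, the test of H0: θ ≤ θ0 against H1: θ > θ0 that rejects for large x is uniformly most powerful among all tests of size α.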

Keywords: Neyman-Pearson Lemma, Normal distribution

2828 Simulated Annealing Algorithm for Data Aggregation Trees in Wireless Sensor Networks and Comparison with Genetic Algorithm

Authors: Ladan Darougaran, Hossein Shahinzadeh, Hajar Ghotb, Leila Ramezanpour

Abstract:

In ad hoc networks, the main issue in designing protocols is quality of service, whereas in wireless sensor networks the main constraint is the limited energy of the sensors. In fact, protocols that minimize the power consumption of sensors receive the most attention in wireless sensor networks. One approach to reducing energy consumption in wireless sensor networks is to reduce the number of packets transmitted in the network. Data aggregation, a technique for collecting data that combines related data and prevents the transmission of additional packets in the network, can be effective in reducing the number of transmitted packets. Since information processing consumes less power than information transmission, data aggregation is of great importance, and because of this it is used in many protocols [5]. One data aggregation technique is to use a data aggregation tree. However, finding an optimal data aggregation tree to collect data in networks with one sink is an NP-hard problem. In the data aggregation technique, related information packets are combined in intermediate nodes to form one packet, so the number of packets transmitted in the network is reduced and, therefore, less energy is consumed, which ultimately improves the longevity of the network. Heuristic methods are used to solve this NP-hard problem; one of these optimization methods is simulated annealing. In this article, we propose a new method for building data collection trees in wireless sensor networks using the simulated annealing algorithm, and we evaluate its efficiency against a genetic algorithm.
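
To make the simulated-annealing idea concrete, the sketch below anneals a parent-pointer tree rooted at the sink, using total squared link distance as a proxy for transmission energy. The cost function, node positions, moves and cooling schedule are illustrative assumptions, not the authors' exact formulation.

    # Minimal sketch: simulated annealing over data aggregation trees.
    # State: parent pointer per node (node 0 is the sink); move: re-attach a
    # random node to a new parent if no cycle is formed (illustrative only).
    import math, random

    def cost(parents, pos):
        return sum(math.dist(pos[i], pos[p]) ** 2 for i, p in parents.items())

    def creates_cycle(parents, node, new_parent):
        while new_parent != 0:
            if new_parent == node:
                return True
            new_parent = parents[new_parent]
        return False

    def anneal_tree(pos, T0=10.0, cooling=0.995, steps=20000, seed=0):
        rng = random.Random(seed)
        n = len(pos)
        parents = {i: 0 for i in range(1, n)}        # start: every node sends to the sink
        best = current = cost(parents, pos)
        T = T0
        for _ in range(steps):
            node = rng.randrange(1, n)
            cand = rng.randrange(0, n)
            if cand == node or creates_cycle(parents, node, cand):
                continue
            old = parents[node]
            parents[node] = cand
            new = cost(parents, pos)
            if new < current or rng.random() < math.exp((current - new) / T):
                current = new
                best = min(best, current)
            else:
                parents[node] = old                   # reject the move
            T *= cooling
        return parents, best

    gen = random.Random(42)
    pos = [(0.0, 0.0)] + [(gen.uniform(0, 10), gen.uniform(0, 10)) for _ in range(19)]
    tree, energy = anneal_tree(pos)
    print("approximate minimum aggregation cost:", round(energy, 2))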

Keywords: Data aggregation, wireless sensor networks, energy efficiency, simulated annealing algorithm, genetic algorithm.

2827 Ecosystem Model for Environmental Applications

Authors: Cristina Schreiner, Romeo Ciobanu, Marius Pislaru

Abstract:

This paper aims to build a system based on fuzzy models that can be implemented in the assessment of ecological systems, in order to determine appropriate methods of action for reducing adverse effects on the environment and, implicitly, on the population. The proposed model provides a new perspective for environmental assessment, and it can be used as a practical instrument for decision-making.

Keywords: Ecosystem model, Environmental security, Fuzzy logic, Sustainability of habitable regions.

2826 Uncontrollable Inaccuracy in Inverse Problems

Authors: Yu. Menshikov

Abstract:

In this paper, the influence of errors in function derivatives at the initial time, obtained by experiment (uncontrollable inaccuracy), on the results of the inverse problem solution is investigated. It is shown that these errors distort the inverse problem solution, as a rule, near the beginning of the interval on which the solutions are analyzed. Several methods for removing the influence of this uncontrollable inaccuracy are suggested.

Keywords: Inverse problems, uncontrollable inaccuracy, filtration.

2825 A New Distribution Network Reconfiguration Approach using a Tree Model

Authors: E. Dolatdar, S. Soleymani, B. Mozafari

Abstract:

Power loss reduction is one of the main targets in the power industry, so in this paper the problem of finding the optimal configuration of a radial distribution system for loss reduction is considered. Optimal reconfiguration involves the selection of the best set of branches to be opened, one from each loop, to reduce resistive line losses and relieve overloads on feeders by shifting load to adjacent feeders. However, since there are many candidate switching combinations in the system, feeder reconfiguration is a complicated problem. In this paper, a new approach is proposed based on a simple optimum loss calculation obtained by determining optimal trees of the given network. From graph theory, a distribution network can be represented by a graph that consists of a set of nodes and branches. In fact, this problem can be viewed as the problem of determining an optimal tree of the graph that simultaneously ensures the radial structure of each candidate topology. In this method, a refined genetic algorithm is also set up, and some improvements to the algorithm are made in the chromosome coding. An implementation of the algorithm presented in [7] is applied, with modifications to the load flow program, and a comparison of that method with the proposed method is carried out. In [7], an algorithm is proposed in which the choice of the switches to be opened is based on simple heuristic rules; this algorithm reduces the number of load flow runs, reduces the switching combinations to a smaller number and gives the optimum solution. To demonstrate the validity of these methods, computer simulations with PSAT and MATLAB are carried out on the 33-bus test system. The results show that the performance of the proposed method is better than the method of [7] and other methods.
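
To illustrate the loop-based encoding commonly used in reconfiguration genetic algorithms, the sketch below decodes a chromosome that selects one branch to open from each fundamental loop and checks whether the remaining branches form a radial (spanning-tree) topology. The toy network, loop definitions and helper names are hypothetical, and no load flow or loss evaluation is included.

    # Minimal sketch: loop-based chromosome decoding and radiality check for
    # distribution network reconfiguration (illustrative toy network only).
    import itertools

    def is_radial(n_buses, closed_branches):
        """Radial <=> exactly n_buses-1 closed branches and no loop remains."""
        if len(closed_branches) != n_buses - 1:
            return False
        parent = list(range(n_buses))
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x
        for u, v in closed_branches:
            ru, rv = find(u), find(v)
            if ru == rv:
                return False            # a loop remains -> not radial
            parent[ru] = rv
        return True

    # Tiny 6-bus example: branch list and its two fundamental loops (branch indices).
    branches = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0), (1, 4)]
    loops = [[0, 1, 2, 3, 4, 5], [1, 2, 3, 6]]

    def decode(chromosome):
        """chromosome[i] = index (within loops[i]) of the branch to open."""
        opened = {loops[i][g] for i, g in enumerate(chromosome)}
        return [b for i, b in enumerate(branches) if i not in opened]

    # Enumerate all chromosomes of this toy system and keep the radial ones.
    feasible = [c for c in itertools.product(*[range(len(l)) for l in loops])
                if is_radial(6, decode(c))]
    print(len(feasible), "radial candidate configurations")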

Keywords: Distribution system, reconfiguration, loss reduction, graph theory, optimization, genetic algorithm.

2824 Influencing Attitude Change for Sustainability through Persuasion

Authors: Chia-Hsin Wu, Chen-Hao Wuang, Yu-Hung Chou, Chia-Chih Chen, Pei-Ju Chen, Hsiao-Chen You, Yi-Shin Deng

Abstract:

Food mileage is one of the important issues concerning environmental sustainability. In this research, we have utilized a prototype platform with iterative user-centered testing. With these findings, we successfully demonstrate the use of persuasive methods in context to influence users' attitudes towards the sustainability concept.

Keywords: Behavior change, food mileage, persuasive technology, sustainability.

2823 A Formal Implementation of Database Security

Authors: Yun Bai

Abstract:

This paper investigates the implementation of a security mechanism in an object-oriented database system. Formal methods play an essential role in computer security due to their powerful expressiveness and concise syntax and semantics. In this paper, both the specification and the implementation issues in a database security environment are considered, and database security is achieved through the development of an efficient implementation of the specification without compromising its originality and expressiveness.

Keywords: Database security, authorization policy, logic-based specification.

2822 The Legal Procedure of Attestation of Public Servants

Authors: Armen Yezekyan

Abstract:

The main purpose of this research is to comprehensively explore and identify the problems of attestation of public servants and to propose solutions for these issues through a deep analysis of laws and the legal theoretical literature. For the detailed analysis of the above-mentioned problems, we use several research methods, the implementation of which aims to ensure the objectivity and clarity of the scientific research and its results.

Keywords: Attestation, attestation commission, competition commission, public servant, public service, testing.

2821 Carbamazepine Co-crystal Screening with Dicarboxylic Acids Co-Crystal Formers

Authors: Syarifah Abd Rahim, Fatinah Ab Rahman, Engku N. E. M. Nasir, Noor A. Ramle

Abstract:

Co-crystals are believed to improve the solubility and dissolution rate and thus enhance the bioavailability of poorly water-soluble drugs, particularly during the oral route of administration. Given the prevalence of poorly soluble drugs in the pharmaceutical industry, the screening of co-crystal formation using carbamazepine (CBZ) as a model drug compound with the dicarboxylic acid co-crystal formers (CCF) fumaric acid (FA) and succinic acid (SA) in ethanol has been studied. The co-crystal formations were studied by varying the mol ratio of CCF to CBZ to assess the effect of CCF concentration on the formation of the co-crystal. Solvent evaporation, slurry and cooling crystallization, representing solution-based co-crystal screening methods, were used. Based on the differential scanning calorimetry (DSC) analysis, the melting point of CBZ-SA at the different ratios was in the range of 188 °C-189 °C. For CBZ-FA form A and CBZ-FA form B, the melting points at the different ratios were in the ranges of 174 °C-175 °C and 185 °C-186 °C, respectively. The product crystals from the screening were also characterized using X-ray powder diffraction (XRPD). The XRPD pattern profile analysis showed that the CBZ co-crystals with FA and SA were successfully formed for all ratios studied. The findings revealed that the CBZ-FA co-crystals formed in two different polymorphs. It was found that CBZ-FA form A and form B were formed by the evaporation and slurry crystallization methods, respectively. On the other hand, in the cooling crystallization method, CBZ-FA form A was formed at a lower mol ratio of CCF to CBZ and vice versa. This study disclosed that different methods and mol ratios during co-crystal screening can affect the outcome, such as the polymorphic form of the co-crystal produced. Thus, careful attention is needed during screening, since co-crystal formation is currently one of the promising approaches considered in pharmaceutical research and development to improve poorly soluble drugs.

Keywords: Carbamazepine, co-crystal, co-crystal former, dicarboxylic acid.

2820 Adapting Tools for Text Monitoring and for Scenario Analysis Related to the Field of Social Disasters

Authors: Svetlana Cojocaru, Mircea Petic, Inga Titchiev

Abstract:

Humanity is faced more and more often with different social disasters, which in turn can generate new accidents and catastrophes. To mitigate their consequences, it is important to obtain early signals about events which are occurring or may occur, and to prepare the corresponding scenarios that could be applied. Our research is focused on solving two problems in this domain: identifying signals that an accident has occurred or may occur, and mitigating some consequences of disasters. To solve the first problem, methods for selecting and processing texts from the global Internet are developed. Information in Romanian is of special interest for us. In order to obtain the mentioned tools, we follow several steps, divided into a preparatory stage and a processing stage. Throughout the first stage, we manually collected over 724 news articles and classified them into 10 categories of social disasters, constituting more than 150 thousand words. Using this information, a controlled vocabulary of more than 300 keywords was elaborated, which will help in the process of classification and identification of texts related to the field of social disasters. To solve the second problem, the formalism of Petri nets has been used. We deal with the problem of evacuating inhabitants in useful time. Analysis methods such as the reachability or coverability tree and the invariants technique will be used to determine dynamic properties of the modeled systems. To perform a case study of the properties of the evacuation system extended by adding time, the analysis modules of PIPE such as Generalized Stochastic Petri Nets (GSPN) Analysis, Simulation, State Space Analysis, and Invariant Analysis have been used. These modules helped us to obtain the average number of persons situated in the rooms and other quantitative properties and characteristics related to its dynamics.
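
As a small illustration of how a controlled keyword vocabulary can flag and classify texts by disaster category, the sketch below scores a text against per-category keyword sets. The categories and keywords here are hypothetical English stand-ins; the actual vocabulary described in the paper is in Romanian, with about 300 keywords over 10 categories.

    # Minimal sketch: keyword-vocabulary classification of a news text
    # into a disaster category (illustrative vocabulary only).
    import re
    from collections import Counter

    VOCABULARY = {
        "fire":      {"fire", "smoke", "burned", "firefighters"},
        "flood":     {"flood", "river", "overflow", "evacuation"},
        "explosion": {"explosion", "blast", "gas leak"},
    }

    def classify(text, vocabulary=VOCABULARY):
        tokens = re.findall(r"[a-z]+", text.lower())
        text_lower = text.lower()
        scores = Counter()
        for category, keywords in vocabulary.items():
            for kw in keywords:
                # multiword keywords are matched as substrings, single words as tokens
                hits = text_lower.count(kw) if " " in kw else tokens.count(kw)
                scores[category] += hits
        if scores.most_common(1)[0][1] == 0:
            return None                      # no disaster-related signal detected
        return scores.most_common(1)[0][0]

    print(classify("Firefighters responded after heavy smoke and fire engulfed the depot."))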

Keywords: Lexicon of disasters, modelling, Petri nets, text annotation, social disasters.

2819 Continuous Threshold Prey Harvesting in Predator-Prey Models

Authors: Jonathan Bohn, Jorge Rebaza, Kaitlin Speer

Abstract:

The dynamics of a predator-prey model with continuous threshold policy harvesting functions on the prey is studied. Theoretical and numerical methods are used to investigate boundedness of solutions, existence of bionomic equilibria, and the stability properties of coexistence equilibrium points and periodic orbits. Several bifurcations as well as some heteroclinic orbits are computed.
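
To show the kind of system involved, the sketch below simulates a predator-prey model with one common continuous threshold harvesting form on the prey: no harvesting below the threshold T, and a continuously rising, saturating harvest rate above it. The model form and parameter values are illustrative assumptions, not necessarily the specific system analyzed in the paper.

    # Minimal sketch: predator-prey dynamics with continuous threshold harvesting
    # on the prey (illustrative model and parameters).
    import numpy as np
    from scipy.integrate import solve_ivp

    r, K, a, b, d, c = 1.0, 10.0, 1.0, 0.5, 0.4, 1.0   # growth, capacity, predation, conversion, death, saturation
    h, T = 0.6, 4.0                                     # harvesting intensity and threshold

    def harvest(x):
        return 0.0 if x < T else h * (x - T) / (c + (x - T))   # continuous at x = T

    def rhs(t, y):
        x, p = y                                # prey, predator
        dx = r * x * (1 - x / K) - a * x * p / (1 + x) - harvest(x)
        dp = b * a * x * p / (1 + x) - d * p
        return [dx, dp]

    sol = solve_ivp(rhs, (0.0, 200.0), [5.0, 1.0], t_eval=np.linspace(0, 200, 2000))
    print("final prey and predator densities:", sol.y[:, -1])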

Keywords: Predator-prey models, threshold harvesting, dynamical systems.

2818 Formal Models of Sanitary Inspections Teams Activities

Authors: Tadeusz Nowicki, Radosław Pytlak, Robert Waszkowski, Jerzy Bertrandt, Anna Kłos

Abstract:

This paper presents methods for the formal modeling of the activities of sanitary inspectors during outbreaks of food-borne diseases. The models make it possible to measure the characteristics of sanitary inspection activities and, as a result, to improve the performance of sanitary services and thus food security.

Keywords: Food-borne disease, epidemic, sanitary inspection, mathematical models.

2817 Preparation, Characterisation, and Measurement of the in vitro Cytotoxicity of Mesoporous Silica Nanoparticles Loaded with Cytotoxic Pt(II) Oxadiazoline Complexes

Authors: G. Wagner, R. Herrmann

Abstract:

Cytotoxic platinum compounds play a major role in the chemotherapy of a large number of human cancers. However, due to the severe side effects for the patient and other problems associated with their use, there is a need for the development of more efficient drugs and new methods for their selective delivery to the tumours. One way to achieve the latter could be the use of nanoparticular substrates that can adsorb or chemically bind the drug. In the cell, the drug is supposed to be slowly released, either by physical desorption or by dissolution of the particle framework. Ideally, the cytotoxic properties of the platinum drug unfold only then, in the cancer cell, and over a longer period of time due to the gradual release. In this paper, we report on our first steps in this direction. The binding properties of a series of cytotoxic Pt(II) oxadiazoline compounds to mesoporous silica particles have been studied by NMR and UV/Vis spectroscopy. High loadings were achieved when the Pt(II) compound was relatively polar and had been dissolved in a relatively nonpolar solvent before the silica was added. Typically, 6-10 hours were required for complete equilibration, suggesting that adsorption occurred not only on the outer surface but also inside the pores. The untreated and Pt(II)-loaded particles were characterized by C, H, N combustion analysis, BET/BJH nitrogen sorption, electron microscopy (SEM and TEM) and EDX. With the latter methods, we were able to demonstrate the homogeneous distribution of the Pt(II) compound on and in the silica particles; no bulk Pt(II) precipitate had formed. The in vitro cytotoxicity in a human cancer cell line (HeLa) has been determined for one of the new platinum compounds adsorbed to mesoporous silica particles of different sizes and compared with the corresponding compound in solution. The IC50 data are similar in all cases, suggesting that the release of the Pt(II) compound was relatively fast and possibly occurred before the particles reached the cells. Overall, the platinum drug is chemically stable on silica and retains its activity upon prolonged storage.

Keywords: Cytotoxicity, mesoporous silica, nanoparticles, platinum compounds.

2816 Experimental Investigation of Chatter Vibrations in Facing and Turning Processes

Authors: M. Siddhpura, R. Paurobally

Abstract:

This paper investigates the occurrence of regenerative chatter vibrations in facing and turning processes. Orthogonal turning (facing) and normal turning experiments are carried out under stable conditions as well as in the presence of controlled chatter vibrations. The effects of chatter vibrations on various sensor signals are captured and analyzed using frequency-domain methods, which successfully detected the chatter vibrations close to the dominant mode of the machine tool system.
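
To illustrate the frequency-domain idea, the sketch below computes the power spectrum of a synthetic vibration signal and flags chatter when the spectral energy near the dominant structural mode exceeds the energy of the spindle-frequency harmonics. The signal, the frequencies and the 0.5 ratio threshold are illustrative assumptions, not values from the experiments.

    # Minimal sketch: frequency-domain chatter detection on a synthetic signal.
    import numpy as np

    fs = 10000.0                          # sampling rate [Hz]
    t = np.arange(0, 1.0, 1 / fs)
    spindle_hz, mode_hz = 30.0, 820.0     # assumed spindle frequency and dominant mode

    # synthetic accelerometer signal: spindle harmonics + growing chatter component
    signal = (0.8 * np.sin(2 * np.pi * spindle_hz * t)
              + 0.3 * np.sin(2 * np.pi * 2 * spindle_hz * t)
              + 1.5 * np.sin(2 * np.pi * mode_hz * t) * np.linspace(0, 1, t.size)
              + 0.05 * np.random.default_rng(0).standard_normal(t.size))

    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(t.size, 1 / fs)

    harmonic_band = freqs < 5 * spindle_hz                          # spindle harmonics region
    chatter_band = (freqs > mode_hz - 50) & (freqs < mode_hz + 50)  # around the dominant mode
    ratio = spectrum[chatter_band].sum() / spectrum[harmonic_band].sum()
    print("chatter detected" if ratio > 0.5 else "stable cutting", f"(ratio={ratio:.2f})")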

Keywords: Chatter vibrations, facing, turning.

2815 Agglomerative Hierarchical Clustering Using the Tθ Family of Similarity Measures

Authors: Salima Kouici, Abdelkader Khelladi

Abstract:

In this work, we begin with a presentation of the Tθ family of usual similarity measures for multidimensional binary data. Subsequently, some properties of these measures are established. Finally, the impact of using different inter-element measures on the results of agglomerative hierarchical clustering methods is studied.
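
The Tθ family itself is not reproduced here; as a sketch of the general workflow, the example below computes a common binary similarity measure (Jaccard) from the usual contingency counts a, b, c, d, converts it to a distance, and feeds it to average-linkage agglomerative clustering. The data and the choice of measure are illustrative.

    # Minimal sketch: binary similarity from contingency counts, then
    # agglomerative hierarchical clustering (illustrative data and measure).
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from scipy.spatial.distance import squareform

    def contingency(x, y):
        a = np.sum((x == 1) & (y == 1)); b = np.sum((x == 1) & (y == 0))
        c = np.sum((x == 0) & (y == 1)); d = np.sum((x == 0) & (y == 0))
        return a, b, c, d

    def jaccard(x, y):
        a, b, c, _ = contingency(x, y)
        return a / (a + b + c) if a + b + c else 1.0

    data = np.array([[1, 1, 0, 1, 0],
                     [1, 1, 0, 0, 0],
                     [0, 0, 1, 0, 1],
                     [0, 1, 1, 0, 1]])

    n = len(data)
    dist = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            dist[i, j] = dist[j, i] = 1.0 - jaccard(data[i], data[j])

    Z = linkage(squareform(dist), method="average")   # agglomerative clustering
    print(fcluster(Z, t=2, criterion="maxclust"))     # two-cluster cut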

Keywords: Binary data, similarity measure, Tθ measures, agglomerative hierarchical clustering.

2814 Design and Analysis of a New Mini-Bike Prototype Using Fabrication Techniques

Authors: S. A Puviyarasu, V. S. Ukkeshwar

Abstract:

Elicitation of creative conceptual design and the fabrication of mini bikes are the primary aims of this study. Miniature bikes, or pit bikes, or simply mini bikes, have recently become trendsetters among the younger population around the globe, whether for commuting or for sports. This study also focuses on the steps to be followed in building a self-designed mini bike concept and showcases similar instances.

Keywords: Miniature bikes, design methods, creative styling.

2813 Inheritance Growth: a Biology Inspired Method to Build Structures in P2P

Authors: Panchalee Sukjit, Herwig Unger

Abstract:

IT infrastructures are becoming more and more complex. Therefore, in the first industrial IT systems, the P2P paradigm has replaced the traditional client-server model, and methods of self-organization are gaining more and more importance. From the past it is known that regular structures such as grids, in particular, may significantly improve system behavior and performance. This contribution introduces a new algorithm based on a biological analogue, which may provide for the growth of several regular structures on top of anarchically grown P2P or social network structures.

Keywords: P2P, Pattern generation, Grid, Social network, Inheritance, Reproduction

2812 3D Modeling Approach for Cultural Heritage Structures: The Case of Virgin of Loreto Chapel in Cusco, Peru

Authors: Rony Reátegui, Cesar Chácara, Benjamin Castañeda, Rafael Aguilar

Abstract:

Nowadays, Heritage Building Information Modeling (HBIM) is considered an efficient tool to represent and manage information of Cultural Heritage (CH). The basis of this tool relies on a 3D model generally obtained from a Cloud-to-BIM procedure. There are different methods to create an HBIM model, ranging from manual modeling based on the point cloud to the automatic detection of shapes and the creation of objects. The selection of these methods depends on the desired Level of Development (LOD), Level of Information (LOI) and Grade of Generation (GOG), as well as on the availability of commercial software. This paper presents the 3D modeling of a stone masonry chapel using Recap Pro, Revit and the Dynamo interface, following a three-step methodology. The first step consists of the manual modeling of simple structural (e.g., regular walls, columns, floors, wall openings) and architectural (e.g., cornices, moldings and other minor details) elements using the point cloud as reference. Then, Dynamo is used for generative modeling of complex structural elements such as vaults, infills and domes. Finally, semantic information (e.g., materials, typology, state of conservation) and pathologies are added within the HBIM model as text parameters and generic model families, respectively. The application of this methodology allows the documentation of CH following a relatively simple process that ensures adequate LOD, LOI and GOG levels. In addition, the easy implementation of the method, as well as the fact that only one BIM software with its respective plugin is used for the scan-to-BIM modeling process, means that this methodology can be adopted by a larger number of users with intermediate knowledge and limited resources, since the BIM software used has a free student license.

Keywords: Cloud-to-BIM, cultural heritage, generative modeling, HBIM, parametric modeling, Revit.
