Search results for: proposed module
2619 Simple Infrastructure in Measuring Countries e-Government
Authors: Sukhbaatar Dorj, Erdenebaatar Altangerel
Abstract:
As an alternative to existing e-government measurement models, we propose a new customer-centric, service-oriented, simple approach to measuring countries' e-governments. If successfully implemented, the infrastructure built will provide a single e-government index number for each country. The main scheme is as follows. At the beginning of each year, the country's CIO, or a government official in an equivalent position, will report four numbers for their country to a dedicated United Nations website: 1) the ratio of public services available online to the total number of public services; 2) the ratio of interagency (inter-ministry) online public services to the total number of available online public services; 3) the ratio of the citizens and business entities served online annually to the total number of citizens and business entities served annually, online and physically, for those services; 4) a simple index for the geographical spread of the citizens and business entities served online. The four numbers are then combined into one index number by a simple mathematical average. In addition, a fifth number can be introduced as a quality indicator for the online public services; if countries are tied on the index number, this fifth criterion is used to order them. Note: this approach assesses a country's current e-government achievement, not its e-government readiness.
Keywords: countries e-government index, e-government, infrastructure for measuring e-government, measuring e-government
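The aggregation described above is a plain average of the four reported ratios, with the fifth (quality) number used only to break ties. A minimal sketch, with function names and record layout of our own choosing:

```python
def egov_index(online_ratio, interagency_ratio, served_online_ratio, geo_spread):
    """Combine the four reported ratios into a single index by simple average."""
    values = [online_ratio, interagency_ratio, served_online_ratio, geo_spread]
    assert all(0.0 <= v <= 1.0 for v in values), "each input is a ratio in [0, 1]"
    return sum(values) / len(values)

def rank_countries(reports):
    """Rank (country, r1, r2, r3, r4, quality) records: index first,
    the fifth (quality) number only breaks ties."""
    return sorted(reports,
                  key=lambda r: (egov_index(*r[1:5]), r[5]),
                  reverse=True)
```

Note that the tuple sort key implements the tie-breaking rule directly: two countries with equal averages are ordered by the quality indicator.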
Procedia PDF Downloads 328
2618 Lockit: A Logic Locking Automation Software
Authors: Nemanja Kajtez, Yue Zhan, Basel Halak
Abstract:
The significant rise in the cost of manufacturing nanoscale integrated circuits (ICs) has led the majority of IC design companies to outsource the fabrication of their products to other companies, often located in different countries. This multinational nature of the hardware supply chain has led to a host of security threats, including IP piracy, IC overproduction, and Trojan insertion. To combat these, researchers have proposed logic locking techniques to protect the intellectual property of a design and increase the difficulty of maliciously modifying its functionality. However, the adoption of logic locking approaches has been rather slow due to the lack of integration with the IC production process and the limited efficacy of existing algorithms. This work automates the logic locking process with software, developed in Python, that performs the locking on a gate-level netlist and can be integrated with existing digital synthesis tools. Analysis of the latest logic locking algorithms demonstrated that SFLL-HD is one of the most secure and versatile in trading off levels of protection against different types of attacks, and it was thus selected for implementation. The presented tool can also be expanded to incorporate the latest locking mechanisms to keep up with the fast-paced development of this field. The paper also presents a case study to demonstrate the functionality of the tool and how it can be used to explore the design space and compare different locking solutions. The source code of the tool is freely available from (https://www.researchgate.net/publication/353195333_Source_Code_for_The_Lockit_Tool).
Keywords: design automation, hardware security, IP piracy, logic locking
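SFLL-HD itself is involved, but the basic key-gate mechanism behind netlist locking can be illustrated with the classic XOR/XNOR insertion scheme, a simpler technique than SFLL-HD; the toy gate below is our own illustration, not code from the tool:

```python
def xor_key_gate(signal, key_bit, correct_bit):
    """An XOR key gate (effectively XNOR when correct_bit is 1): transparent
    when the supplied key bit matches the design-time secret, inverting the
    signal otherwise."""
    return signal ^ key_bit ^ correct_bit

def locked_and_gate(a, b, key_bit):
    """A toy AND gate locked with a single key bit whose correct value is 1."""
    return xor_key_gate(a & b, key_bit, correct_bit=1)
```

With the correct key the circuit behaves exactly as an AND gate; a wrong key inverts the output, which is what makes an overproduced, unactivated chip useless to an attacker.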
Procedia PDF Downloads 180
2617 Maximisation of Consumer Welfare in the Enforcement of Intellectual Property Rights in Competition Guidelines: The Malaysian Experience
Authors: Ida Madieha Abdul Ghani Azmi, Heng Gee Lim, Adlan Abdul Razak, Nasaruddin Abdul Rahman
Abstract:
The objective of competition law is to maximise consumer welfare by regulating anti-competitive behaviour that distorts the market. Intellectual property law also seeks to enhance consumer welfare in the long run by encouraging the development of useful devices and processes. Nevertheless, in some circumstances IP owners behave in ways that make it difficult for rival companies to sell substitute products and technology in the market. Intellectual property owners may also reach a dominant position in the market such that they are able to dictate unfair terms and conditions to other market players. The two major categories of anti-competitive behaviour are the use of horizontal and vertical agreements to constrain effective competition, and the abuse of a dominant position. As a result, many countries, including the US, Canada, and Singapore, have regulated conduct by IP owners that is considered anti-competitive. This paper examines the proposed IP Guidelines recently drafted by the Malaysian Competition Commission and investigates to what extent they address the anti-competitive behaviour of IP owners. The paper concludes by suggesting rules that the Competition Commission could prescribe in order to maintain the relevancy of competition law as the main check against the abuse of rights by intellectual property owners.
Keywords: abuse of dominant position, consumer welfare, intellectual property rights, vertical and horizontal agreements
Procedia PDF Downloads 216
2616 Water-Bentonite Interaction of Green Pellets through Micro-Structural Analysis
Authors: Satyananda Patra, Venugopal Rayasam
Abstract:
The quality of the pellets produced is affected by the quality and type of the green pellets, the amount of binders and fluxing agents added, and the firing conditions provided. Green pellet quality depends on the chemistry, mineralogy, and granulometry of the fines used for pellet making, the feed size, and its moisture content and porosity. During firing of green pellets, the ingredients present within react to form different phases and microstructures. In turn, the physical and metallurgical properties of the pellets are influenced by the amount and type of binder and flux added, and by the induration time and temperature. During the iron-making process, the metallurgical properties of fired pellets are determined by the type, amount, and chemistry of these phases. Green pelletizing and induration studies have already been carried out on magnetite and hematite ore fines, but for Indian iron ores of high alumina content, which show different pelletizing behaviour, those studies cannot be directly applied. The main objective of the proposed research is to understand the green pelletizing process and determine the water-bentonite interaction at different levels. The swelling behaviour of bentonite and the microstructure of the green pellet are investigated. For the conversion of iron ore fines into pellets, the key raw material and process variables that influence pellet quality need to be identified, and a correlation established between them.
Keywords: iron ore, pelletization, binders, green pellets, microstructure
Procedia PDF Downloads 308
2615 Tide Contribution in the Flood Event of Jeddah City: Mathematical Modelling and Different Field Measurements of the Groundwater Rise
Authors: Aïssa Rezzoug
Abstract:
This paper aims to bring new elements demonstrating that the tide causes the groundwater to rise in the shoreline band on which urban areas lie, especially in western coastal cities of the Kingdom of Saudi Arabia such as Jeddah. The reason for the last inundation events in Jeddah was a groundwater rise in the city coupled with a strong precipitation event. This paper illustrates the tide's significant contribution to raising the groundwater level. It shows that internal groundwater recharge within the urban area is due not only to the excess water supply coming from surrounding areas through human activity, in the absence of a sufficient and efficient sewage system, but also to the tide effect. The study follows a quantitative method to assess groundwater rise risks through many in-situ measurements and mathematical modelling. The proposed approach highlights that the groundwater level in the urban areas of the city, on the shoreline band, reaches the high-tide level without considering any input from precipitation. Despite the small tide in the Red Sea compared to other oceanic coasts, the groundwater level is considerably enhanced by the tide from the seaside and by the freshwater table from the landside of the city. Under these conditions, the groundwater level in the city becomes high and prevents the soil from evacuating quickly enough the surface flow caused by a storm event, as was observed in the historical flood catastrophe of Jeddah in 2009.
Keywords: flood, groundwater rise, Jeddah, tide
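The abstract gives no equations; as a hedged reference point, the classical Ferris solution is the standard first-order model for how a tidal signal of amplitude $h_0$ and period $t_0$ propagates inland through a coastal aquifer of storativity $S$ and transmissivity $T$ (the paper's own model may differ):

```latex
h(x,t) = h_0 \, e^{-x \sqrt{\pi S / (t_0 T)}} \,
         \sin\!\left( \frac{2\pi t}{t_0} - x \sqrt{\frac{\pi S}{t_0 T}} \right)
```

The exponential factor shows why even the modest Red Sea tide can matter close to the shoreline: attenuation depends on the inland distance $x$ and the aquifer properties, not on the tidal amplitude itself.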
Procedia PDF Downloads 112
2614 Irrigation Scheduling for Wheat in Bangladesh under Water Stress Conditions Using Water Productivity Model
Authors: S. M. T. Mustafa, D. Raes, M. Huysmans
Abstract:
Proper utilization of water resources is very important in agro-based Bangladesh. Irrigation schedules based on local environmental conditions, soil type, and water availability allow a sustainable use of water resources in agriculture. In this study, the FAO crop water model AquaCrop was used to simulate different water and fertilizer management strategies at different locations in Bangladesh to obtain a management guideline for farmers. The model was calibrated and validated for wheat (Triticum aestivum L.). The statistical agreement between observed and simulated grain yields was very good, with R2, RMSE, and EF values of 0.92, 0.33, and 0.83 for model calibration and 0.92, 0.68, and 0.77 for model validation. The stages from stem elongation (jointing) to booting, and flowering, were identified as the most water-sensitive for wheat. Deficit irrigation at the water-sensitive stages could increase grain yield at increasing soil fertility levels for both loamy and sandy soils. Deficit irrigation strategies provide higher water productivity than full irrigation and increase yield stability (reducing the standard deviation). Practical deficit irrigation schedules for wheat were designed for four different stations and two different soils. Farmers can produce more crops by using a deficit irrigation schedule under water stress conditions. Practical application and validation of the proposed strategies will make them more credible.
Keywords: crop-water model, deficit irrigation, irrigation scheduling, wheat
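The calibration statistics quoted above (RMSE and the Nash-Sutcliffe model efficiency, EF) can be computed from paired observed and simulated yields; a minimal sketch using the standard definitions, not code from the study:

```python
import math

def fit_statistics(observed, simulated):
    """RMSE and Nash-Sutcliffe model efficiency (EF) between observed and
    simulated values, as used to judge AquaCrop calibration quality."""
    n = len(observed)
    mean_obs = sum(observed) / n
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    sst = sum((o - mean_obs) ** 2 for o in observed)
    rmse = math.sqrt(sse / n)
    ef = 1.0 - sse / sst  # EF = 1 for a perfect fit; EF <= 0 means no better than the mean
    return rmse, ef
```

EF = 0.83 therefore says the calibrated model explains most of the yield variance that simply predicting the mean would miss.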
Procedia PDF Downloads 430
2613 Space Debris Mitigation: Solutions from the Dark Skies of the Remote Australian Outback Using a Proposed Network of Mobile Astronomical Observatories
Authors: Muhammad Akbar Hussain, Muhammad Mehdi Hussain, Waqar Haider
Abstract:
There are tens of thousands of undetected and uncatalogued pieces of space debris in Low Earth Orbit (LEO). Not only are they difficult to detect and track, their sheer number puts active satellites and humans in orbit around Earth in danger. With the entry of more governments and private companies into harnessing the Earth's orbit for communication, research, and military purposes, there is an ever-increasing need not only to detect and catalogue these pieces of space debris; it is time to take measures to remove them and clean up the space around Earth. Current optical and radar-based Space Situational Awareness initiatives are useful mostly for detecting and cataloguing larger pieces of debris, mainly for avoidance measures. Pieces smaller than 10 cm lie in a relatively dark zone, yet they are deadly and capable of destroying satellites and human missions. A network of mobile observatories, connected to each other in real time and working in unison as a single instrument, may be able to detect small pieces of debris and achieve effective triangulation, helping to create a comprehensive database of their trajectories and parameters at the highest level of precision. This data may enable ground-based laser systems to help deorbit individual debris. Such a network of observatories can join current efforts in the detection and removal of space debris in Earth's orbit.
Keywords: space debris, low earth orbit, mobile observatories, triangulation, seamless operability
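The triangulation idea can be sketched in two dimensions: two stations each measure a bearing to the same debris piece, and its position follows from intersecting the two rays. This is a toy geometry illustration of our own, not the network's actual astrometric solution, which would involve angles-only orbit determination in three dimensions:

```python
import math

def triangulate(p1, bearing1, p2, bearing2):
    """Locate a target in 2D from two station positions and the bearings
    (radians from the +x axis) each station measures, by intersecting rays
    p_i + t_i * (cos b_i, sin b_i) and solving for t1."""
    d1 = (math.cos(bearing1), math.sin(bearing1))
    d2 = (math.cos(bearing2), math.sin(bearing2))
    denom = d1[0] * d2[1] - d1[1] * d2[0]  # zero when the rays are parallel
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (dx * d2[1] - dy * d2[0]) / denom
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])
```

The geometric baseline between stations is what makes the estimate precise, which is one argument for a spatially distributed network rather than a single site.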
Procedia PDF Downloads 164
2612 Perfectly Matched Layer Boundary Stabilized Using Multiaxial Stretching Functions
Authors: Adriano Trono, Federico Pinto, Diego Turello, Marcelo A. Ceballos
Abstract:
Numerical modeling of dynamic soil-structure interaction problems requires an adequate representation of the unbounded nature of the ground, the material non-linearity of soils, and geometrical non-linearities such as large displacements due to rocking of the structure. To account for these effects simultaneously, the equations of motion often must be solved in the time domain. However, boundary conditions in conventional finite element codes generally fall short of fully absorbing the energy of outgoing waves. In this respect, the Perfectly Matched Layer (PML) technique allows a satisfactory absorption of inclined body waves as well as surface waves. However, the PML domain is inherently unstable, meaning that its instability does not depend on the discretization considered. One way to stabilize the PML domain is to use multiaxial stretching functions. This development is questionable, however, because some Jacobian terms of the coordinate transformation are not accounted for; for this reason, the resulting absorbing layer element is often referred to as the "uncorrected M-PML" in the literature. In this work, the strong formulation of the "corrected M-PML" absorbing layer is proposed, using multiaxial stretching functions that incorporate all terms of the coordinate transformation. The results of the stable model are compared with reference solutions obtained from extended domain models.
Keywords: mixed finite elements, multiaxial stretching functions, perfectly matched layer, soil-structure interaction
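As a sketch of the underlying idea only (the paper's specific stretching functions are not given in the abstract), the classical frequency-domain PML replaces each coordinate by a complex-stretched one:

```latex
\tilde{x}_i = \int_0^{x_i} \lambda_i(s)\, ds,
\qquad
\lambda_i(x_i) = 1 + f^{e}_i(x_i) + \frac{f^{p}_i(x_i)}{\mathrm{i}\,\omega}
```

In a multiaxial (M-PML) formulation the attenuation profile applied along direction $i$ also grows with the other coordinates, e.g. $f^{p}_i = f^{p}_i(x_1, x_2, x_3)$; the "corrected" variant discussed above then retains the cross-derivative (Jacobian) terms $\partial \lambda_i / \partial x_j$, $j \neq i$, of the coordinate transformation that the uncorrected M-PML drops.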
Procedia PDF Downloads 67
2611 Understanding the Complexities of Consumer Financial Spinning
Authors: Olivier Mesly
Abstract:
This research presents a conceptual framework termed "Consumer Financial Spinning" (CFS) to analyze consumer behavior in the financial and economic markets. The phenomenon occurs when consumers of high-stakes financial products accumulate unsustainable debt, leading them to detach from their initial financial hierarchy of needs, wealth-related goals, and preferences regarding their household portfolio of assets. The daring actions of these consumers, forming a dark financial triangle, are characterized by three behaviors: overconfidence, the use of rationed rationality, and deceitfulness. We show that CFS can be incorporated into the traditional CAPM and Markowitz portfolio optimization models to create a framework that explains such market phenomena as the global financial crisis, highlighting the antecedents and consequences of ill-conceived speculation. Because this is a conceptual paper, there is no methodology in the sense of field studies; however, we apply modeling principles derived from the data percolation methodology, which contains tenets explicating how to structure concepts. A simulation test of the proposed framework is conducted; it demonstrates the conditions under which the relationship between expected returns and risk may deviate from linearity. The analysis and conceptual findings are particularly relevant, both theoretically and pragmatically, as they shed light on the psychological conditions that drive intense speculation, which can lead to market turmoil. Armed with such understanding, regulators are better equipped to propose solutions before economic problems spin out of control.
Keywords: consumer financial spinning, rationality, deceitfulness, overconfidence, CAPM
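For reference, the linear expected-return-risk relation that the simulation shows CFS can bend is the standard CAPM security market line:

```latex
E[R_i] = R_f + \beta_i \left( E[R_m] - R_f \right),
\qquad
\beta_i = \frac{\operatorname{Cov}(R_i, R_m)}{\operatorname{Var}(R_m)}
```

Here $R_f$ is the risk-free rate and $R_m$ the market return; the paper's argument is that under CFS conditions the realized relation between expected returns and risk departs from this straight line.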
Procedia PDF Downloads 46
2610 Degradation of Acetaminophen with Fe3O4 and Fe2+ as Activator of Peroxymonosulfate
Authors: Chaoqun Tan, Naiyun Gao, Xiaoyan Xin
Abstract:
Peroxymonosulfate (PMS)-based oxidation processes, as an alternative to hydrogen peroxide-based oxidation processes, are increasingly popular because of the reactive radical species (SO4-•, OH•) produced in such systems. Magnetic nano-scaled Fe3O4 particles and the ferrous anion (Fe2+) were studied as activators of PMS for the degradation of acetaminophen (APAP) in water. The Fe3O4 MNPs were found to effectively catalyze PMS for APAP degradation, and the reactions followed a pseudo-first-order kinetic pattern well (R2 > 0.95), while the degradation of APAP in the PMS-Fe2+ system proceeds through two stages: a fast stage and a much slower one. Within 5 min, approximately 7% and 18% of 10 ppm APAP was degraded by 0.2 mM PMS in the Fe3O4 (0.8 g/L) and Fe2+ (0.1 mM) activation processes, respectively. However, as the reaction proceeded to 120 min, approximately 75% and 35% of the APAP was removed in the Fe3O4 and Fe2+ activation processes, respectively. Within 120 min, the mineralization of APAP was about 7.5% and 5.0% (initial APAP of 10 ppm and [PMS]0 of 0.2 mM) in the Fe3O4-PMS and Fe2+-PMS systems, while mineralization could be greatly increased to about 31% and 40%, respectively, as [PMS]0 increased to 2.0 mM. Finally, the production of reactive radical species was validated directly by electron paramagnetic resonance (EPR) tests with 0.1 M 5,5-dimethyl-1-pyrroline N-oxide (DMPO). Plausible mechanisms for radical generation from Fe3O4 and Fe2+ activation of PMS are proposed based on the results of the radical identification tests. The results demonstrate that Fe3O4-activated PMS and Fe2+-activated PMS systems are promising technologies for water pollution caused by contaminants such as pharmaceuticals; the Fe3O4-PMS system is more suitable for slow remediation, while the Fe2+-PMS system is more suitable for fast remediation.
Keywords: acetaminophen, peroxymonosulfate, radicals, Fe3O4
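A pseudo-first-order pattern means ln(C0/Ct) = k_obs * t, so the observed rate constant can be estimated by a least-squares fit through the origin. A minimal sketch of that standard fit, not the paper's own analysis code:

```python
import math

def pseudo_first_order_k(times, concentrations):
    """Estimate the observed rate constant k_obs from ln(C0/Ct) = k_obs * t
    by a least-squares fit constrained through the origin."""
    c0 = concentrations[0]
    y = [math.log(c0 / c) for c in concentrations]
    num = sum(t * yi for t, yi in zip(times, y))
    den = sum(t * t for t in times)
    return num / den
```

Plotting ln(C0/Ct) against t and checking the linearity of the points is what the quoted R2 > 0.95 refers to.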
Procedia PDF Downloads 254
2609 Sustainability in Hospitality: An Inevitable Necessity in New Age with Big Environmental Challenges
Authors: Majid Alizadeh, Sina Nematizadeh, Hassan Esmailpour
Abstract:
The mutual effects of hospitality and the environment are undeniable: the tourism industry has major harmful effects on the environment, and hotels, as one of the most important pillars of the hospitality industry, affect the environment significantly. Green marketing is a promising strategy in response to the growing concerns about the environment. A green hotel marketing model was proposed using a grounded theory approach in the hotel industry. The study was carried out as a mixed-method study. Data gathering in the qualitative phase was done through a literature review and in-depth, semi-structured interviews with 10 experts in green marketing, selected by the snowball technique. Following the primary analysis, open, axial, and selective coding was performed on the data, which yielded 69 concepts, 18 categories, and six dimensions. The green hotel (green product) was adopted as the core phenomenon. In the quantitative phase, data were gathered through 384 questionnaires filled out by hotel guests, and descriptive statistics and structural equation modeling (SEM) were used for data analysis. The results indicated that the mediating role of behavioral response between ecological literacy, trust, the marketing mix, and performance was significant. The green marketing mix, as a strategy, had a significant positive effect on guests' behavioral response, corporate green image, and the financial and environmental performance of hotels.
Keywords: green marketing, sustainable development, hospitality, grounded theory, structural equations model
Procedia PDF Downloads 81
2608 Identifying and Ranking Environmental Risks of Oil and Gas Projects Using the VIKOR Method for Multi-Criteria Decision Making
Authors: Sasan Aryaee, Mahdi Ravanshadnia
Abstract:
Naturally, any activity is associated with risk; humans have understood this concept since very long ago and seek to identify its factors and sources. Poor risk management can cause problems such as delays and unforeseen costs in development projects, temporary or permanent loss of services, loss or theft of information, complexity and limitations in processes, unreliable information caused by rework, holes in the systems, and many similar problems. In the present study, a model is presented to rank the environmental risks of oil and gas projects. The statistical population of the study consists of all executives active in the oil and gas fields, and the statistical sample was selected randomly. In the framework of the proposed method, the environmental risks of oil and gas projects were first extracted; then a questionnaire based on these indicators was designed on a Likert scale and distributed among the statistical sample. After assessing the validity and reliability of the questionnaire, the environmental risks of oil and gas projects were ranked using the VIKOR method of multi-criteria decision-making. The results showed that the best options for HSE planning of oil and gas projects, in that they reduce risks, personal injuries, and casualties while adding less cost and time to the project than the other options, are the release of dye into the environment when painting the generator pond and the presence of the rigger near the crane.
Keywords: ranking, multi-criteria decision making, oil and gas projects, HSE management, environmental risks
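The core VIKOR computation can be sketched as follows. This is the textbook algorithm, not the paper's implementation, and it assumes benefit-type criteria (larger is better) with distinct best and worst values per criterion:

```python
def vikor(matrix, weights, v=0.5):
    """Core VIKOR steps: per-criterion best/worst, weighted group utility S,
    individual regret R, and compromise score Q (lower Q = better).
    `matrix` is alternatives x criteria; `weights` sum to 1."""
    n_crit = len(weights)
    best = [max(row[j] for row in matrix) for j in range(n_crit)]
    worst = [min(row[j] for row in matrix) for j in range(n_crit)]
    S, R = [], []
    for row in matrix:
        d = [weights[j] * (best[j] - row[j]) / (best[j] - worst[j])
             for j in range(n_crit)]
        S.append(sum(d))   # group utility
        R.append(max(d))   # individual regret
    s_best, s_worst = min(S), max(S)
    r_best, r_worst = min(R), max(R)
    return [v * (S[i] - s_best) / (s_worst - s_best)
            + (1 - v) * (R[i] - r_best) / (r_worst - r_best)
            for i in range(len(matrix))]
```

The parameter v weighs group utility against individual regret; v = 0.5 is the usual consensus setting.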
Procedia PDF Downloads 156
2607 Encoded Fiber Optic Sensors for Simultaneous Multipoint Sensing
Authors: C. Babu Rao, Pandian Chelliah
Abstract:
Owing to their reliability, a number of fluorescence-spectrum-based fiber optic sensors have been developed for the detection and identification of hazardous chemicals such as explosives and narcotics. In high-security areas, such as airports, it is important to monitor multiple locations simultaneously, which calls for deploying a portable sensor at each location. However, the selectivity and sensitivity of these techniques depend on the spectral resolution of the spectral analyzer: the better the resolution, the larger the repertoire of chemicals that can be detected. A portable unit has limitations in meeting these requirements. Optical fibers can be employed to collect and transmit the spectral signal from the portable sensor head to a sensitive central spectral analyzer (CSA). For multipoint sensing, optical multiplexing of multiple sensor heads with the CSA has to be adopted. With multiplexing, however, while one sensor head is connected to the CSA, the rest remain unconnected for the turn-around period, and the larger the number of sensor heads, the longer this turn-around time becomes. To circumvent this limitation, we propose in this paper an optical encoding methodology that allows multiple portable sensor heads to be connected to a single CSA. Each portable sensor head is assigned a unique address. The spectra of every chemical detected through a sensor head are encoded with its unique address and can be identified at the CSA end. The proposed methodology is demonstrated through a simulation using MATLAB Simulink.
Keywords: optical encoding, fluorescence, multipoint sensing
Procedia PDF Downloads 708
2606 The Notion of International Criminal Law: Between Criminal Aspects of International Law and International Aspects of Criminal Law
Authors: Magda Olesiuk-Okomska
Abstract:
Although international criminal law has grown significantly in the last decades, it still remains fragmented and lacks doctrinal cohesiveness. Its concept is described in the doctrine as highly disputable, and there is no concrete definition of the term. In the domestic doctrine, the problem of criminal law issues that arise in the international setting, and of international issues that arise within national criminal law, is underdeveloped both theoretically and practically. To the best of the author's knowledge, there are no studies describing the international aspects of criminal law in a comprehensive manner that take a more expansive view of the subject. This paper presents the results of part of the doctoral research, undertaking a theoretical framework of international criminal law. It aims to sort out the existing terminology on the international aspects of criminal law. It demonstrates the differences between the notions of international criminal law, criminal law international, and law international criminal. It confronts the notion of criminal law with related disciplines and shows their interplay. It specifies the scope of international criminal law. It diagnoses the current legal framework of the international aspects of criminal law, referring both to criminal law issues that arise in the international setting and to international issues that arise in the context of national criminal law. Finally, de lege lata postulates are formulated and a direction of changes in international criminal law is proposed. The adopted research hypothesis assumed that the notion of international criminal law is inconsistent and not understood uniformly, that there is no conformity as to its place within the system of law or its objective and subjective scopes, and that the domestic doctrine does not correspond with international standards and differs from the worldwide doctrine.
The research methods implemented included, inter alia, a dogmatic-legal method, an analytical method, a comparative method, and desk research.
Keywords: criminal law, international crimes, international criminal law, international law
Procedia PDF Downloads 299
2605 Risk Assessment of Heavy Rainfall and Development of Damage Prediction Function for Gyeonggi-Do Province
Authors: Jongsung Kim, Daegun Han, Myungjin Lee, Soojun Kim, Hung Soo Kim
Abstract:
Recently, the frequency and magnitude of natural disasters have gradually increased due to climate change. In Korea especially, large-scale damage caused by heavy rainfall occurs frequently due to rapid urbanization. This study therefore proposes a Heavy rain Damage Risk Index (HDRI) using the PSR (Pressure-State-Response) structure for heavy rain risk assessment. We constructed a pressure index, a state index, and a response index for the risk assessment of each local government in Gyeonggi-do province; the evaluation indicators were determined by principal component analysis. The indices were standardized using the Z-score method, and HDRIs were then obtained for the 31 local governments in the province. The HDRI is categorized into three classes, with class 1 being the safest. As a result, 15 local governments fell in class 1, 7 in class 2, and 9 in class 3. From the study, we were able to identify the heavy-rainfall risk class of each local government. Developing a heavy rainfall damage prediction function by risk class is useful, and this was performed in this work. The risk classes can also inform decision-making for efficient disaster management. Acknowledgements: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Science, ICT & Future Planning (2017R1A2B3005695).
Keywords: natural disaster, heavy rain risk assessment, HDRI, PSR
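Z-score standardization of the three PSR sub-indices, followed by aggregation per local government, can be sketched as below; the equal-weight average used here is our own assumption, since the abstract does not state the aggregation rule:

```python
def z_scores(values):
    """Standardize indicator values across local governments to zero mean
    and unit (population) standard deviation."""
    n = len(values)
    mean = sum(values) / n
    std = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return [(v - mean) / std for v in values]

def hdri_scores(pressure, state, response):
    """Combine the three standardized PSR sub-indices per local government;
    the simple average used here is an assumed aggregation rule."""
    zp, zs, zr = z_scores(pressure), z_scores(state), z_scores(response)
    return [(p + s + r) / 3 for p, s, r in zip(zp, zs, zr)]
```

The resulting scores would then be cut into three classes, e.g. by thresholds or quantiles, to obtain the class-1/2/3 grouping reported above.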
Procedia PDF Downloads 198
2604 Improving Topic Quality of Scripts by Using Scene Similarity Based Word Co-Occurrence
Authors: Yunseok Noh, Chang-Uk Kwak, Sun-Joong Kim, Seong-Bae Park
Abstract:
Scripts are one of the basic text resources for understanding broadcast content. Since broadcast media wield a lot of influence over the public, tools for understanding broadcast content are increasingly required. Topic modeling is a method for obtaining a summary of broadcast content from its scripts. Generally, scripts represent content descriptively, with directions and speeches. Scripts also provide scene segments that can be seen as semantic units; therefore, a script can be topic-modeled by treating each scene segment as a document. However, because scripts consist mainly of speeches, relatively few co-occurrences among words are observed within the scene segments, which inevitably causes poor topic quality under statistical learning methods. To tackle this problem, we propose a method of learning with additional word co-occurrence information obtained using scene similarities. The main idea for improving topic quality is that the information that two or more texts are topically related is useful for learning high-quality topics; in addition, with high-quality topics, we can more accurately determine whether two texts are related. In this paper, we regard two scene segments as related if their topical similarity is high enough, and we consider words to co-occur if they appear together in topically related scene segments. In the experiments, we show that the proposed method generates higher-quality topics from Korean drama scripts than the baselines.
Keywords: broadcasting contents, scripts, text similarity, topic model
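The co-occurrence augmentation step can be sketched as follows: each scene segment contributes its own word pairs, and scene pairs judged topically related additionally contribute pairs over their merged vocabularies. This is our reading of the idea, with names of our own choosing, not the paper's code:

```python
from collections import Counter
from itertools import combinations

def cooccurrences(scenes, related_pairs):
    """Count word co-occurrences, treating each scene segment as a document
    and additionally merging the word sets of scene pairs judged topically
    related, to densify the sparse counts from speech-heavy scripts."""
    counts = Counter()

    def add(words):
        # Count each unordered pair of distinct words once per document.
        for w1, w2 in combinations(sorted(set(words)), 2):
            counts[(w1, w2)] += 1

    for scene in scenes:
        add(scene)
    for i, j in related_pairs:
        add(scenes[i] + scenes[j])
    return counts
```

In the full method, `related_pairs` would come from a topical-similarity threshold over the current topic model, and the augmented counts would feed the next round of topic learning.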
Procedia PDF Downloads 318
2603 Signal Amplification Using Graphene Oxide in Label Free Biosensor for Pathogen Detection
Authors: Agampodi Promoda Perera, Yong Shin, Mi Kyoung Park
Abstract:
The successful detection of pathogenic bacteria in blood provides important information for early detection, diagnosis, and the prevention and treatment of infectious diseases. Silicon microring resonators are refractive-index-based optical biosensors that provide highly sensitive, label-free, real-time multiplexed detection of biomolecules. We demonstrate a technique that uses graphene oxide (GO) to enhance the signal output of a silicon microring optical sensor. The activated carboxylic groups in GO molecules bind directly to single-stranded DNA with an amino-modified 5' end. This conjugation amplifies the shift in resonant wavelength in real time. We designed a 21 bp capture probe for the strain Staphylococcus aureus and a longer, 70 bp complementary target sequence; the mismatched target sequence used was a 70 bp sequence from Streptococcus agalactiae. GO is added after the complementary binding of probe and target; it conjugates to the unbound single-stranded segment of the target and increases the wavelength shift on the silicon microring resonator. Furthermore, our results show that GO successfully differentiates the mismatched DNA sequences from the complementary DNA sequence. The proposed concept could therefore effectively enhance the sensitivity of pathogen detection sensors.
Keywords: label free biosensor, pathogenic bacteria, graphene oxide, diagnosis
Procedia PDF Downloads 465
2602 Reliability of the Estimate of Earthwork Quantity Based on 3D-BIM
Authors: Jaechoul Shin, Juhwan Hwang
Abstract:
When the BIM method is applied to civil engineering in the area of free-formed structures, a comparatively high rate of construction productivity can be expected, as in building engineering. In this research, we examined the quantity calculation error when applying BIM to earthwork and to bridge construction (e.g., a PSC-I type segmental girder bridge and an integrated bridge of steel I-girders with an inverted-T bent cap), NATM (New Austrian Tunneling Method) tunnel construction, retaining wall construction, and culvert construction, and implemented a BIM-based 3D modeling quantity survey. We confirmed the high reliability of the BIM-based method in structural work, where errors fell in the range of -6% to +5%. In particular, the rock-type quantity calculation errors, in the range of -14% to +13%, revealed the problems of the existing 2D-CAD-based quantity calculation for earthwork and the room for its improvement, which shows the benefit and applicability of the BIM method in civil engineering. In addition, the routine method's error for the total earthwork quantity is as negligible as that for structural work, but the significant error in the rock-type quantities shows that the reliability of 2D-based volume calculation can be a problem. In estimating earthwork quantities, the proposed 3D-BIM-based method therefore has better reliability than the routine method. Considering the benefits of integrating information at the design, construction, and maintenance levels, the effectiveness of introducing BIM design in civil engineering and the possibility of applying it were confirmed.
Keywords: BIM, 3D modeling, 3D-BIM, quantity of earthwork
Procedia PDF Downloads 441
2601 Designing the First Oil Tanker Shipyard Facility in Kuwait
Authors: Fatma Al Abdullah, Shahad Al Ameer, Ritaj Jaragh, Fatimah Khajah, Rawan Qambar, Amr Nounou
Abstract:
Kuwait currently has its tankers manufactured in foreign countries. Oil tankers play a role in the supply chain of the oil industry; therefore, with its sufficient financial resources, Kuwait should secure itself strategically in order to protect its oil industry and sustain economic development. The purpose of this report is to design an oil tanker shipyard facility. Basing the shipyard facility in Kuwait will bring great economic rewards: the shipbuilding industry directly enhances the industrial chain in terms of new job and business opportunities as well as educational fields. Heavy Engineering Industries & Shipbuilding Co. K.S.C. (HEISCO) was chosen as a host because its existing infrastructure and expertise will reduce cost. The facility design methodology was chosen because it covers all aspects needed for the report. The oil tanker market is witnessing a shift from crude tankers to product tankers; therefore, the Panamax tanker (a product tanker) was selected to be manufactured in the facility. The departments needed in the shipyard were identified by studying different global shipyards, and the technologies needed to build ships informed the process design. It was noted that ships are engineered to order. The layout of the proposed shipyard is currently under development, and a feasibility study will be conducted to ensure the success of the facility once the layout is complete.
Keywords: oil tankers, shipbuilding, shipyard, facility design, Kuwait
Procedia PDF Downloads 465
2600 A Clustering Algorithm for Massive Texts
Authors: Ming Liu, Chong Wu, Bingquan Liu, Lei Chen
Abstract:
Internet users face a massive amount of textual data every day. Organizing texts into categories can help users dig useful information out of large-scale text collections. Clustering is in fact one of the most promising tools for categorizing texts due to its unsupervised nature. Unfortunately, most traditional clustering algorithms lose their effectiveness on large-scale text collections, mainly because of the high-dimensional vectors generated from texts. To cluster large-scale text collections effectively and efficiently, this paper proposes a vector-reconstruction-based clustering algorithm in which only the features that can represent a cluster are preserved in that cluster’s representative vector. The algorithm alternately repeats two sub-processes until it converges. One is the partial tuning sub-process, where feature weights are fine-tuned iteratively; to accelerate clustering, an intersection-based similarity measurement and its corresponding neuron adjustment function are proposed and implemented in this sub-process. The other is the overall tuning sub-process, where features are reallocated among different clusters and the features useless for representing a cluster are removed from its representative vector. Experimental results on three text collections (two small-scale and one large-scale) demonstrate that our algorithm obtains high quality on both small-scale and large-scale text collections.
Keywords: vector reconstruction, large-scale text clustering, partial tuning sub-process, overall tuning sub-process
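The intersection idea — scoring documents only on the features they share, which is cheap for sparse text vectors — can be sketched as follows. The exact measurement used in the paper is not reproduced here; this is one plausible form, with made-up term weights.

```python
def intersection_similarity(v1, v2):
    """Similarity computed from shared features only: the sum of minimum
    weights over intersecting terms, normalised by the smaller vector's
    total weight. (Illustrative form, not the paper's exact measure.)"""
    shared = set(v1) & set(v2)
    if not shared:
        return 0.0
    overlap = sum(min(v1[t], v2[t]) for t in shared)
    return overlap / min(sum(v1.values()), sum(v2.values()))

# Sparse term-weight vectors for two hypothetical documents
doc_a = {"cluster": 0.6, "text": 0.3, "vector": 0.1}
doc_b = {"cluster": 0.5, "vector": 0.2, "graph": 0.3}
sim = intersection_similarity(doc_a, doc_b)
```

Because only the intersection of the two sparse vectors is visited, the cost per comparison stays proportional to the shared vocabulary rather than the full feature space, which is what makes such measures attractive at large scale.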
Procedia PDF Downloads 433
2599 Low-Cost Mechatronic Design of an Omnidirectional Mobile Robot
Authors: S. Cobos-Guzman
Abstract:
This paper presents the results of a mechatronic design based on a 4-wheel omnidirectional mobile robot that can be used in indoor logistics applications. The low-level control relies on two open-source hardware platforms (Raspberry Pi 3 Model B+ and Arduino Mega 2560) that control four industrial motors, four ultrasound sensors, four optical encoders, a vision system of two cameras, and a Hokuyo URG-04LX-UG01 laser scanner. The system is powered by a lithium battery that supplies 24 V DC with a maximum capacity of 20 Ah. The Robot Operating System (ROS) has been implemented on the Raspberry Pi, and the performance of the selected sensors and hardware is evaluated. Based on different tests, the mechatronic system is evaluated and safe modes of power distribution for controlling all the electronic devices are proposed. Accordingly, recommendations are given for using the Raspberry Pi and Arduino in terms of power, communication, and distribution of control among the devices; following these recommendations, the sensors are distributed between the two real-time controllers (Arduino and Raspberry Pi). In addition, the camera drivers have been implemented in Linux, and a Python program has been written to access the cameras. These cameras will be used to implement a deep learning algorithm to recognize people and objects, increasing the level of intelligence in combination with the maps obtained from the laser scanner.
Keywords: autonomous, indoor robot, mechatronic, omnidirectional robot
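A power-distribution plan like the one above starts from a simple budget: battery energy divided by the summed load power gives an upper bound on runtime. The wattages below are hypothetical placeholders, not the robot's measured draws.

```python
def runtime_hours(battery_v, battery_ah, loads_w):
    """Upper-bound runtime estimate: battery energy (Wh) / total load power (W)."""
    return battery_v * battery_ah / sum(loads_w)

# Hypothetical draws (W): four motors, Raspberry Pi, Arduino, sensors + laser scanner
t = runtime_hours(24.0, 20.0, [4 * 60.0, 10.0, 2.0, 8.0])
```

Real runtime will be shorter (converter losses, battery derating), but a budget of this kind is the usual first check before deciding how to split devices across supply rails.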
Procedia PDF Downloads 175
2598 An Energy Efficient Spectrum Shaping Scheme for Substrate Integrated Waveguides Based on Spread Reshaping Code
Authors: Yu Zhao, Rainer Gruenheid, Gerhard Bauch
Abstract:
In the microwave and millimeter-wave transmission region, the substrate-integrated waveguide (SIW) is a very promising candidate for the development of circuits and components, facilitating transmission at data rates in excess of 200 Gbit/s. An SIW mimics a rectangular waveguide by approximating the closed sidewalls with a via fence. This structure suppresses low-frequency components and makes the SIW channel a bandpass or high-pass filter, which impedes conventional baseband transmission using the non-return-to-zero (NRZ) pulse shaping scheme. Mixers are therefore commonly proposed as carrier modulator and demodulator in order to enable passband transmission. However, carrier modulation is not an energy-efficient solution, because modulation and demodulation at high frequencies consume a lot of energy. For the first time to our knowledge, this paper proposes a low-complexity spectrum shaping scheme for the SIW channel, namely the spread reshaping code. It aims at matching the spectrum of the transmit signal to the channel frequency response, enabling transmission through the SIW channel while avoiding carrier modulation; in some cases, it does not even need equalization. Simulations reveal good performance of this scheme: an open eye is achieved without any equalization or modulation for the respective transmission channels.
Keywords: bandpass channel, eye-opening, switching frequency, substrate-integrated waveguide, spectrum shaping scheme, spread reshaping code
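The principle — reshaping the transmit spectrum by line coding so that little energy sits at the low frequencies the channel suppresses — can be demonstrated with a classic example. Manchester coding is used below purely as an illustration of moving spectral energy away from DC; it is not the spread reshaping code proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 4096)

# NRZ: one bipolar sample per bit. Manchester: each bit becomes a +/- pair,
# which forces a spectral null at DC -- a simple example of a spectrum
# shaping code suited to a channel that blocks low frequencies.
nrz = 2.0 * bits - 1.0
manchester = np.empty(2 * bits.size)
manchester[0::2] = 2.0 * bits - 1.0
manchester[1::2] = -(2.0 * bits - 1.0)

def dc_fraction(signal, low_bins=8):
    """Fraction of signal energy in the lowest few frequency bins."""
    spec = np.abs(np.fft.rfft(signal)) ** 2
    return spec[:low_bins].sum() / spec.sum()

low_nrz = dc_fraction(nrz)
low_man = dc_fraction(manchester)   # far less energy near DC than NRZ
```

A code matched to the SIW's bandpass response plays the same role, at the cost of bandwidth expansion (here, Manchester doubles the symbol rate).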
Procedia PDF Downloads 159
2597 Investigation of Mode II Fracture Toughness in Orthotropic Materials
Authors: Mahdi Fakoor, Nabi Mehri Khansari, Ahmadreza Farokhi
Abstract:
Evaluation of mode II fracture toughness (KIIC) in composite materials is a hard problem, since it can be affected by many dissipation mechanisms. Furthermore, non-linearity in material behavior adds extra difficulty in obtaining accurate results; the widely differing KIIC values reported in various references confirm this assertion. In this research, solutions are proposed in the form of corrections that should be applied to the common test fixtures. Because the common fixtures are not able to correctly activate toughening mechanisms in pure mode II, we applied structural modifications to them, taking the Iosipescu test as a starting point. The tests were performed on graphite/epoxy, PMMA, and Western White Pine wood. Mixed-mode I/II fracture limit curves are also used to show that the scatter in test results is related to the creation of the Fracture Process Zone (FPZ). In the present paper, shear load is applied at the predicted shear zone through structural amendments that activate mode II toughening mechanisms. The employed empirical method significantly improves repeatability and reproducibility as well. Moreover, a 3D finite element (FE) analysis is performed to verify the obtained results. It is shown that remarkable precision can be obtained with the modified fixture in comparison with the previous one.
Keywords: FPZ, shear test fixture, mode II fracture toughness, composite material, FEM
Procedia PDF Downloads 361
2596 Automated Digital Mammogram Segmentation Using Dispersed Region Growing and Pectoral Muscle Sliding Window Algorithm
Authors: Ayush Shrivastava, Arpit Chaudhary, Devang Kulshreshtha, Vibhav Prakash Singh, Rajeev Srivastava
Abstract:
Early diagnosis of breast cancer can improve the survival rate by detecting cancer at an early stage. Breast region segmentation is an essential step in the analysis of digital mammograms, and accurate image segmentation leads to better detection of cancer. It aims at separating the Region of Interest (ROI) from the rest of the image. The procedure begins with the removal of labels, annotations, and tags from the mammographic image using a morphological opening method. The Pectoral Muscle Sliding Window Algorithm (PMSWA) is then used to remove the pectoral muscle from the mammogram; this is necessary because the intensity values of the pectoral muscle are similar to those of the ROI, which makes it difficult to separate. After removing the pectoral muscle, the Dispersed Region Growing Algorithm (DRGA) is used to segment the mammogram; it disperses seeds in different regions instead of using a single bright region. To demonstrate the validity of our segmentation method, 322 mammographic images from the Mammographic Image Analysis Society (MIAS) database are used. The dataset contains the medio-lateral oblique (MLO) view of mammograms. Experimental results on the MIAS dataset show the effectiveness of our proposed method.
Keywords: CAD, dispersed region growing algorithm (DRGA), image segmentation, mammography, pectoral muscle sliding window algorithm (PMSWA)
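The dispersed-seed idea can be sketched with a minimal intensity-threshold region growing on a toy image: each seed grows its own region by 4-connectivity as long as pixel intensities stay close to the seed's. This is a simplified sketch of the concept, not the full DRGA with its seed-dispersal strategy.

```python
import numpy as np
from collections import deque

def region_grow(img, seeds, tol=10):
    """Grow labelled regions from several dispersed seeds: a pixel joins a
    region if its intensity is within `tol` of that region's seed intensity.
    (Simplified sketch of dispersed-seed region growing, not the full DRGA.)"""
    labels = np.zeros(img.shape, dtype=int)
    for label, (r, c) in enumerate(seeds, start=1):
        ref, queue = int(img[r, c]), deque([(r, c)])
        while queue:
            y, x = queue.popleft()
            if labels[y, x] or abs(int(img[y, x]) - ref) > tol:
                continue
            labels[y, x] = label
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < img.shape[0] and 0 <= nx < img.shape[1] and not labels[ny, nx]:
                    queue.append((ny, nx))
    return labels

# Toy 3x3 "mammogram": a bright patch (~200), dark background (~20), one spot (180)
img = np.array([[200, 200, 20],
                [200, 20, 20],
                [20, 20, 180]], dtype=np.uint8)
labels = region_grow(img, seeds=[(0, 0), (2, 2)])
```

Dispersing seeds lets spatially separated structures of similar brightness receive distinct labels, instead of everything growing from one bright region.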
Procedia PDF Downloads 310
2595 Identifying Knowledge Gaps in Incorporating Toxicity of Particulate Matter Constituents for Developing Regulatory Limits on Particulate Matter
Authors: Ananya Das, Arun Kumar, Gazala Habib, Vivekanandan Perumal
Abstract:
Regulatory bodies have proposed limits on particulate matter (PM) concentration in air; however, these limits do not explicitly incorporate the toxic effects of the constituents of PM. This study aimed to provide a structured approach for incorporating the toxic effects of components when developing regulatory limits on PM. A four-step human health risk assessment framework was used, consisting of: (1) hazard identification (parameters: PM and its constituents and their associated toxic effects on health), (2) exposure assessment (parameters: concentrations of PM and constituents; information on the size and shape of PM; fate and transport of PM and constituents in the respiratory system), (3) dose-response assessment (parameters: reference dose or target toxicity dose of PM and its constituents), and (4) risk estimation (metric: hazard quotient and/or lifetime incremental risk of cancer, as applicable). The parameters required at each step were then obtained from the literature. Using this information, an attempt was made to determine limits on PM using component-specific information. An example calculation was conducted for exposures to PM2.5 and its metal constituents in the Indian ambient environment. The identified data gaps were: (1) concentrations of PM and its constituents and their relationship with the sampling regions, and (2) the relationship between the toxicity of PM and its components.
Keywords: air, component-specific toxicity, human health risks, particulate matter
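Step 4 of the framework reduces to a simple ratio for non-carcinogenic effects: the hazard quotient is the estimated dose divided by the reference dose, and constituent HQs can be summed into a hazard index. The doses and reference doses below are hypothetical placeholders, not values from the study or from any regulatory table.

```python
def hazard_quotient(daily_dose, reference_dose):
    """Step-4 risk metric: HQ = exposure dose / reference dose.
    HQ > 1 flags a potential non-carcinogenic health risk."""
    return daily_dose / reference_dose

# Hypothetical PM2.5-bound metal exposures: {constituent: (dose, RfD)} in mg/kg-day
constituents = {"Pb": (2.0e-4, 3.5e-3),
                "Cd": (5.0e-5, 1.0e-3)}
hq_total = sum(hazard_quotient(d, rfd) for d, rfd in constituents.values())
```

Working backwards from a target hazard index (e.g. 1) through the same ratios is one route to a component-aware PM limit, which is the kind of calculation the paper's framework supports.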
Procedia PDF Downloads 309
2594 Machine Learning Based Approach for Measuring Promotion Effectiveness in Multiple Parallel Promotions’ Scenarios
Authors: Revoti Prasad Bora, Nikita Katyal
Abstract:
Promotion is a key element in the retail business; thus, analyzing promotions to quantify their effectiveness in terms of revenue and/or margin is an essential activity in the retail industry. However, measuring the sales/revenue uplift relies on estimation, as the actual sales/revenue without the promotion is not observable. The presence of halo and cannibalization effects in a multiple parallel promotions scenario further complicates the problem. Calculating the baseline from inter-brand/competitor items, or estimating the impact of halo and cannibalization on revenue by interpreting the baseline as an item's unit sales in neighboring non-promotional weeks individually, may not capture the overall revenue uplift when multiple promotions run in parallel. Hence, this paper proposes a machine-learning-based method for calculating the revenue uplift that accounts for the halo and cannibalization impact on both the baseline and the revenue. In the first part of the proposed methodology, the baseline of an item is calculated by incorporating the impact of the promotions on its related items. In the later part, the revenue of an item is calculated by considering both halo and cannibalization impacts. This methodology thereby enables correct calculation of the overall revenue uplift due to a given promotion.
Keywords: halo, cannibalization, promotion, baseline, temporary price reduction, retail, elasticity, cross price elasticity, machine learning, random forest, linear regression
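Once a baseline is in hand, the accounting step is straightforward: overall uplift is the own-item lift plus halo gains on related items minus cannibalized revenue. The paper estimates the baseline itself with machine-learning models (e.g. random forest); fixed, made-up numbers stand in for those estimates here.

```python
def revenue_uplift(promo_rev, baseline_rev, halo_rev_delta, cannib_rev_delta):
    """Overall revenue uplift of a promotion once cross-item effects are included:
    own-item lift over baseline, plus halo gains on related items, minus
    revenue cannibalised from other items."""
    return (promo_rev - baseline_rev) + halo_rev_delta - cannib_rev_delta

# Hypothetical weekly figures for one promoted item and its related items
uplift = revenue_uplift(promo_rev=12000.0, baseline_rev=9000.0,
                        halo_rev_delta=800.0, cannib_rev_delta=1200.0)
```

Ignoring the last two terms — as a naive neighboring-weeks baseline comparison effectively does — would overstate this promotion's uplift by the net cross-item effect.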
Procedia PDF Downloads 175
2593 Experimental Investigation, Analysis and Optimization of Performance and Emission Characteristics of Composite Oil Methyl Esters at 160 bar, 180 bar and 200 bar Injection Pressures by Multifunctional Criteria Technique
Authors: Yogish Huchaiah, Chandrashekara Krishnappa
Abstract:
This study considers the optimization and validation of experimental results using the Multi-Functional Criteria Technique (MFCT), which is concerned with structuring and solving decision and planning problems involving multiple variables. Biodiesel was produced from Composite Oil Methyl Esters (COME) of Jatropha and Pongamia oils mixed in various proportions via a two-step transesterification process, and was tested for various physico-chemical properties, which were found to be within the limits proposed by ASTM. The methyl esters were then blended with petrodiesel in various proportions and coded. These blends were used as fuels in a computerized CI DI engine to investigate performance and emission characteristics. From the analysis of the results, the 180MEM4B20 blend was found to give the maximum performance and minimum emissions. To validate the experimental results, MFCT was used, with characteristics such as Fuel Consumption (FC), Brake Power (BP), Brake Specific Fuel Consumption (BSFC), Brake Thermal Efficiency (BTE), carbon dioxide (CO2), carbon monoxide (CO), hydrocarbons (HC), and nitrogen oxides (NOx) considered as dependent variables. The application of this method showed that the optimized combination of Injection Pressure (IP), mix, and blend is 178MEM4.2B24. The overall variation between the optimization and experimental results was found to be 7.45%.
Keywords: COME, IP, MFCT, optimization, PI, PN, PV
Procedia PDF Downloads 210
2592 Performance Assessment of Multi-Level Ensemble for Multi-Class Problems
Authors: Rodolfo Lorbieski, Silvia Modesto Nassar
Abstract:
Many supervised machine learning tasks require decision making across numerous different classes. Multi-class classification has several applications, such as face recognition, text recognition, and medical diagnostics. The objective of this article is to analyze an adapted Stacking method for multi-class problems, which combines ensembles within the ensemble itself. For this purpose, a training procedure similar to Stacking was used, but with three levels: the final decision-maker (level 2) is trained by combining the outputs of the pair of meta-classifiers (level 1) from tree-based and Bayesian families, which are in turn trained by pairs of base classifiers (level 0) of the same family. This strategy seeks to promote diversity among the ensembles forming the level-2 meta-classifier. Three performance measures were used: (1) accuracy, (2) area under the ROC curve, and (3) time, across three factors: (a) dataset, (b) experiment, and (c) level. To compare the factors, a three-way ANOVA test was executed for each performance measure, considering 5 datasets by 25 experiments by 3 levels. A triple interaction between factors was observed only for time. Accuracy and area under the ROC curve presented similar results, showing a double interaction between level and experiment, as well as with the dataset factor. It was concluded that level 2 performed above the other levels on average and that the proposed method is especially efficient for multi-class problems when compared to binary problems.
Keywords: stacking, multi-layers, ensemble, multi-class
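The three-level layout can be sketched by nesting scikit-learn's `StackingClassifier`: level-0 pairs of same-family learners feed level-1 meta-classifiers, and a level-2 decision-maker combines those. The family choices, hyperparameters, and dataset below are illustrative assumptions, not the paper's exact configuration.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import StackingClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Level 1a: a meta-classifier over a pair of Bayesian-family base learners
level1_a = StackingClassifier(
    estimators=[("nb1", GaussianNB()),
                ("nb2", GaussianNB(var_smoothing=1e-3))],
    final_estimator=DecisionTreeClassifier(random_state=0))

# Level 1b: a meta-classifier over a pair of tree-family base learners
level1_b = StackingClassifier(
    estimators=[("dt1", DecisionTreeClassifier(max_depth=2, random_state=0)),
                ("dt2", DecisionTreeClassifier(max_depth=4, random_state=0))],
    final_estimator=DecisionTreeClassifier(random_state=0))

# Level 2: the final decision-maker combining the two level-1 meta-classifiers
level2 = StackingClassifier(
    estimators=[("meta_a", level1_a), ("meta_b", level1_b)],
    final_estimator=LogisticRegression(max_iter=500))

X, y = load_iris(return_X_y=True)   # a small stand-in multi-class dataset
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
level2.fit(X_tr, y_tr)
acc = level2.score(X_te, y_te)
```

Keeping each level-1 ensemble within one family while mixing families at level 2 is what produces the diversity the method relies on.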
Procedia PDF Downloads 268
2591 Machine Learning for Classifying Risks of Death and Length of Stay of Patients in Intensive Unit Care Beds
Authors: Itamir de Morais Barroca Filho, Cephas A. S. Barreto, Ramon Malaquias, Cezar Miranda Paula de Souza, Arthur Costa Gorgônio, João C. Xavier-Júnior, Mateus Firmino, Fellipe Matheus Costa Barbosa
Abstract:
Information and Communication Technologies (ICT) in healthcare are crucial for efficiently delivering medical services to patients. These ICTs, also known as e-health, comprise technologies such as electronic record systems, telemedicine systems, and personalized diagnostic devices. The focus of e-health is to improve the quality of health information, strengthen national health systems, and ensure accessible, high-quality health care for all. All the data gathered by these technologies make it possible to support clinical staff with automated decisions using machine learning. In this context, we collected patient data such as heart rate, oxygen saturation (SpO2), blood pressure, respiration, and others. With these data, we were able to develop machine learning models for patients’ risk of death and for estimating the length of stay in ICU beds. This paper presents the methodology for applying machine learning techniques to develop these models. Although we implemented the models on an IoT healthcare platform to help clinical staff in an ICU, it remains essential to create a robust clinical validation process and to monitor the proposed models.
Keywords: ICT, e-health, machine learning, ICU, healthcare
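The vitals listed above are exactly the kind of features a risk model consumes. Before (or alongside) a trained model, a threshold-based early-warning score over the same signals is a common baseline; the thresholds below are illustrative in the spirit of clinical early-warning scores, not the paper's trained model or any validated scale.

```python
def warning_score(hr, spo2, sbp, rr):
    """Toy early-warning score from ICU vitals: heart rate (bpm), SpO2 (%),
    systolic blood pressure (mmHg), respiratory rate (breaths/min).
    Thresholds are illustrative, not clinically validated."""
    score = 0
    score += 2 if hr > 110 or hr < 50 else 0
    score += 2 if spo2 < 92 else 0
    score += 2 if sbp < 90 else 0
    score += 2 if rr > 24 or rr < 10 else 0
    return score

high_risk = warning_score(hr=125, spo2=89, sbp=85, rr=28)   # all four deranged
low_risk = warning_score(hr=80, spo2=98, sbp=120, rr=16)    # all within range
```

A machine-learning model plays the same role with learned, continuous weightings instead of hand-set cut-offs, which is why such a rule-based score also makes a useful benchmark during clinical validation.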
Procedia PDF Downloads 107
2590 Structural Testing and the Finite Element Modelling of Anchors Loaded Against Partially Confined Surfaces
Authors: Ali Karrech, Alberto Puccini, Ben Galvin, Davide Galli
Abstract:
This paper summarises the laboratory tests, numerical models, and statistical approach developed to investigate the behaviour of concrete blocks loaded in shear through metallic anchors. The research is proposed to bridge a gap in the state of the art and practice related to anchors loaded against partially confined concrete surfaces. Eight concrete blocks (420 mm x 500 mm x 1000 mm) with anchors embedded 150 and/or 250 mm deep were tested. The stainless-steel anchors, 16 mm in diameter, were bonded with HIT-RE 500 V4 injection epoxy resin and were subjected to shear loading against partially supported edges. In addition, finite element models were constructed to validate the laboratory tests and to explore the influence of key parameters such as anchor depth, anchor distance from the edge, and compressive strength on the stability of the block. After experimental validation, the numerical results were used to populate, develop, and interpret a systematic parametric study based on the Design of Experiments approach, using a Box-Behnken design and Response Surface Methodology. An empirical model derived from this approach predicts the load capacity within the desired confidence intervals.
Keywords: finite element modelling, design of experiment, response surface methodology, Box-Behnken design, empirical model, interval of confidence, load capacity
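Response Surface Methodology of the kind described fits a second-order polynomial in the design factors to the responses from the (here, FE-populated) experimental runs. The sketch below fits such a surface by least squares to synthetic data standing in for validated FE results; the factor ranges and coefficients are assumptions for illustration, not the study's data or empirical model.

```python
import numpy as np

# Synthetic "runs": (anchor depth, edge distance) -> load capacity, standing in
# for the FE results that populate the Box-Behnken parametric study.
rng = np.random.default_rng(1)
depth = rng.uniform(150, 250, 30)    # anchor depth, mm (assumed range)
edge = rng.uniform(50, 150, 30)      # edge distance, mm (assumed range)
capacity = (5.0 + 0.08 * depth + 0.05 * edge - 1e-4 * depth * edge
            + rng.normal(0.0, 0.5, 30))   # hypothetical capacity, kN

# Second-order response surface:
# q = b0 + b1*d + b2*e + b3*d*e + b4*d^2 + b5*e^2
X = np.column_stack([np.ones_like(depth), depth, edge,
                     depth * edge, depth**2, edge**2])
coef, *_ = np.linalg.lstsq(X, capacity, rcond=None)

pred = X @ coef
r2 = 1.0 - ((capacity - pred)**2).sum() / ((capacity - capacity.mean())**2).sum()
```

The fitted coefficients give the empirical model; prediction intervals around `pred` (from the residual variance and the design matrix) are what supply the confidence bounds on load capacity.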
Procedia PDF Downloads 21