Search results for: Random simple polygon generation.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7888

5788 Drawbacks of Second Generation Urban Re-Development in Addis Ababa

Authors: Ezana Haddis Weldeghebrael

Abstract:

Addis Ababa City Administration is engaged in a massive facelift of the inner city. The paper therefore aims to analyze the challenges of the current urban regeneration effort, paying special attention to the Lideta and Basha Wolde Chilot projects. To this end, the paper adopted a documentary research strategy to collect the data and used an institutionalist perspective, together with the concept of urban regeneration, to analyze it. The sources were selected based on relevance and recency. Academic research outputs were used primarily; where scholarly publications were not available, institutional reports, newspaper articles, and expert presentations were used. The major findings of the research revealed that although the second generation of urban redevelopment projects has attempted to involve affected groups and has succeeded in designing better neighborhoods, it is riddled with three major drawbacks. The first is institutional constraints, i.e., the absence of an urban redevelopment strategy and a housing policy, a broad definition of ‘public purpose’, little regard for informal businesses, limitations on rights groups, negotiation power not devolved to the sub-city level, and no plan for groups that cannot afford the down payment for low-cost apartments. The second is planning limitations, i.e., the absence of genuine affected-group participation and a merely consultative level of public engagement. The third is implementation failure, i.e., no regard for maintaining social bonds, non-participatory and ill-informed resettlement, interference from senior government officials, failure to protect the poor from speculators, corruption, and disregard for heritage buildings. Based on the findings, the paper concludes that the current inner-city redevelopment has failed to be socially sustainable and calls for the enactment of a housing policy and a redevelopment strategy, affected-group participation, on-site resettlement, empowering the sub-city to manage the project, and allowing housing rights groups to advocate for poor slum dwellers.

Keywords: participation, redevelopment, planning, implementation, consultation

Procedia PDF Downloads 413
5787 Impact of Revenue Reform on Vulnerable Communities

Authors: Pauliasi Tony Fakahau

Abstract:

This paper provides an overview of the impact of the revenue reform programme on vulnerable communities in the Kingdom of Tonga. Economic turmoil and mismanagement during the late 1990s forced the government to seek technical and financial assistance from the Asian Development Bank to undertake a comprehensive Economic and Public Sector Reform (EPSR) programme. The EPSR is a Western model recommended by donor agencies as the solution to Tonga’s economic challenges. The EPSR programme included public sector reform, private sector growth, and revenue generation. Tax reform was the main tool for revenue generation, which set out to strengthen tax compliance and administration as well as implement a value-added consumption tax. The EPSR is based on Western values and ideology but failed to recognise that Tongan cultural values are important to the local community. Two participant groups were interviewed. Participant group one consisted of 51 people representing vulnerable communities. Participant group two consisted of six people from the government and business sector who were from the elite of Tongan society. The Kakala Research Methodology provided the framework for the research, and the Talanoa Research Method was used to conduct semi-structured interviews in the homes of the first group and in the workplaces of the second group. The research found that the consumption tax placed a heavy burden on the purchasing power of participant group one (vulnerable participants), affecting nearly every financial transaction they made. Participant group one's main financial priorities were kavenga fakalotu (obligations to the church), kavenga fakafāmili (obligations to the family) and kavenga fakafonua (obligations to cultural events for the village, nobility, and royalty). The findings identified inequalities arising from the revenue reform, especially from the consumption tax, for vulnerable people and communities compared with the elite of society. The research concluded that government and donor agencies need ameliorating policies to reduce the burden of tax on vulnerable groups, who are more susceptible to the impact of revenue reform.

Keywords: tax reform, tonga vulnerable community revenue, revenue reform, public sector reform

Procedia PDF Downloads 104
5786 Generation of ZnO-Au Nanocomposite in Water Using Pulsed Laser Irradiation

Authors: Elmira Solati, Atousa Mehrani, Davoud Dorranian

Abstract:

The generation of ZnO-Au nanocomposites under laser irradiation of a mixture of ZnO and Au colloidal suspensions is experimentally investigated. In this work, ZnO and Au nanoparticles were first prepared by pulsed laser ablation of the corresponding targets in water using the 1064 nm wavelength of an Nd:YAG laser. In a second step, the produced ZnO and Au colloidal suspensions were mixed in different volumetric ratios and irradiated using the second harmonic of an Nd:YAG laser operating at 532 nm. The changes in the size of the nanostructures and the optical properties of the ZnO-Au nanocomposites are studied as a function of the volumetric ratio of the ZnO and Au colloidal suspensions. The crystalline structure of the ZnO-Au nanocomposites was analyzed by X-ray diffraction (XRD). The optical properties of the samples were examined at room temperature by a UV-Vis-NIR absorption spectrophotometer. Transmission electron microscopy (TEM) was performed by placing a drop of the concentrated suspension on a carbon-coated copper grid. To further confirm the morphology of the ZnO-Au nanocomposites, scanning electron microscopy (SEM) analysis was performed. Room-temperature photoluminescence (PL) was measured to characterize the luminescence properties of the ZnO-Au nanocomposites, and the samples were also characterized by Fourier transform infrared (FTIR) spectroscopy. The X-ray diffraction pattern shows that the ZnO-Au nanocomposites have the polycrystalline structure of Au. The transmission electron microscope images reveal that the joining of the Au and ZnO nanoparticles involves their adhesion. The plasmon peak of the ZnO-Au nanocomposites was red-shifted and broadened in comparison with pure Au nanoparticles. Using Tauc's equation, the band gap energy of the ZnO-Au nanocomposites is calculated to be 3.15–3.27 eV. In this work, the formation of ZnO-Au nanocomposites shifts the FTIR peaks of the metal oxide bands to higher wavenumbers. The PL spectra of the ZnO-Au nanocomposites show several weak peaks in the ultraviolet region and several relatively strong peaks in the visible region. The SEM images indicate that the ZnO-Au nanocomposites produced in water are spherical. The TEM images demonstrate that the adhesion increases with increasing volumetric ratio of the Au colloidal suspension. According to the size distribution graphs, the fraction of smaller ZnO-Au nanocomposites also increases with increasing volumetric ratio of the Au colloidal suspension.
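
For context, Tauc's relation referred to above is commonly written as follows (a standard form, not quoted in the abstract; B is a material-dependent constant and n depends on the transition type, with n = 1/2 for a direct allowed transition):

$$(\alpha h\nu)^{1/n} = B\,(h\nu - E_g)$$

so that, for a direct-gap material, plotting $(\alpha h\nu)^2$ against $h\nu$ and extrapolating the linear region to zero absorption yields the optical band gap $E_g$.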

Keywords: Au nanoparticles, pulsed laser ablation, ZnO-Au nanocomposites, ZnO nanoparticles

Procedia PDF Downloads 322
5785 Quantum Confinement in LEEH Capped CdS Nanocrystalline

Authors: Mihir Hota, Namita Jena, S. N. Sahu

Abstract:

LEEH (L-cysteine ethyl ester hydrochloride) capped CdS semiconductor nanocrystals were grown at 80 °C using a simple chemical route. Photoluminescence (PL), optical absorption (UV-Vis), and transmission electron microscopy (TEM) were carried out to evaluate the structural and optical properties of the nanocrystals. Optical absorption studies were used to optimize the sample. XRD and TEM analysis show that the nanocrystals have an FCC structure with an average size of 3 nm, while a band gap of 2.84 eV is estimated from the photoluminescence analysis. The nanocrystals emit bluish light when excited with a 355 nm laser.
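
As context for the quantum confinement claim (the expression below is not given in the abstract itself), the Brus effective-mass approximation is commonly invoked to relate the nanocrystal radius R to the widened band gap, where the effective masses and dielectric constant are those of bulk CdS:

$$E_g(R) \;\approx\; E_g^{\mathrm{bulk}} + \frac{\hbar^2 \pi^2}{2R^2}\left(\frac{1}{m_e^*} + \frac{1}{m_h^*}\right) - \frac{1.8\,e^2}{4\pi\varepsilon_0\varepsilon_r R}.$$

With a bulk CdS gap of about 2.42 eV, a measured gap of 2.84 eV is consistent with particles of a few nanometres, in line with the 3 nm TEM estimate.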

Keywords: cadmium sulphide, nanostructures, luminescence, optical properties

Procedia PDF Downloads 384
5784 On the Zeros of the Degree Polynomial of a Graph

Authors: S. R. Nayaka, Putta Swamy

Abstract:

A graph polynomial is one of the algebraic representations of a graph, and the degree polynomial is one of the simplest. The degree polynomial of a graph G of order n is the polynomial Deg(G, x) whose coefficient of x^i is deg(G, i), the number of vertices of degree i in G. In this article, we investigate the behavior of the roots of the degree polynomials of some families of graphs in the complex field. We identify the graphs whose degree polynomials have only integral roots. Further, we characterize the graphs having a single root or having only real roots, and the behavior of the polynomial at particular values is also obtained.
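
A minimal sketch of this definition (assuming Deg(G, x) = Σ_i deg(G, i)·x^i, the usual reading of the description above), using networkx and numpy; the Petersen graph is only an illustrative input:

```python
import numpy as np
import networkx as nx

def degree_polynomial(G):
    """Return Deg(G, x) as a numpy poly1d, where the coefficient of x^i
    is the number of vertices of degree i in G."""
    degrees = [d for _, d in G.degree()]
    counts = np.bincount(degrees)          # counts[i] = number of vertices of degree i
    return np.poly1d(counts[::-1])         # poly1d expects highest power first

P = degree_polynomial(nx.petersen_graph())
print(P)        # 10 x^3  -- a k-regular graph of order n gives n*x^k
print(P.roots)  # [0. 0. 0.] -- a single (integral) root at zero
```

For regular graphs the polynomial collapses to a single monomial, which is one family where all roots are trivially integral.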

Keywords: degree polynomial, regular graph, minimum and maximum degree, graph operations

Procedia PDF Downloads 229
5783 Real-Time Radiological Monitoring of the Atmosphere Using an Autonomous Aerosol Sampler

Authors: Miroslav Hyza, Petr Rulik, Vojtech Bednar, Jan Sury

Abstract:

An early and reliable detection of an increased radioactivity level in the atmosphere is one of the key aspects of atmospheric radiological monitoring. Although the standard laboratory procedures provide detection limits as low as a few µBq/m³, their major drawback is the delayed reporting of results: typically a few days. This issue is the main objective of the HAMRAD project, which gave rise to a prototype of an autonomous monitoring device. It is based on the idea of sequential aerosol sampling using a carrousel sample changer combined with a gamma-ray spectrometer. In our hardware configuration, the air is drawn through a filter positioned on the carrousel so that it can be rotated into the measuring position after a preset sampling interval. Filter analysis is performed via an HPGe detector with 50% relative efficiency inside 8.5 cm lead shielding. The spectrometer output signal is then analyzed using DSP electronics and Gamwin software with preset nuclide libraries and other analysis parameters. After the counting, the filter is placed into a storage bin with a capacity of 250 filters so that the device can run autonomously for several months, depending on the preset sampling frequency. The device is connected to a central server via GPRS/GSM, where the user can view monitoring data, including raw spectra and technological data describing the state of the device. All operating parameters can be remotely adjusted through a simple GUI. The flow rate is continuously adjustable up to 10 m³/h. The main challenge in spectrum analysis is the natural background subtraction. As detection limits are heavily influenced by the deposited activity of radon decay products and the measurement time is fixed, there must exist an optimal sample decay time (delayed spectrum acquisition). To solve this problem, we adopted a simple procedure based on sequential spectrum acquisition and an optimal partial spectral sum with respect to the detection limits for a particular radionuclide. The prototyped device proved able to detect atmospheric contamination at the level of mBq/m³ for an 8 h sampling period.

Keywords: aerosols, atmosphere, atmospheric radioactivity monitoring, autonomous sampler

Procedia PDF Downloads 134
5782 Comparison of the Effectiveness of Tree Algorithms in Classification of Spongy Tissue Texture

Authors: Roza Dzierzak, Waldemar Wojcik, Piotr Kacejko

Abstract:

Analysis of the texture of medical images consists of determining the parameters and characteristics of the examined tissue. The main goal is to assign the analyzed area to one of two basic groups: healthy tissue or tissue with pathological changes. CT images of the thoracic and lumbar spine from 15 healthy patients and 15 patients with confirmed osteoporosis were used for the analysis. As a result, 120 samples with dimensions of 50×50 pixels were obtained. The set of features was derived from the histogram, gradient, run-length matrix, co-occurrence matrix, autoregressive model, and Haar wavelet. As a result of the image analysis, 290 textural feature descriptors were obtained. The dimension of the feature space was reduced by the use of three selection methods: the Fisher coefficient (FC), mutual information (MI), and the minimization of classification error probability combined with average correlation coefficients between the chosen features (POE + ACC). Each of them returned the ten features occupying the first places in the ranking devised according to its own coefficient. The Fisher coefficient and mutual information selections returned the same features arranged in a different order. In both rankings, the 50% percentile (Perc.50%) was found in the first place. The next selected features come from the co-occurrence matrix. The sets of features selected in the selection process were evaluated using six classification tree methods: decision stump (DS), Hoeffding tree (HT), logistic model trees (LMT), random forest (RF), random tree (RT), and reduced error pruning tree (REPT). In order to assess the accuracy of the classifiers, the following parameters were used: overall classification accuracy (ACC), true positive rate (TPR, classification sensitivity), true negative rate (TNR, classification specificity), positive predictive value (PPV), and negative predictive value (NPV). Taking into account the classification results, the best results were obtained for the Hoeffding tree and logistic model trees classifiers using the set of features selected by the POE + ACC method. In the case of the Hoeffding tree classifier, the highest values of three parameters were obtained: ACC = 90%, TPR = 93.3%, and PPV = 93.3%. Additionally, the values of the other two parameters, TNR = 86.7% and NPV = 86.6%, were close to the maximum values obtained for the LMT classifier. In the case of the logistic model trees classifier, the same ACC value was obtained (ACC = 90%) together with the highest values of TNR = 88.3% and NPV = 88.3%. The values of the other two parameters remained close to the highest: TPR = 91.7% and PPV = 91.6%. The results obtained in the experiment show that the use of classification trees is an effective method for the classification of texture features. This allows the condition of the spongy tissue to be identified for healthy cases and for those with osteoporosis.
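
As an illustration of the evaluation metrics listed above (ACC, TPR, TNR, PPV, NPV), the sketch below computes them from a confusion matrix for a generic tree classifier; the random data and scikit-learn's RandomForestClassifier are stand-ins, not the Weka-style classifiers or the actual spine texture features used in the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 10))            # placeholder for 10 selected texture features
y = rng.integers(0, 2, size=120)          # 0 = healthy, 1 = osteoporotic (synthetic labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
acc = (tp + tn) / (tp + tn + fp + fn)     # overall classification accuracy
tpr = tp / (tp + fn)                      # sensitivity
tnr = tn / (tn + fp)                      # specificity
ppv = tp / (tp + fp)                      # positive predictive value
npv = tn / (tn + fn)                      # negative predictive value
print(f"ACC={acc:.2f} TPR={tpr:.2f} TNR={tnr:.2f} PPV={ppv:.2f} NPV={npv:.2f}")
```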

Keywords: classification, feature selection, texture analysis, tree algorithms

Procedia PDF Downloads 158
5781 Stability Bound of Ruin Probability in a Reduced Two-Dimensional Risk Model

Authors: Zina Benouaret, Djamil Aissani

Abstract:

In this work, we introduce the qualitative and quantitative concepts of the strong stability method in a risk process modeling either two lines of business of the same insurance company or an insurance and a reinsurance company that divide both claims and premiums between them in a certain proportion. The proposed approach is based on identifying the ruin probability associated with the considered model with the stationary distribution of a Markov random process called the reversed process. Our objective, after clarifying the stability condition and the perturbation domain of the parameters, is to obtain a stability inequality for the ruin probability, which is applied to estimate the error made when a model with perturbed parameters is approximated by the considered model. In the stability bound obtained, all constants are written explicitly.
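
For orientation, strong-stability results of this kind typically take the following generic form (an illustrative template only; the weighted norm, the transition operators of the reversed processes, and the constants are placeholders rather than the quantities derived in the paper):

$$\left|\psi(u) - \tilde{\psi}(u)\right| \;\le\; C\,\|P - \tilde{P}\|_{v}, \qquad \text{provided } \|P - \tilde{P}\|_{v} \le \varepsilon_0,$$

where ψ and ψ̃ are the ruin probabilities of the nominal and perturbed models, P and P̃ the corresponding transition operators of the reversed Markov processes, ‖·‖_v a weighted operator norm, and C, ε₀ explicit constants.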

Keywords: Markov chain, risk models, ruin probabilities, strong stability analysis

Procedia PDF Downloads 237
5780 A Next Generation Multi-Scale Modeling Theatre for in silico Oncology

Authors: Safee Chaudhary, Mahnoor Naseer Gondal, Hira Anees Awan, Abdul Rehman, Ammar Arif, Risham Hussain, Huma Khawar, Zainab Arshad, Muhammad Faizyab Ali Chaudhary, Waleed Ahmed, Muhammad Umer Sultan, Bibi Amina, Salaar Khan, Muhammad Moaz Ahmad, Osama Shiraz Shah, Hadia Hameed, Muhammad Farooq Ahmad Butt, Muhammad Ahmad, Sameer Ahmed, Fayyaz Ahmed, Omer Ishaq, Waqar Nabi, Wim Vanderbauwhede, Bilal Wajid, Huma Shehwana, Muhammad Tariq, Amir Faisal

Abstract:

Cancer is a manifestation of multifactorial deregulations in biomolecular pathways. These deregulations arise from the complex multi-scale interplay between cellular and extracellular factors. Such multifactorial aberrations at the gene, protein, and extracellular scales need to be investigated systematically towards decoding the underlying mechanisms and orchestrating therapeutic interventions for patient treatment. In this work, we propose ‘TISON’, a next-generation web-based multiscale modeling platform for clinical systems oncology. TISON’s unique modeling abstraction allows a seamless coupling of information from biomolecular networks, cell decision circuits, extracellular environments, and tissue geometries. The platform can undertake multiscale sensitivity analysis towards in silico biomarker identification and drug evaluation on cellular phenotypes in user-defined tissue geometries. Furthermore, integration of cancer expression databases such as The Cancer Genome Atlas (TCGA) and the Human Protein Atlas (HPA) facilitates the development of personalized therapeutics. TISON is the next evolution of multiscale cancer modeling and simulation platforms and provides a ‘zero-code’ model development, simulation, and analysis environment for application in clinical settings.

Keywords: systems oncology, cancer systems biology, cancer therapeutics, personalized therapeutics, cancer modelling

Procedia PDF Downloads 202
5779 The Estimation Method of Stress Distribution for Beam Structures Using the Terrestrial Laser Scanning

Authors: Sang Wook Park, Jun Su Park, Byung Kwan Oh, Yousok Kim, Hyo Seon Park

Abstract:

This study suggests a method for estimating the stress distribution in beam structures based on TLS (terrestrial laser scanning). The main components of the method are the creation of lattices of averaged raw TLS data that satisfy a suitable condition, and the application of CSSI (cubic smoothing spline interpolation) for estimating the stress distribution. Estimation of the stress distribution of a structural member or of the whole structure is one of the important factors in the safety evaluation of a structure. Existing sensors, which include the ESG (electric strain gauge) and LVDT (linear variable differential transformer), can be categorized as contact-type sensors that must be installed on the structural members, and they have various limitations, such as the need for separate space where the network cables are installed and the difficulty of access for sensor installation in real buildings. To overcome these problems inherent in contact-type sensors, the TLS type of LiDAR (light detection and ranging), which can measure the displacement of a target over a long range without the influence of the surrounding environment and can also capture the whole shape of the structure, has been applied to the field of structural health monitoring. An important characteristic of TLS measurement is the formation of point clouds, which contain many points with local coordinates. Point clouds are not linearly distributed but scattered; thus, interpolation is essential for their analysis. Through the formation of averaged lattices and CSSI applied to the raw data, a method that can estimate the displacement of a simple beam was developed. The developed method can also be extended to calculate the strain and is finally applicable to estimating the stress distribution of a structural member. To verify the validity of the method, a loading test on a simple beam was conducted and measured by TLS. Through a comparison of the estimated stress and the reference stress, the validity of the method is confirmed.
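
A minimal sketch of the displacement-to-stress step described above, assuming Euler–Bernoulli beam behaviour (σ = E·c·w''(x), with E the elastic modulus and c the distance from the neutral axis); the deflection samples, smoothing factor, and section properties are placeholders, and scipy's UnivariateSpline stands in for the paper's CSSI on averaged lattices.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Placeholder TLS-derived deflection samples along a simply supported beam (m)
x = np.linspace(0.0, 6.0, 60)                          # positions along an assumed 6 m span
w = 1e-3 * np.sin(np.pi * x / 6.0)                     # synthetic deflection shape
w += np.random.default_rng(1).normal(0, 2e-5, x.size)  # scanner noise

# Cubic smoothing spline through the noisy deflection (k=3 -> cubic)
spline = UnivariateSpline(x, w, k=3, s=len(x) * (2e-5) ** 2)

E = 200e9          # assumed elastic modulus (Pa), e.g. steel
c = 0.15           # assumed distance from neutral axis to extreme fibre (m)

curvature = spline.derivative(n=2)(x)                  # w''(x)
sigma = E * c * curvature                              # bending stress distribution (Pa)
print(f"max bending stress ~ {abs(sigma).max() / 1e6:.1f} MPa")
```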

Keywords: structural health monitoring, terrestrial laser scanning, estimation of stress distribution, coordinate transformation, cubic smoothing spline interpolation

Procedia PDF Downloads 422
5778 Molecular Dynamics Simulation of Realistic Biochar Models with Controlled Microporosity

Authors: Audrey Ngambia, Ondrej Masek, Valentina Erastova

Abstract:

Biochar is an amorphous, carbon-rich material generated from the pyrolysis of biomass, with multifarious properties and functionality. Biochar has proven applications in the treatment of flue gas and of organic and inorganic pollutants in soil and water/wastewater as a result of its multiple surface functional groups and porous structures. These properties have also shown potential in energy storage and carbon capture. The availability of diverse sources of biomass to produce biochar has increased interest in it as a sustainable and environmentally friendly material. The properties and porous structures of biochar vary depending on the type of biomass and the high heat treatment temperature (HHT). Biochars produced at HHT between 400 °C and 800 °C generally have lower H/C and O/C ratios and, with increasing temperature, higher porosities, larger pore sizes, and higher surface areas. While all this is known experimentally, there is little knowledge of the role that the porous structure and functional groups play in processes occurring at the atomistic scale, which are extremely important for the optimization of biochar for applications, especially in the adsorption of gases. Atomistic simulation methods have shown the potential to generate such amorphous materials; however, most of the models available are composed only of carbon atoms or graphitic sheets, which are very dense or contain simple slit pores, and all of them ignore the important role of heteroatoms such as O, N and S and of pore morphologies. Hence, developing realistic models that integrate these parameters is important to understand their role in governing adsorption mechanisms, which will aid in guiding the design and optimization of biochar materials for target applications. In this work, molecular dynamics simulations in the isobaric ensemble are used to generate realistic biochar models taking into account experimentally determined H/C, O/C and N/C ratios, aromaticity, micropore size range, micropore volumes and true densities of biochars. A pore generation approach was developed using virtual atoms, i.e. Lennard-Jones spheres of varying van der Waals radius and softness. Their interaction with the biochar matrix via a soft-core potential allows the creation of pores with rough surfaces, while varying the van der Waals radius parameters gives control over the pore-size distribution. We focused on microporosity, creating average pore sizes of 0.5–2 nm in diameter and pore volumes in the range of 0.05–1 cm³/g, which correspond to the experimental gas-adsorption micropore sizes of amorphous porous biochars. Realistic biochar models with surface functionalities, micropore size distributions and pore morphologies were developed, and they could aid in the study of adsorption processes in confined micropores.
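
As a purely illustrative example of how a "soft" virtual-atom interaction can be written (this specific functional form is an assumption for illustration, not the potential parameterized in the study), a Lennard-Jones interaction can be softened by shifting its radial dependence:

$$V_{\text{soft}}(r) \;=\; 4\varepsilon\left[\left(\frac{\sigma^{2}}{r^{2}+\delta^{2}}\right)^{6} - \left(\frac{\sigma^{2}}{r^{2}+\delta^{2}}\right)^{3}\right],$$

where σ sets the effective van der Waals radius of the virtual atom and δ controls the softness: δ = 0 recovers the standard Lennard-Jones potential, while larger δ removes the singularity at r = 0 and lets the sphere gently push the biochar matrix outwards to carve a pore.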

Keywords: biochar, heteroatoms, micropore size, molecular dynamics simulations, surface functional groups, virtual atoms

Procedia PDF Downloads 54
5777 An Engineer-Oriented Life Cycle Assessment Tool for Building Carbon Footprint: The Building Carbon Footprint Evaluation System in Taiwan

Authors: Hsien-Te Lin

Abstract:

The purpose of this paper is to introduce the BCFES (building carbon footprint evaluation system), an LCA (life cycle assessment) tool developed by the Low Carbon Building Alliance (LCBA) in Taiwan. A qualified BCFES for the building industry should fulfill the function of evaluating the carbon footprint throughout all stages of the life cycle of building projects, including the production, transportation and manufacturing of materials, construction, daily energy usage, renovation and demolition. However, many existing BCFESs are too complicated and not very designer-friendly, creating obstacles to the implementation of carbon reduction policies. One of the greatest obstacles is the misapplication of the carbon footprint inventory standards PAS 2050 and ISO 14067, which are designed for mass-produced goods rather than building projects. When these product-oriented rules are applied to building projects, one must compute a tremendous amount of data for raw materials and for the transportation of construction equipment throughout the construction period, based on purchasing lists and construction logs. This verification method is cumbersome by nature and unhelpful to the promotion of low carbon design. With a view to providing an engineer-oriented BCFES with pre-diagnosis functions, a component input/output (I/O) database system and a scenario simulation method for building energy are proposed herein. Most existing BCFESs base their calculations on a product-oriented carbon database for raw materials such as cement, steel, glass, and wood. However, data on raw materials are meaningless for the purpose of encouraging carbon reduction design without a feedback mechanism, because an engineering project is not designed based on raw materials but rather on building components, such as flooring, walls, roofs, ceilings, roads or cabinets. The LCBA database has been compiled from existing carbon footprint databases for raw materials and from architectural graphic standards. Project designers can now use the LCBA database to conduct low carbon design in a much simpler and more efficient way. Daily energy usage throughout a building's life cycle, including air conditioning, lighting, and electrical equipment, is very difficult for the building designer to predict. A good BCFES should provide a simplified and designer-friendly method to overcome this obstacle in predicting energy consumption. In this paper, the author has developed a simplified tool, the dynamic energy use intensity (EUI) method, to accurately predict energy usage with simple multiplications and additions using EUI data and the designed efficiency levels of the building envelope, air conditioning, lighting and electrical equipment. Remarkably simple to use, it can help designers pre-diagnose hotspots in the building carbon footprint and further enhance low carbon designs. The BCFES-LCBA offers the advantages of an engineer-friendly component I/O database, simplified energy prediction methods, pre-diagnosis of carbon hotspots and sensitivity to good low carbon designs, making it an increasingly popular carbon management tool in Taiwan. To date, about thirty projects have been awarded BCFES-LCBA certification, and the assessment has become mandatory in some cities.
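
A minimal sketch of the kind of "multiplications and additions" the dynamic EUI method implies (the EUI values, efficiency factors, emission factor and service life below are hypothetical placeholders, not figures from the BCFES database):

```python
# Hypothetical baseline energy use intensities (kWh per m^2 per year) by end use
baseline_eui = {"air_conditioning": 60.0, "lighting": 25.0, "equipment": 30.0}

# Hypothetical design efficiency levels relative to the baseline (1.0 = baseline)
efficiency = {"air_conditioning": 0.85, "lighting": 0.70, "equipment": 0.95}

floor_area_m2 = 12_000          # assumed gross floor area
service_life_years = 50         # assumed building service life
emission_factor = 0.5           # assumed kg CO2e per kWh of electricity

annual_energy = floor_area_m2 * sum(
    baseline_eui[k] * efficiency[k] for k in baseline_eui
)                                                  # kWh per year
use_phase_carbon = annual_energy * service_life_years * emission_factor / 1000.0  # t CO2e

print(f"annual energy ~ {annual_energy:,.0f} kWh/yr")
print(f"use-phase carbon over {service_life_years} yr ~ {use_phase_carbon:,.0f} t CO2e")
```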

Keywords: building carbon footprint, life cycle assessment, energy use intensity, building energy

Procedia PDF Downloads 128
5776 Back to Basics: Where is Allah? A Survey of Generation Z Youth at the Canadian University of Dubai

Authors: Said Baadel

Abstract:

Belief in a heavenly God is enshrined in all the Abrahamic religions, which form the three major religions of the world today. Muslims believe in Allah, who is above the seven heavens. Youth in the United Arab Emirates (UAE) study Islamic courses as part of their high school curriculum and are required to take at least one Islamic course at the university level to gain credit hours towards their general education (GENED) requirements. This paper provides an insight into what the youth studying in the UAE think about where Allah is. Our analysis reveals that a large number of Muslim youth were not sure, especially those from Middle Eastern and Arab countries, leading to the conclusion that this subject needs to be revisited in the coursework.

Keywords: Allah, Islam, Tawheed, religion

Procedia PDF Downloads 218
5775 Identification of Rare Mutations in Genes Involved in Monogenic Forms of Obesity and Diabetes in Obese Guadeloupean Children through Next-Generation Sequencing

Authors: Lydia Foucan, Laurent Larifla, Emmanuelle Durand, Christine Rambhojan, Veronique Dhennin, Jean-Marc Lacorte, Philippe Froguel, Amelie Bonnefond

Abstract:

In the population of Guadeloupe Island (472,124 inhabitants and 80% of subjects of African descent), overweight and obesity were estimated at 23% and 9% respectively among children. High prevalence of diabetes has been reported (~10%) in the adult population. Nevertheless, no study has investigated the contribution of gene mutations to childhood obesity in this population. We aimed to investigate rare genetic mutations in genes involved in monogenic obesity or diabetes in obese Afro-Caribbean children from Guadeloupe Island using next-generation sequencing. The present investigation included unrelated obese children, from a previous study on overweight conducted in Guadeloupe Island in 2013. We sequenced coding regions of 59 genes involved in monogenic obesity or diabetes. A total of 25 obese schoolchildren (with Z-score of body mass index [BMI]: 2.0 to 2.8) were screened for rare mutations (non-synonymous, splice-site, or insertion/deletion) in 59 genes. Mean age of the study population was 12.4 ± 1.1 years. Seventeen children (68%) had insulin-resistance (HOMA-IR > 3.16). A family history of obesity (mother or father) was observed in eight children and three of the accompanying parent presented with type 2 diabetes. None of the children had gonadotrophic abnormality or mental retardation. We detected five rare heterozygous mutations, in four genes involved in monogenic obesity, in five different obese children: MC4R p.Ile301Thr and SIM1 p.Val326Thrfs*43 mutations which were pathogenic; SIM1 p.Ser343Pro and SH2B1 p.Pro90His mutations which were likely pathogenic; and NTRK2 p.Leu140Phe that was of uncertain significance. In parallel, we identified seven carriers of mutation in ABCC8 or KCNJ11 (involved in monogenic diabetes), which were of uncertain significance (KCNJ11 p.Val13Met, KCNJ11 p.Val151Met, ABCC8 p.Lys1521Asn and ABCC8 p.Ala625Val). Rare pathogenic or likely pathogenic mutations, linked to severe obesity were detected in more than 15% of this Afro-Caribbean population at high risk of obesity and type 2 diabetes.
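
For reference, the insulin-resistance threshold quoted above (HOMA-IR > 3.16) refers to the standard homeostatic model assessment index, conventionally computed as:

$$\text{HOMA-IR} \;=\; \frac{\text{fasting insulin } [\mu\text{U/mL}] \times \text{fasting glucose } [\text{mmol/L}]}{22.5}.$$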

Keywords: childhood obesity, MC4R, monogenic obesity, SIM1

Procedia PDF Downloads 176
5774 The French Ekang Ethnographic Dictionary. The Quantum Approach

Authors: Henda Gnakate Biba, Ndassa Mouafon Issa

Abstract:

Dictionaries modeled on the Western pattern [for languages with a tonic accent] are not suitable for tonal languages and do not account for them phonologically, which is why this [prosodic and phonological] ethnographic dictionary was designed. It is a glossary that expresses the tones and the rhythm of words. It recreates exactly the speaking or singing of a tonal language and allows a non-speaker of this language to pronounce the words as if they were a native. It is a dictionary adapted to tonal languages. It was built from ethnomusicological theorems and phonological processes, following Jean-Jacques Rousseau's 1776 hypothesis that "to say and to sing were once the same thing". Each word in the French dictionary finds its corresponding word in the Ekang language (ekaη), and each ekaη word is written on a musical staff. This ethnographic dictionary is an inventive, original and innovative research thesis. It is a contribution to the theoretical, musicological, ethnomusicological and linguistic conceptualization of languages, giving rise to interlocution between the social and cognitive sciences, the activities of artistic creation, and the question of modeling in the human sciences: mathematics, computer science, translation automation and artificial intelligence. When this theory is applied to any text of a folk song in a tonal language, one pieces together not only the exact melody, rhythm and harmonies of that song, as if one knew it in advance, but also the exact speech of that language. The author believes that the issue of the disappearance of tonal languages and their preservation has been structurally resolved, as has one of the greatest cultural equations related to the composition and creation of tonal, polytonal and random music. The experimentation confirming the theory led to a semi-digital, semi-analog application which translates the tonal languages of Africa (about 2,100 languages) into blues, jazz, world music, polyphonic music, tonal and atonal music, and deterministic and random music. To test this application, I use music reading and writing software that allows me to collect the data extracted from my mother tongue, which is already modeled in the musical staves saved in the ethnographic (semiotic) dictionary for automatic translation (volume 2 of the book). Translation is done from writing to writing, from writing to speech and from writing to music. Mode of operation: you type a text on your computer, a structured song (chorus–verse), and you ask the machine for a melody in blues, jazz, world music, variety, etc. The software runs, giving you the option to choose harmonies, and then you select your melody.

Keywords: music, language, entanglement, science, research

Procedia PDF Downloads 52
5773 Secure E-Pay System Using Steganography and Visual Cryptography

Authors: K. Suganya Devi, P. Srinivasan, M. P. Vaishnave, G. Arutperumjothi

Abstract:

Today’s internet world is highly prone to various online attacks, of which the most harmful is phishing. Attackers host fake websites that are very similar to, and look like, the genuine ones. We propose an image-based authentication scheme using steganography and visual cryptography to prevent phishing. This paper presents a secure steganographic technique for true color (RGB) images and uses the discrete cosine transform to compress the images. The proposed method hides the secret data inside the cover image. Visual cryptography is used to preserve the privacy of an image by decomposing the original image into two shares. The original image can be identified only when both qualified shares are simultaneously available; an individual share does not reveal the identity of the original image. Thus, the existence of the secret message is hard to detect by RS steganalysis.
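
A minimal sketch of the two-share idea for a binary image, using a simple XOR-based secret-sharing variant (a stand-in for the pixel-expansion visual cryptography scheme implied above; the DCT-based steganographic embedding is not shown):

```python
import numpy as np

rng = np.random.default_rng(42)

# Placeholder binary secret image (e.g. a thresholded logo), 0/1 valued
secret = rng.integers(0, 2, size=(64, 64), dtype=np.uint8)

# Share 1 is pure random noise; share 2 is chosen so that XOR recovers the secret
share1 = rng.integers(0, 2, size=secret.shape, dtype=np.uint8)
share2 = share1 ^ secret

# Either share alone is statistically indistinguishable from random noise...
assert 0.4 < share2.mean() < 0.6
# ...but combining (XOR-ing) both shares reconstructs the secret exactly
assert np.array_equal(share1 ^ share2, secret)
```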

Keywords: image security, random LSB, steganography, visual cryptography

Procedia PDF Downloads 315
5772 Adaptive Conjoint Analysis of Professionals’ Job Preferences

Authors: N. Scheidegger, A. Mueller

Abstract:

Job preferences are a well-developed research field. Many studies analyze preferences using simple ratings obtained from samples of university graduates. The current study analyzes job preferences with a mixed-methods approach consisting of a qualitative preliminary study and an adaptive conjoint analysis. The preconditions for accepting job offers are clarified for professionals in the industrial sector. It was shown, for example, that above-average wages are critical and that career opportunities must be understood more broadly than a mere focus on formal personnel development programs. The results suggest that, to be effective in their recruitment efforts, employers must take into account the key desirable job attributes of their target group.

Keywords: conjoint analysis, employer attractiveness, job preferences, personnel marketing

Procedia PDF Downloads 186
5771 The Lexical Eidos as an Invariant of a Polysemantic Word

Authors: S. Pesina, T. Solonchak

Abstract:

Phenomenological analysis is based not on natural language but on an ideal language capable of carrying ideal meanings – eidoses representing typical structures or essences. For this purpose, it is necessary to detach from the spatio-temporal definiteness of a subject and then state its noetic essence (eidos) by means of free fantasy generation. In this way, an entirely new objectness is created – the universal – confirming the thesis that the thinking process takes place in generalizations, passing by numerous means through the specific to the general and from the general through the specific to the singular.

Keywords: lexical eidos, phenomenology, noema, polysemantic word, semantic core

Procedia PDF Downloads 262
5770 Early Impact Prediction and Key Factors Study of Artificial Intelligence Patents: A Method Based on LightGBM and Interpretable Machine Learning

Authors: Xingyu Gao, Qiang Wu

Abstract:

Patents play a crucial role in protecting innovation and intellectual property. Early prediction of the impact of artificial intelligence (AI) patents helps researchers and companies allocate resources and make better decisions. Understanding the key factors that influence patent impact can assist researchers in gaining a better understanding of the evolution of AI technology and innovation trends. Therefore, identifying highly impactful patents early and providing support for them holds immeasurable value in accelerating technological progress, reducing research and development costs, and mitigating market positioning risks. Despite the extensive research on AI patents, accurately predicting their early impact remains a challenge. Traditional methods often consider only single factors or simple combinations, failing to comprehensively and accurately reflect the actual impact of patents. This paper utilized the artificial intelligence patent database from the United States Patent and Trademark Office and the Len.org patent retrieval platform to obtain specific information on 35,708 AI patents. Using six machine learning models, namely Multiple Linear Regression, Random Forest Regression, XGBoost Regression, LightGBM Regression, Support Vector Machine Regression, and K-Nearest Neighbors Regression, and using early indicators of patents as features, the paper comprehensively predicted the impact of patents from three aspects: technical, social, and economic. These aspects include the technical leadership of patents, the number of citations they receive, and their shared value. The SHAP (Shapley Additive exPlanations) metric was used to explain the predictions of the best model, quantifying the contribution of each feature to the model's predictions. The experimental results on the AI patent dataset indicate that, for all three target variables, LightGBM regression shows the best predictive performance. Specifically, patent novelty has the greatest impact on predicting the technical impact of patents and has a positive effect. Additionally, the number of owners, the number of backward citations, and the number of independent claims are all crucial and have a positive influence on predicting technical impact. In predicting the social impact of patents, the number of applicants is considered the most critical input variable, but it has a negative impact on social impact. At the same time, the number of independent claims, the number of owners, and the number of backward citations are also important predictive factors, and they have a positive effect on social impact. For predicting the economic impact of patents, the number of independent claims is considered the most important factor and has a positive impact on economic impact. The number of owners, the number of sibling countries or regions, and the size of the extended patent family also have a positive influence on economic impact. The study primarily relies on data from the United States Patent and Trademark Office for artificial intelligence patents. Future research could consider more comprehensive data sources, including artificial intelligence patent data, from a global perspective. While the study takes into account various factors, there may still be other important features not considered. In the future, factors such as patent implementation and market applications may be considered as they could have an impact on the influence of patents.
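
A minimal sketch of the modelling pipeline described above, using the lightgbm and shap packages; the synthetic feature matrix and the feature names (novelty, number of owners, backward citations, independent claims, number of applicants) are placeholders standing in for the paper's early patent indicators, and the synthetic target stands in for one of the three impact measures.

```python
import numpy as np
import pandas as pd
import lightgbm as lgb
import shap
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000  # synthetic stand-in for the 35,708-patent dataset

X = pd.DataFrame({
    "novelty": rng.uniform(0, 1, n),
    "n_owners": rng.integers(1, 6, n),
    "backward_citations": rng.poisson(8, n),
    "independent_claims": rng.integers(1, 10, n),
    "n_applicants": rng.integers(1, 5, n),
})
# Synthetic impact target (e.g. forward citations) loosely driven by the features
y = (5 * X["novelty"] + 0.5 * X["backward_citations"]
     + 0.8 * X["independent_claims"] + rng.normal(0, 1, n))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = lgb.LGBMRegressor(n_estimators=400, learning_rate=0.05, random_state=0)
model.fit(X_tr, y_tr)
print("R^2 on held-out patents:", round(model.score(X_te, y_te), 3))

# SHAP values quantify each feature's contribution to each prediction
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_te)
mean_abs = np.abs(shap_values).mean(axis=0)
for name, val in sorted(zip(X.columns, mean_abs), key=lambda t: -t[1]):
    print(f"{name:20s} mean |SHAP| = {val:.3f}")
```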

Keywords: patent influence, interpretable machine learning, predictive models, SHAP

Procedia PDF Downloads 29
5769 Simulation of Channel Models for Device-to-Device Application of 5G Urban Microcell Scenario

Authors: H. Zormati, J. Chebil, J. Bel Hadj Tahar

Abstract:

Next-generation wireless transmission technology (5G) is expected to rely on higher frequency bands, so the development and clarification of channel models for these bands is one of the most important issues in radio propagation research for 5G. For this purpose, multiple urban microcellular measurements have been carried out at 60 GHz. In this paper, the collected data are uniformly analyzed with a focus on the path loss (PL); the objective is to compare the simulation results of several studied channel models in order to test the performance of each one.
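
For context, one widely used candidate in such comparisons at millimeter-wave frequencies is the close-in (CI) free-space reference distance model (shown here as a generic illustration; the abstract does not specify which channel models were compared):

$$\mathrm{PL}^{\mathrm{CI}}(f,d)\,[\mathrm{dB}] \;=\; \mathrm{FSPL}(f,1\,\mathrm{m}) + 10\,n\,\log_{10}\!\left(\frac{d}{1\,\mathrm{m}}\right) + X_{\sigma}, \qquad \mathrm{FSPL}(f,1\,\mathrm{m}) = 20\log_{10}\!\left(\frac{4\pi f \cdot 1\,\mathrm{m}}{c}\right),$$

where n is the path loss exponent fitted to the measurements and X_σ is a zero-mean Gaussian shadowing term in dB.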

Keywords: 5G, channel model, 60GHz channel, millimeter-wave, urban microcell

Procedia PDF Downloads 296
5768 Thulium Laser Design and Experimental Verification for NIR and MIR Nonlinear Applications in Specialty Optical Fibers

Authors: Matej Komanec, Tomas Nemecek, Dmytro Suslov, Petr Chvojka, Stanislav Zvanovec

Abstract:

Nonlinear phenomena in the near- and mid-infrared region are attracting scientific attention mainly due to the possibilities of supercontinuum generation and its subsequent use in ultra-wideband applications such as absorption spectroscopy or optical coherence tomography. Thulium-based fiber lasers provide access to high-power ultrashort pump pulses in the vicinity of 2000 nm, which can be readily exploited for various nonlinear applications. The paper presents a simulation and experimental study of a pulsed thulium laser intended for near-infrared (NIR) and mid-infrared (MIR) nonlinear applications in specialty optical fibers. The first part of the paper discusses the thulium laser, which is based on a gain-switched seed laser and a series of amplification stages for obtaining output peak powers on the order of kilowatts for pulses shorter than 200 ps at full width at half maximum. The pulsed thulium laser is first studied in simulation software, focusing on the seed-laser properties. Afterward, a thulium-based pre-amplification stage is discussed, with a focus on low-noise signal amplification, high signal gain, and the elimination of pulse distortions during pulse propagation in the gain medium. Following the pre-amplification stage, a second gain stage incorporating a shorter thulium fiber with an increased rare-earth dopant ratio is evaluated. Last, a power-booster stage is analyzed, where peak powers of kilowatts should be achieved. The results of the analytical study are then validated by the experimental campaign, and the simulation model is corrected based on real components – parameters such as real insertion losses, cross-talk, polarization dependencies, etc. are included. The second part of the paper evaluates the utilization of nonlinear phenomena and their specific features in the vicinity of 2000 nm, compared to e.g. 1550 nm, and presents supercontinuum modelling based on the pulsed output of the thulium laser. The supercontinuum generation simulation provides reasonably accurate results once the fiber dispersion profile is precisely defined and the fiber nonlinearity is known; furthermore, the input pulse shape and peak power must be known, which is assured thanks to the experimental measurement of the studied pulsed thulium laser. The supercontinuum simulation model is put in relation to the designed and characterized specialty optical fibers, which are discussed in the third part of the paper. The focus is placed on silica and mainly on non-silica fibers (fluoride, chalcogenide, lead-silicate) in their conventional, microstructured or tapered variants. Parameters such as the dispersion profile and nonlinearity of the exploited fibers were characterized either with an accurate model developed in COMSOL software or by direct experimental measurement to achieve even higher precision. The paper then combines all three studied topics and presents a possible application of such a pulsed thulium laser system working with specialty optical fibers.
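
For reference, supercontinuum modelling of the kind described above is commonly based on the generalized nonlinear Schrödinger equation for the pulse envelope A(z, T) (a standard formulation given here for context; the abstract does not state which propagation model was implemented):

$$\frac{\partial A}{\partial z} = -\frac{\alpha}{2}A + \sum_{k\ge 2}\frac{i^{k+1}}{k!}\,\beta_k\,\frac{\partial^k A}{\partial T^k} + i\gamma\left(1+\frac{i}{\omega_0}\frac{\partial}{\partial T}\right)\!\left(A(z,T)\int_0^{\infty} R(T')\,\bigl|A(z,T-T')\bigr|^2\,dT'\right),$$

where α is the fiber loss, β_k the dispersion coefficients, γ the nonlinear coefficient, ω₀ the carrier frequency, and R(T) the Raman-inclusive nonlinear response; it is usually solved with a split-step Fourier method.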

Keywords: nonlinear phenomena, specialty optical fibers, supercontinuum generation, thulium laser

Procedia PDF Downloads 301
5767 A Study of Anthropometric Correlation between Upper and Lower Limb Dimensions in Sudanese Population

Authors: Altayeb Abdalla Ahmed

Abstract:

The skeletal phenotype is a product of a balanced interaction between genetics and environmental factors throughout different life stages; therefore, interlimb proportions vary between populations. Although interlimb proportion indices have been used in anthropology to assess the influence of various environmental factors on the limbs, an extensive literature review revealed a paucity of published research assessing the correlations between limb parts and the possibility of reconstruction. Hence, this study aims to assess the relationships between upper and lower limb parts and to develop regression formulae to reconstruct the parts from one another. The left upper arm length, ulnar length, wrist breadth, hand length, hand breadth, tibial length, bimalleolar breadth, foot length, and foot breadth of 376 right-handed subjects, comprising 187 males and 189 females (aged 25-35 years), were measured. Initially, the data were analyzed using basic univariate analysis and independent t-tests; then sex-specific simple and multiple linear regression models were used to estimate upper limb parts from lower limb parts and vice versa. The results of this study indicated significant sexual dimorphism for all variables and a significant correlation between the upper and lower limb parts (p < 0.01). Linear and multiple (stepwise) regression equations were developed to reconstruct limb parts when a single dimension or multiple dimensions of the other limb are available. Multiple stepwise regression equations produced better reconstructions than simple equations. These results are significant in forensics, as they can aid in the identification of multiple isolated limb parts, particularly during mass disasters and criminal dismemberment. Although DNA analysis is the most reliable tool for identification, its use has multiple limitations in developing countries, e.g., cost, facility availability, and the need for trained personnel. Furthermore, the results have important implications for plastic and orthopedic reconstructive surgery. This is the only reported study assessing the correlation and prediction capabilities between many of the upper and lower limb dimensions. The present study demonstrates a significant correlation between the interlimb parts in both sexes, which indicates the possibility of reconstruction using regression equations.
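
A minimal sketch of the estimation step, assuming an ordinary least-squares fit of one upper-limb dimension on one or more lower-limb dimensions; the synthetic measurements below are placeholders, not the Sudanese sample, and the fitted coefficients are purely illustrative.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
n = 187  # e.g. the male subsample size

# Placeholder measurements in cm (synthetic, roughly plausible ranges)
foot_length = rng.normal(26.0, 1.2, n)
tibial_length = rng.normal(38.0, 2.0, n)
hand_length = 0.55 * foot_length + 0.10 * tibial_length + rng.normal(0, 0.4, n)

# Simple regression: hand length from foot length alone
simple = LinearRegression().fit(foot_length.reshape(-1, 1), hand_length)

# Multiple regression: hand length from foot length and tibial length
X = np.column_stack([foot_length, tibial_length])
multiple = LinearRegression().fit(X, hand_length)

print("simple   R^2 =", round(simple.score(foot_length.reshape(-1, 1), hand_length), 3))
print("multiple R^2 =", round(multiple.score(X, hand_length), 3))
print("multiple equation: hand ≈ {:.2f} + {:.2f}*foot + {:.2f}*tibia".format(
    multiple.intercept_, *multiple.coef_))
```

Stepwise selection of predictors (not shown) would iteratively add or drop lower-limb dimensions according to a criterion such as p-values or AIC.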

Keywords: anthropometry, correlation, limb, Sudanese

Procedia PDF Downloads 282
5766 Relevance of Copyright and Trademark in the Gaming Industry

Authors: Deeksha Karunakar

Abstract:

The gaming industry is one of the biggest industries in the world. Video games are interactive works of authorship that require the execution of a computer programme on specialized hardware but that also incorporate a wide variety of other artistic media, such as music, scripts, stories, video, paintings, and characters, in which the player takes an active role. Therefore, video games are not made as singular, simple works but rather as collections of elements that, if they reach a certain level of originality and creativity, can each be copyrighted on their own. A video game is made up of a wide variety of parts, all of which combine to form the overall experience that we, the players, have while playing. The entirety of the components is implemented in the form of software code, which is then translated into the game's user interface. Even though copyright protection is already in place for software code, the work that is produced through that code can also be protected by copyright. This includes the game's storyline or narrative, its characters, and even elements of the code on their own. Every sector requires an appropriate legal framework, and the gaming industry is no exception; this underlines the importance of intellectual property laws in each sector. This paper explores the beginnings of video games, the various aspects of game copyrights, and the approach of the courts, including examples from a few different cases. Although the creative arts have always been known to draw inspiration from and build upon the works of others, it has not always been simple to evaluate whether a game has been cloned. The video game business is experiencing growth it has never seen before. The majority of today's video games are both pieces of software and works of audio-visual art. Even though the existing legal framework does not have a clause specifically addressing video games, it is clear that there are a great many alternative means by which this protection can be granted. This paper demonstrates the importance of copyright and trademark laws in the gaming industry and its regulation with the help of relevant case law, using a doctrinal methodology to support its findings. The aim of the paper is to raise awareness of the applicability of intellectual property laws in the gaming industry and of how the justice system is evolving to adapt to such new industries. Furthermore, it provides in-depth knowledge of their relationship with each other.

Keywords: copyright, DMCA, gaming industry, trademark, WIPO

Procedia PDF Downloads 53
5765 Cognitive Performance and Physiological Stress during an Expedition in Antarctica

Authors: Andrée-Anne Parent, Alain-Steve Comtois

Abstract:

The Antarctica environment can be a great challenge for human exploration. Explorers need to be focused on the task and require the physical abilities to succeed and survive in complete autonomy in this hostile environment. The aim of this study was to observe cognitive performance and physiological stress with a biomarker (cortisol) and hand grip strength during an expedition in Antarctica. A total of 6 explorers were in complete autonomous exploration on the Forbidden Plateau in Antarctica to reach unknown summits during a 30 day period. The Stroop Test, a simple reaction time, and mood scale (PANAS) tests were performed every week during the expedition. Saliva samples were taken before sailing to Antarctica, the first day on the continent, after the mission on the continent and on the boat return trip. Furthermore, hair samples were taken before and after the expedition. The results were analyzed with SPSS using ANOVA repeated measures. The Stroop and mood scale results are presented in the following order: 1) before sailing to Antarctica, 2) the first day on the continent, 3) after the mission on the continent and 4) on the boat return trip. No significant difference was observed with the Stroop (759±166 ms, 850±114 ms, 772±179 ms and 833±105 ms, respectively) and the PANAS (39.5 ±5.7, 40.5±5, 41.8±6.9, 37.3±5.8 positive emotions, and 17.5±2.3, 18.2±5, 18.3±8.6, 15.8±5.4 negative emotions, respectively) (p>0.05). However, there appears to be an improvement at the end of the second week. Furthermore, the simple reaction time was significantly lower at the end of the second week, a moment where important decisions were taken about the mission, vs the week before (416±39 ms vs 459.8±39 ms respectively; p=0.030). Furthermore, the saliva cortisol was not significantly different (p>0.05) possibly due to important variations and seemed to reach a peak on the first day on the continent. However, the cortisol from the hair pre and post expedition increased significantly (2.4±0.5 pg/mg pre-expedition and 16.7±9.2 pg/mg post-expedition, p=0.013) showing important stress during the expedition. Moreover, no significant difference was observed on the grip strength except between after the mission on the continent and after the boat return trip (91.5±21 kg vs 85±19 kg, p=0.20). In conclusion, the cognitive performance does not seem to be affected during the expedition. Furthermore, it seems to increase for specific important events where the crew seemed to focus on the present task. The physiological stress does not seem to change significantly at specific moments, however, a global pre-post mission measure can be important and for this reason, for long-term missions, a pre-expedition baseline measure is important for crewmembers.

Keywords: Antarctica, cognitive performance, expedition, physiological adaptation, reaction time

Procedia PDF Downloads 230
5764 Molecular Approach for the Detection of Lactic Acid Bacteria in the Kenyan Spontaneously Fermented Milk, Mursik

Authors: John Masani Nduko, Joseph Wafula Matofari

Abstract:

Many spontaneously fermented milk products are produced in Kenya, where they are integral to the human diet and play a central role in enhancing food security and income generation via small-scale enterprises. Fermentation enhances product properties such as taste, aroma, shelf life, safety, texture, and nutritional value. Some of these products have demonstrated therapeutic and probiotic effects, although recent reports have linked some to death, biotoxin infections, and esophageal cancer. These products are mostly processed from poor-quality raw materials under unhygienic conditions, resulting in inconsistent product quality and limited shelf lives. Though the products are very popular, research on their processing technologies is limited, and none of them has been produced under controlled conditions using starter cultures. To modernize the processing technologies for these products, our study aims at describing the microbiology and biochemistry of a representative Kenyan spontaneously fermented milk product, Mursik, using modern biotechnology (DNA sequencing), as well as its chemical composition. Moreover, co-creation processes reflecting stakeholders' experiences of traditional fermented milk production technologies and utilization, ideals, and senses of value, which will allow the generation of products based on common ground for rapid progress, will be discussed. The value of clean starting raw material will be emphasized, the need for the definition of fermentation parameters highlighted, and the employment of standard equipment to attain controlled fermentation discussed. This presentation will review the available information regarding traditional fermented milk (Mursik) and highlight our current research work on the application of molecular approaches (metagenomics) to the valorization of the Mursik production process through the isolation and identification of starter culture/probiotic strains, and on the quality and safety aspects of the product. The importance of the research and future research areas on the same subject will also be highlighted.

Keywords: lactic acid bacteria, high throughput biotechnology, spontaneous fermentation, Mursik

Procedia PDF Downloads 276
5763 The Experience with SiC MOSFET and Buck Converter Snubber Design

Authors: Petr Vaculik

Abstract:

The newest semiconductor devices on the market are MOSFET transistors based on silicon carbide (SiC). This material has exceptional features thanks to which it makes a better switch than a silicon (Si) semiconductor switch. There are some special features that need to be understood to enable the device's use to its full potential. The advantages and differences of SiC MOSFETs in comparison with Si IGBT transistors are described in the first part of this article. The second part describes a driver for the SiC MOSFET transistor, and the last part presents the SiC MOSFET in a buck (step-down) converter application and the design of a simple RC snubber.
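
As background to the snubber design mentioned above, a common rule-of-thumb sizing procedure (given here as a generic illustration, not necessarily the exact procedure used in the article) starts from the measured switch-node ringing frequency f₁ and the frequency f₂ obtained after adding a known capacitance C_add across the switch:

$$C_p = \frac{C_{add}}{\left(f_1/f_2\right)^2 - 1}, \qquad L_p = \frac{1}{(2\pi f_1)^2\,C_p}, \qquad R_s \approx \sqrt{\frac{L_p}{C_p}}, \qquad C_s \approx (3\ldots10)\,C_p,$$

where L_p and C_p are the estimated parasitic inductance and capacitance of the ringing loop, R_s damps the resonance by matching its characteristic impedance, and C_s is chosen large enough to be effective while keeping the extra switching loss, roughly C_s·V²·f_sw, acceptable.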

Keywords: SiC, Si, MOSFET, IGBT, SBD, RC snubber

Procedia PDF Downloads 466
5762 Forms of Promoting and Disseminating Traditional Local Wisdom to Create Occupations among the Elderly in Nonmueng Community, Muang Sub-District, Baan Doong District, Udonthani Province

Authors: Pennapa Palapin

Abstract:

This research sought to study the traditional local wisdom and study the promotion and dissemination of traditional local wisdom in order to find the forms of promotion and dissemination of traditional local wisdom to create occupations among the elderly at Nonmueng Community, Muang Sub-District, Baan Dung District, UdonThani Province. The criterion used to select the research sample group was, being a person having a role involved in the promotion and dissemination of traditional local wisdom to create occupations among the elderly at Nonmueng Community, Muang Sub-District, Baan Dung District, UdonThani Province; being an experienced person whom the residents of Nonmueng Community find trustworthy; and having lived in Nonmueng Community for a long time so as to be able to see the development and change that occurs. A total of 16 people were selected. Data was gathered as a qualitative study, through semi-structured in-depth interviews. The collected data was then summarised and discussed according to the research objectives. Finally, the data was presented in a narrative format. Results found that the identifying traditional local wisdom of the community (which grew from the residents’ experience and beneficial usage in daily life, passed down from generation to generation) was the weaving of cloth and basketry. As for the manner of promotion and dissemination of traditional local wisdom, the skills were passed down through teaching by example to family members, relatives and others in the community. This was done by the elders or elderly members of the community. For the promotion and dissemination of traditional local wisdom to create occupations among the elderly, the traditional local wisdom should be supported in every way through participation of the community members. For example, establish a museum of traditional local wisdom for the collection of traditional local wisdom in various fields, both in the past and at present. This would be a source of pride for the community, in order to make traditional local wisdom widely known and to create income for the community’s elderly. Additional ways include exhibitions of products made by traditional local wisdom, finding both domestic and international markets, as well as building both domestic and international networks aiming to find opportunities to market products made by traditional local wisdom.

Keywords: traditional local wisdom, occupation, elderly, community

Procedia PDF Downloads 287
5761 Bridging the Digital Divide in India: Issues and Challenges

Authors: Parveen Kumar

Abstract:

To cope with the rapid change of technology and to keep pace with the ephemeral rate of information generation, librarians and their professional colleagues need to equip themselves to meet the requirements of the electronic information society. E-learning is based entirely on computer and communication technologies and is often described by terminologies such as computer-based learning. It is the delivery of content via electronic media, including the Internet, intranets, extranets, television broadcast, CD-ROM documents, etc. E-learning poses many issues in the transformation of literature and knowledge from conventional media to ICT-based formats and web-based services.

Keywords: e-learning, digital libraries, online learning, electronic information society

Procedia PDF Downloads 495
5760 A Rapid Colorimetric Assay for Direct Detection of Unamplified Hepatitis C Virus RNA Using Gold Nanoparticles

Authors: M. Shemis, O. Maher, G. Casterou, F. Gauffre

Abstract:

Hepatitis C virus (HCV) is a major cause of chronic liver disease, with 170 million chronic carriers worldwide at risk of developing liver cirrhosis and/or liver cancer. Egypt reports the highest prevalence of HCV worldwide. Currently, two classes of assays are used in the diagnosis and management of HCV infection. Despite the high sensitivity and specificity of the available diagnostic assays, they are time-consuming, labor-intensive, expensive, and require specialized equipment and highly qualified personnel. It is therefore important, in both clinical and economic terms, to develop a low-tech assay for the direct detection of HCV RNA with acceptable sensitivity and specificity, a short turnaround time, and cost-effectiveness. Such an assay would be critical for controlling HCV in developing countries with limited resources and high infection rates, such as Egypt. The unique optical and physical properties of gold nanoparticles (AuNPs) have allowed their use in simple and rapid colorimetric assays for clinical diagnosis, offering higher sensitivity and specificity than current detection techniques. The current research aims to develop a detection assay for HCV RNA using AuNPs. Methods: 200 anti-HCV positive and 50 anti-HCV negative plasma samples were collected from Egyptian patients. HCV viral load was quantified using the m2000rt system (Abbott Molecular Inc., Des Plaines, IL). HCV genotypes were determined using multiplex nested RT-PCR. The assay is based on the aggregation of AuNPs in the presence of the target RNA; aggregation of AuNPs causes a color shift from red to blue. AuNPs were synthesized using the citrate reduction method. Different sets of probes within the conserved 5' UTR region of the HCV genome were designed, grafted onto AuNPs, and optimized for efficient detection of HCV RNA. Results: The nano-gold assay could colorimetrically detect HCV RNA down to 125 IU/ml, with sensitivity and specificity of 91.1% and 93.8%, respectively. The turnaround time of the assay is < 30 min. Conclusions: The assay allows sensitive and rapid detection of HCV RNA and represents an inexpensive and simple point-of-care assay for resource-limited settings.
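As a small aside on how the reported performance figures are derived, the sketch below computes sensitivity and specificity from confusion-matrix counts against the RT-PCR reference. The counts are illustrative placeholders only; the abstract reports the percentages, not the raw table.

    # Illustrative sensitivity/specificity calculation for a diagnostic assay
    # scored against a reference method; the counts below are placeholders,
    # not the study's raw data.
    def assay_performance(tp, fn, tn, fp):
        sensitivity = tp / (tp + fn)   # fraction of reference-positive samples detected
        specificity = tn / (tn + fp)   # fraction of reference-negative samples correctly identified
        return sensitivity, specificity

    sens, spec = assay_performance(tp=182, fn=18, tn=47, fp=3)
    print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")  # ~91% / ~94%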

Keywords: HCV, gold nanoparticles, point of care, viral load

Procedia PDF Downloads 196
5759 Application of Strategic Management Tools

Authors: Abenezer Nigussie

Abstract:

Strategic control is a critical practice, as it strongly steers firms and their production partners towards the full implementation of effective, predetermined plans. The importance of strategic control in a company is often measured by observing the relationship between strategic management and organizational performance. The conventional view of strategic control in academia and industry places significant emphasis on the ability to plan and execute initiatives; the same emphasis on strategic management has received far less attention in the housing industry. Although the pressures of project performance can often obscure the wider social, economic, and professional context in which strategic management is undertaken, it is these broad contextual areas that make strategic control a vital issue for construction businesses. Rapidly changing social and technological conditions are creating an environment that will look very different in the coming decades from that experienced by today's companies. Construction project activity is not adequately guided by strategic management tools; projects are mostly executed through simple plans and schedules. The issue this study addresses is how strategic management tools can successfully support the construction project process. A second important aspect is the evaluation of project activity, which is mostly done through simple economic and technical valuation; in this research, effective strategic management tools are instead evaluated and proposed for the assessment of project activities. The research presents a study of the current strategic management practices of construction companies, introduces the concept of strategic management, and identifies the areas that companies need to address to compete in the global market. A summary of an industry survey is documented, along with the background research that prompted the investigation of these topics, with a focus on the implementation of tools. Strategic management concerns making decisions and taking corrective actions to achieve a company's future goals and objectives. The objective of this paper is to review the practice of strategic management in construction companies. Questionnaires were distributed to major construction companies, listed under the relevant project categories, that were able to report fully on their use of strategic management tools. Findings showed that the majority of the companies surveyed apply strategic management tools in their project processes and implementation.

Keywords: strategic management, management, analysis, project management

Procedia PDF Downloads 49