650 Understanding Complexity at Pre-Construction Stage in Project Planning of Construction Projects
Authors: Mehran Barani Shikhrobat, Roger Flanagan
Abstract:
Construction planning and scheduling based on current tools and techniques is deterministic in nature (Gantt chart, CPM) or assigns only a limited completion probability to each task (PERT). However, every project embodies assumptions and influences and should start with a complete set of clearly defined goals and constraints that remain constant throughout the duration of the project. Construction planners continue to apply the traditional methods and tools of “hard” project management that were developed for “ideal projects,” neglecting the potential influence of complexity on the design and construction process. The aim of this research is to investigate the emergence and growth of complexity in project planning and to provide a model that considers the influence of complexity on the total project duration at the post-contract award pre-construction stage of a project. The literature review showed that complexity originates from different sources: the environment, technical issues, and workflow interactions. These can be divided into two categories of complexity factors: first, project tasks, and second, project organisation and management. Task-related complexity may originate from performance, lack of resources, or environmental changes for a specific task. Complexity factors that relate to organisation and management refer to workflow and the interdependence of different parts. The literature review highlighted the ineffectiveness of traditional tools and techniques in planning for complexity. In this research, the fundamental causes of complexity in construction projects were investigated through a questionnaire with industry experts. The results were used to develop a model that considers the core complexity factors and their interactions. System dynamics was used to investigate the model and to assess the influence of complexity on project planning.
Feedback from experts revealed 20 major complexity factors that impact project planning. The factors are divided into five categories known as core complexity factors. To compare the weight of each factor, the Analytical Hierarchy Process (AHP) was used. The comparison showed that externalities rank as the biggest influence across the complexity factors. The research underlines that many internal and external factors impact project activities and the project overall. This research shows the importance of considering the influence of complexity on the project master plan undertaken at the post-contract award pre-construction phase of a project.
Keywords: project planning, project complexity measurement, planning uncertainty management, project risk management, strategic project scheduling
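The AHP weighting step described above can be sketched as follows. The pairwise comparison matrix below is purely illustrative (the abstract does not reproduce the experts' actual judgments), and the geometric-mean method is used as a standard approximation of Saaty's principal-eigenvector weights.

```python
import math

# Hypothetical pairwise comparison matrix (Saaty 1-9 scale) over five
# core complexity categories; values are invented for illustration only.
categories = ["externalities", "technical", "organisational", "resources", "environment"]
A = [
    [1,   3,   4,   5,   5],
    [1/3, 1,   2,   3,   3],
    [1/4, 1/2, 1,   2,   2],
    [1/5, 1/3, 1/2, 1,   1],
    [1/5, 1/3, 1/2, 1,   1],
]

# Geometric-mean approximation of the principal eigenvector.
geo = [math.prod(row) ** (1 / len(row)) for row in A]
total = sum(geo)
weights = [g / total for g in geo]

for name, w in zip(categories, weights):
    print(f"{name}: {w:.3f}")
```

With judgments of this shape, the "externalities" row dominates every pairwise comparison, so it receives the largest weight, mirroring the ranking the study reports.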
Procedia PDF Downloads 144
649 A Comprehensive Methodology for Voice Segmentation of Large Sets of Speech Files Recorded in Naturalistic Environments
Authors: Ana Londral, Burcu Demiray, Marcus Cheetham
Abstract:
Speech recording is a methodology used in many studies in cognitive and behaviour research. Modern advances in digital equipment have brought the possibility of continuously recording hours of speech in naturalistic environments and building rich sets of sound files. Speech analysis can then extract multiple features from these files for different scopes of research in Language and Communication. However, tools for analysing a large set of sound files and automatically extracting relevant features from them are often inaccessible to researchers who are not familiar with programming languages. Manual analysis is a common alternative, with a high cost in time and efficiency. In the analysis of long sound files, the first step is voice segmentation, i.e. detecting and labelling segments containing speech. We present a comprehensive methodology aiming to support researchers in voice segmentation as the first step of data analysis for a large set of sound files. Praat, an open-source software package, is suggested as a tool to run a voice detection algorithm, label segments and files, and extract other quantitative features across a folder structure containing a large number of sound files. We validated our methodology with a set of 5000 sound files collected in the daily life of a group of voluntary participants aged over 65. A smartphone was used to collect sound using the Electronically Activated Recorder (EAR): an app programmed to record 30-second sound samples randomly distributed throughout the day. Results demonstrated that automatic segmentation and labelling of files containing speech segments was 74% faster than manual analysis performed by two independent coders. Furthermore, the methodology allows manual adjustment of voiced segments with visualisation of the sound signal and automatic extraction of quantitative information on speech.
In conclusion, we propose a comprehensive methodology for voice segmentation, to be used by researchers who have to work with large sets of sound files and are not familiar with programming tools.
Keywords: automatic speech analysis, behavior analysis, naturalistic environments, voice segmentation
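As a minimal illustration of the voice-segmentation step (independent of Praat, whose intensity-based detector the methodology actually uses), a frame-wise RMS-energy threshold can already separate speech-like frames from silence. The frame length and threshold below are illustrative assumptions, not the study's settings.

```python
import math

def segment_voiced(samples, rate, frame_ms=25, threshold=0.02):
    """Return (start_s, end_s) intervals whose frame RMS energy exceeds threshold."""
    frame = int(rate * frame_ms / 1000)
    segments, start = [], None
    for i in range(0, len(samples) - frame + 1, frame):
        chunk = samples[i:i + frame]
        rms = math.sqrt(sum(x * x for x in chunk) / frame)
        if rms > threshold and start is None:
            start = i / rate                      # segment opens
        elif rms <= threshold and start is not None:
            segments.append((start, i / rate))    # segment closes
            start = None
    if start is not None:
        segments.append((start, len(samples) / rate))
    return segments

# Synthetic 3-second clip: 1 s silence, 1 s tone standing in for speech, 1 s silence.
rate = 8000
samples = [0.0] * rate
samples += [0.5 * math.sin(2 * math.pi * 220 * t / rate) for t in range(rate)]
samples += [0.0] * rate
print(segment_voiced(samples, rate))
```

On the synthetic clip, the detector recovers a single voiced interval from roughly 1.0 s to 2.0 s; real recordings would additionally need smoothing across short pauses, which Praat's detector handles.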
Procedia PDF Downloads 283
648 Engineering of Reagentless Fluorescence Biosensors Based on Single-Chain Antibody Fragments
Authors: Christian Fercher, Jiaul Islam, Simon R. Corrie
Abstract:
Fluorescence-based immunodiagnostics are an emerging field in biosensor development and exhibit several advantages over traditional detection methods. While various affinity biosensors have been developed to generate a fluorescence signal upon sensing varying concentrations of analytes, reagentless, reversible, and continuous monitoring of complex biological samples remains challenging. Here, we aimed to genetically engineer biosensors based on single-chain antibody fragments (scFv) that are site-specifically labeled with environmentally sensitive fluorescent unnatural amino acids (UAA). A rational design approach resulted in quantifiable analyte-dependent changes in peak fluorescence emission wavelength and enabled antigen detection in vitro. Incorporation of a polarity indicator within the topological neighborhood of the antigen-binding interface generated a titratable wavelength blueshift with nanomolar detection limits. In order to ensure continuous analyte monitoring, scFv candidates with fast binding and dissociation kinetics were selected from a genetic library employing a high-throughput phage display and affinity screening approach. Initial rankings were further refined towards rapid dissociation kinetics using bio-layer interferometry (BLI) and surface plasmon resonance (SPR). The most promising candidates were expressed, purified to homogeneity, and tested for their potential to detect biomarkers in a continuous microfluidic-based assay. Variations of dissociation kinetics within an order of magnitude were achieved without compromising the specificity of the antibody fragments. This approach is generally applicable to numerous antibody/antigen combinations and currently awaits integration in a wide range of assay platforms for one-step protein quantification.
Keywords: antibody engineering, biosensor, phage display, unnatural amino acids
Procedia PDF Downloads 147
647 The ‘Quartered Head Technique’: A Simple, Reliable Way of Maintaining Leg Length and Offset during Total Hip Arthroplasty
Authors: M. Haruna, O. O. Onafowokan, G. Holt, K. Anderson, R. G. Middleton
Abstract:
Background: Requirements for satisfactory outcomes following total hip arthroplasty (THA) include restoration of femoral offset, version, and leg length. Various techniques have been described for restoring these biomechanical parameters, with leg length restoration the most commonly described. We describe a “quartered head technique” (QHT) which uses a stepwise series of femoral head osteotomies to identify and preserve the centre of rotation of the femoral head during THA, in order to ensure reconstruction of leg length, offset, and stem version, such that hip biomechanics are restored as near to normal as possible. This study aims to identify whether using the QHT during hip arthroplasty effectively restores leg length and femoral offset to within acceptable parameters. Methods: A retrospective review of 206 hips was carried out, leaving 124 hips in the final analysis. Power analysis indicated that a minimum of 37 patients was required. All operations were performed using an anterolateral approach by a single surgeon. All femoral implants were cemented, collarless, polished double-taper CPT® stems (Zimmer, Swindon, UK). Both cemented and uncemented acetabular components were used (Zimmer, Swindon, UK). Leg length, version, and offset were assessed intra-operatively and reproduced using the QHT. Post-operative leg length and femoral offset were determined and compared with the contralateral native hip, and the difference was then calculated. For the determination of leg length discrepancy (LLD), we used the method described by Williamson & Reckling, which has been shown to be reproducible with a measurement error of ±1 mm. As references, the inferior margin of the acetabular teardrop and the most prominent point of the lesser trochanter were used. An LLD of less than 6 mm was chosen as acceptable. All peri-operative radiographs were assessed by two independent observers.
Results: The mean absolute post-operative difference in leg length from the contralateral leg was +3.58 mm. 84% of patients (104/124) had an LLD within ±6 mm of the contralateral limb. The mean absolute post-operative difference in offset from the contralateral leg was +3.88 mm (range -15 to +9 mm, median 3 mm). 90% of patients (112/124) were within ±6 mm of the contralateral limb's offset. There was no statistical difference noted between observer measurements. Conclusion: The QHT provides a simple, inexpensive, yet effective method of maintaining femoral leg length and offset during total hip arthroplasty. Combining this technique with pre-operative templating or other described techniques may enable surgeons to further reduce the discrepancies between the pre-operative state and the post-operative outcome.
Keywords: leg length discrepancy, technical tip, total hip arthroplasty, operative technique
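The Williamson & Reckling measurement used above reduces to point-to-line geometry: the perpendicular distance from each lesser trochanter to the inter-teardrop line is compared between sides. A sketch with hypothetical radiograph coordinates (in mm), not patient data from the study:

```python
import math

def point_line_distance(p, a, b):
    """Perpendicular distance from point p to the line through a and b (mm)."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    num = abs((by - ay) * px - (bx - ax) * py + bx * ay - by * ax)
    return num / math.hypot(bx - ax, by - ay)

# Hypothetical AP pelvis landmarks: acetabular teardrops define the reference
# line; lesser trochanters are the measured points on each femur.
teardrop_l, teardrop_r = (40.0, 100.0), (160.0, 100.0)
troch_l, troch_r = (55.0, 158.0), (150.0, 154.0)

lld = (point_line_distance(troch_r, teardrop_l, teardrop_r)
       - point_line_distance(troch_l, teardrop_l, teardrop_r))
print(f"LLD: {lld:+.1f} mm, within ±6 mm: {abs(lld) <= 6}")
```

With these illustrative landmarks, the operated side measures 4 mm shorter than the contralateral side, which would fall inside the study's ±6 mm acceptability window.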
Procedia PDF Downloads 84
646 Silicon-Photonic-Sensor System for Botulinum Toxin Detection in Water
Authors: Binh T. T. Nguyen, Zhenyu Li, Eric Yap, Yi Zhang, Ai-Qun Liu
Abstract:
The silicon-photonic-sensor system is an emerging class of analytical technology that uses the evanescent field to sensitively measure slight differences in the surrounding environment. The wavelength shift induced by a local refractive index change is used as the indicator in the system. These devices can serve as sensors for a wide variety of chemical or biomolecular detection tasks in clinical and environmental fields. In our study, a system comprising a silicon-based micro-ring resonator, a microfluidic channel, and optical processing is designed and fabricated for biomolecule detection. The system is demonstrated to detect Clostridium botulinum type A neurotoxin (BoNT) in different water sources. BoNT is one of the most toxic substances known and is relatively easily obtained from a cultured bacterial source. The toxin is extremely lethal, with an LD50 of about 0.1 µg/70 kg intravenously, 1 µg/70 kg by inhalation, and 70 µg/kg orally. These factors make botulinum neurotoxins primary candidates as bioterrorism or biothreat agents. A sensing system is therefore required that can detect BoNT quickly, with high sensitivity, and automatically. For BoNT detection, the silicon-based micro-ring resonator is modified with a linker for immobilization of the anti-botulinum capture antibody. An enzymatic reaction is employed to amplify the signal and hence gain sensitivity. As a result, a detection limit of 30 pg/mL is achieved by our silicon-photonic sensor within a short period of 80 min. The sensor also shows high specificity against other botulinum serotypes. In the future, by designing a multifunctional waveguide array with a fully automatic control system, it will be simple to simultaneously detect multiple biomaterials at low concentration within a short period. The system has great potential for online, real-time, highly sensitive, and label-free rapid biomolecular detection.
Keywords: biotoxin, photonic, ring resonator, sensor
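The index-to-wavelength relation underlying such sensors can be sketched to first order as Δλ = λ·Δn_eff/n_g. The numerical values below are generic silicon-photonics assumptions, not the measured parameters of the device described above.

```python
# First-order estimate of a micro-ring resonance shift from a change in
# the effective index of the guided mode; all constants are illustrative.
wavelength_nm = 1550.0   # operating wavelength (telecom C-band)
n_g = 4.2                # group index of a typical silicon strip waveguide

def resonance_shift_nm(delta_n_eff):
    """d(lambda) = lambda * d(n_eff) / n_g."""
    return wavelength_nm * delta_n_eff / n_g

# An antibody-antigen binding event changing the local index by 1e-4 RIU:
print(f"shift = {resonance_shift_nm(1e-4):.4f} nm")
```

A 1e-4 RIU change thus shifts the resonance by a few hundredths of a nanometre, which is why such systems pair the ring with high-resolution wavelength interrogation, and why the study adds enzymatic amplification to reach pg/mL limits.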
Procedia PDF Downloads 118
645 Sliding Mode Power System Stabilizer for Synchronous Generator Stability Improvement
Authors: J. Ritonja, R. Brezovnik, M. Petrun, B. Polajžer
Abstract:
Many modern synchronous generators in power systems are extremely weakly damped. The reasons are cost optimization of the machine construction and the introduction of additional control equipment into power systems. Oscillations of synchronous generators and the related stability problems of power systems are harmful and can lead to failures in operation and to damage. The only practical solution to increase the damping of these unwanted oscillations is the implementation of power system stabilizers. Power system stabilizers generate an additional control signal which changes the synchronous generator field excitation voltage. Modern power system stabilizers are integrated into the static excitation systems of synchronous generators. Available commercial power system stabilizers are based on linear control theory. Due to the nonlinear dynamics of the synchronous generator, current stabilizers do not assure optimal damping of the synchronous generator's oscillations over the entire operating range. For that reason, the use of robust power system stabilizers suitable for the entire operating range is reasonable. There are numerous robust techniques applicable to power system stabilizers. In this paper, the use of sliding mode control for synchronous generator stability improvement is studied. On the basis of sliding mode theory, a robust power system stabilizer was developed. The main advantages of the sliding mode controller are simple realization of the control algorithm, robustness to parameter variations, and elimination of disturbances. The advantage of the proposed sliding mode controller over a conventional linear controller was tested for damping of the synchronous generator oscillations over the entire operating range. The obtained results show improved damping over the entire operating range of the synchronous generator and an increase in power system stability.
The proposed study contributes to the development of advanced stabilizers, which will replace conventional linear stabilizers and improve the damping of synchronous generators.
Keywords: control theory, power system stabilizer, robust control, sliding mode control, stability, synchronous generator
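The sliding-mode idea can be illustrated on a linearised one-machine swing model: a sliding surface s = c·Δδ + Δω and a switching control u = −k·sign(s). All parameters below (inertia, damping, gains) are illustrative assumptions, not the paper's stabiliser design.

```python
import math

H, D, Ks = 5.0, 0.5, 1.2   # inertia constant, damping, synchronising coefficient
c, k = 0.5, 2.0            # sliding-surface slope and switching gain

def simulate(use_smc, t_end=10.0, dt=1e-3):
    """Euler-integrate the linearised swing equation; return final |angle error|."""
    delta, omega = 0.2, 0.0                  # initial rotor-angle disturbance (rad)
    for _ in range(int(t_end / dt)):
        s = c * delta + omega                # sliding surface
        u = -k * math.copysign(1.0, s) if use_smc else 0.0
        domega = (-Ks * delta - D * omega + u) / (2 * H)
        delta += dt * omega
        omega += dt * domega
    return abs(delta)

print("final |delta| without SMC:", simulate(False))
print("final |delta| with SMC:   ", simulate(True))
```

Once the trajectory reaches s = 0, the angle error decays along the surface at the rate set by c, regardless of the (here lightly damped) open-loop dynamics; this insensitivity to plant parameters is the robustness property the abstract refers to. A practical stabiliser would replace sign(s) with a boundary-layer approximation to suppress chattering.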
Procedia PDF Downloads 227
644 Integrating One Health Approach with National Policies to Improve Health Security post-COVID-19 in Vietnam
Authors: Yasser Sanad, Thu Trang Dao
Abstract:
Introduction: Implementing the One Health (OH) approach requires an integrated, interdisciplinary, and cross-sectoral methodology. OH is a key tool for developing and implementing programs and projects and includes developing ambitious policies that consider the common needs and benefits of human, animal, plant, and ecosystem health. OH helps humanity readjust its path to environmentally friendly and impartial sustainability. As co-leader of the Global Health Security Agenda’s Zoonotic Disease Action Package, Vietnam pioneered a strong OH approach to effectively address early waves of the COVID-19 outbreak in-country. Context and Aim: The repeated surges in COVID-19 in Vietnam challenged the capabilities of the national system and disclosed the gaps in multi-sectoral coordination and resilience. To address this, FHI 360 advocated for the standardization of the OH platform by government actors to increase the resiliency of the system during and post COVID-19. Methods: FHI 360 coordinated technical resources to develop and implement evidence-based OH policies, promoting high-level policy dialogue between the Ministries of Health, Agriculture, and the Environment, and policy research to inform developed policies and frameworks. Through discussions, an OH-building Partnership (OHP) was formed, linking climate change, the environment, and human and animal health. Findings: The OHP Framework created a favorable policy environment within and between sectors, as well as between governments and international health security partners. It also promoted strategic dialogue, resource mobilization, policy advocacy, and integration of international systems with National Steering Committees to ensure accountability and emphasize national ownership. Innovative contribution to policy, practice and/or research: OHP was an effective evidence-based research-to-policy platform linking to the National One Health Strategic Plan (2021-2025). 
Collectively, they serve as a national framework for the implementation and monitoring of OH activities. Through the adoption of policies and plans, the risk of zoonotic pathogens, environmental agent spillover, and antimicrobial resistance can be minimized by strengthening multi-sectoral OH collaboration for health security.
Keywords: one health, national policies, health security, COVID-19, Vietnam
Procedia PDF Downloads 107
643 A Metric to Evaluate Conventional and Electrified Vehicles in Terms of Customer-Oriented Driving Dynamics
Authors: Stephan Schiffer, Andreas Kain, Philipp Wilde, Maximilian Helbing, Bernard Bäker
Abstract:
Automobile manufacturers progressively focus on a downsizing strategy to meet the EU's CO2 requirements concerning type-approval consumption cycles. The reduction in naturally aspirated engine power is compensated by increased levels of turbocharging. By downsizing conventional engines, CO2 emissions are reduced. However, downsizing also entails major challenges regarding longitudinal dynamic characteristics. An example of this is the delayed turbocharger-induced torque reaction, which leads to partially poor response behavior of the vehicle during acceleration. That is why it is important to focus conventional drive train design on real customer driving again. The dynamic maneuvers currently discussed by journals and car manufacturers, such as the 0-100 km/h acceleration time, describe the longitudinal dynamics experienced by a driver inadequately. For that reason, we present the realization and evaluation of a comprehensive subject study. Subjects are provided with different vehicle concepts (electrified vehicles, vehicles with naturally aspirated engines, vehicles with different turbocharger concepts, etc.) in order to find out which dynamic criteria are decisive for a subjectively strong acceleration and response behavior of a vehicle. Subsequently, realistic acceleration criteria are derived. By weighting the criteria, an evaluation metric is developed to objectify customer-oriented transient dynamics. Fully-electrified vehicles are the benchmark in terms of customer-oriented longitudinal dynamics. The electric machine provides the desired torque almost without delay. This advantage over combustion engines is especially noticeable at low engine speeds. In conclusion, we will show to what extent the customer-relevant longitudinal dynamics of conventional vehicles can be approximated to electrified vehicle concepts. Therefore, various technical measures (turbocharger concepts, 48V electrical chargers, etc.)
and drive train designs (e.g. varying the final drive) are presented and evaluated in order to strengthen the vehicle’s customer-relevant transient dynamics. The newly developed evaluation metric is used as the rating measure.
Keywords: 48V, customer-oriented driving dynamics, electric charger, electrified vehicles, vehicle concepts
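A weighted-criteria metric of the kind described can be sketched as a normalised weighted sum. The criteria names, weights, vehicle values, and normalisation ranges below are invented for illustration, since the paper derives its actual weights from the subject study.

```python
# Hypothetical customer-oriented longitudinal-dynamics score: each criterion
# is min-max normalised to [0, 1] and combined with a weight; for criteria
# where smaller is better (delays, elapsed times), the scale is inverted.
criteria = {                      # name: (weight, lower_is_better)
    "response_delay_s":  (0.40, True),
    "time_60_100_kmh_s": (0.35, True),
    "peak_accel_m_s2":   (0.25, False),
}

def score(vehicle, ranges):
    total = 0.0
    for name, (w, lower_better) in criteria.items():
        lo, hi = ranges[name]
        x = (vehicle[name] - lo) / (hi - lo)
        total += w * ((1 - x) if lower_better else x)
    return total

ranges = {"response_delay_s": (0.05, 1.2),
          "time_60_100_kmh_s": (3.0, 10.0),
          "peak_accel_m_s2": (2.0, 8.0)}

ev = {"response_delay_s": 0.08, "time_60_100_kmh_s": 4.5, "peak_accel_m_s2": 6.0}
turbo = {"response_delay_s": 0.90, "time_60_100_kmh_s": 5.5, "peak_accel_m_s2": 5.0}

print("EV score:   ", round(score(ev, ranges), 3))
print("Turbo score:", round(score(turbo, ranges), 3))
```

With these invented numbers the electric vehicle scores higher, driven mostly by its near-zero response delay, which matches the abstract's point that the electric machine's immediate torque makes it the benchmark.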
Procedia PDF Downloads 407
642 DNA Methylation Changes in Response to Ocean Acidification at the Time of Larval Metamorphosis in the Edible Oyster, Crassostrea hongkongensis
Authors: Yong-Kian Lim, Khan Cheung, Xin Dang, Steven Roberts, Xiaotong Wang, Vengatesen Thiyagarajan
Abstract:
The unprecedented rate of CO₂ increase in the ocean and the subsequent changes in the carbonate system, including decreased pH, known as ocean acidification (OA), are predicted to disrupt not only the calcification process but also several other physiological and developmental processes in a variety of marine organisms, including edible oysters. Nonetheless, not all species are vulnerable to these OA threats; some species may be able to cope with OA stress through environmentally induced modifications of gene and protein expression. For example, external environmental stressors, including OA, can influence the addition and removal of methyl groups through epigenetic modification (e.g., DNA methylation) to turn gene expression on or off as part of a rapid adaptive mechanism to cope with OA. In this study, the above hypothesis was tested by examining the effect of OA, using decreased pH 7.4 as a proxy, on the DNA methylation pattern of an endemic and commercially important estuary oyster species, Crassostrea hongkongensis, at the time of larval habitat selection and metamorphosis. Larval growth rate did not differ between the control pH 8.1 and the treatment pH 7.4. The metamorphosis rate of the pediveliger larvae was higher at pH 7.4 than in the control at pH 8.1; however, over one-third of the larvae raised at pH 7.4 failed to attach to an optimal substrate as defined by biofilm presence. During larval development, a total of 130 genes were differentially methylated across the two treatments. The differential methylation in the larval genes may have partially accounted for the higher metamorphosis success rate under decreased pH 7.4 but with poor substratum selection ability. Differentially methylated loci were concentrated in exon regions and appear to be associated with cytoskeletal and signal transduction, oxidative stress, metabolic processes, and larval metamorphosis, which implies the high potential of C. hongkongensis larvae to acclimate and adapt through non-genetic means to OA threats within a single generation.
Keywords: adaptive plasticity, DNA methylation, larval metamorphosis, ocean acidification
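Per-locus differential methylation between two treatments is commonly tested with a Fisher's exact test on methylated vs unmethylated read counts. The sketch below uses invented counts and a stdlib-only implementation; it is a generic illustration of the statistical step, not the paper's actual pipeline.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """2x2 table [[a, b], [c, d]]; two-sided p-value by summing all tables
    with hypergeometric probability <= that of the observed table."""
    row1, row2, col1, n = a + b, c + d, a + c, a + b + c + d

    def p_table(x):  # probability of the table whose top-left cell is x
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    lo, hi = max(0, col1 - row2), min(row1, col1)
    return sum(p for x in range(lo, hi + 1)
               if (p := p_table(x)) <= p_obs + 1e-12)

# Invented methylated / unmethylated read counts at one CpG locus:
# control pH 8.1 -> (18 methylated, 2 unmethylated)
# treatment pH 7.4 -> (6 methylated, 14 unmethylated)
p = fisher_exact_two_sided(18, 2, 6, 14)
print(f"p = {p:.2e}")
```

A locus with counts this skewed would be called differentially methylated at any conventional threshold; in a genome-wide analysis such p-values would additionally be corrected for multiple testing across all tested loci.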
Procedia PDF Downloads 141
641 Efficient Residual Road Condition Segmentation Network Based on Reconstructed Images
Authors: Xiang Shijie, Zhou Dong, Tian Dan
Abstract:
This paper focuses on the application of real-time semantic segmentation technology to complex road condition recognition, aiming to address the critical issue of how to improve segmentation accuracy while ensuring real-time performance. Semantic segmentation technology has broad application prospects in fields such as autonomous vehicle navigation and remote sensing image recognition. However, current real-time semantic segmentation networks face significant technical challenges and optimization gaps in balancing speed and accuracy. To tackle this problem, this paper conducts an in-depth study and proposes an innovative Guided Image Reconstruction Module. By resampling high-resolution images into a set of low-resolution images, this module effectively reduces computational complexity, allowing the network to extract features more efficiently within limited resources, thereby improving the performance of real-time segmentation tasks. In addition, a dual-branch network structure is designed to fully leverage the advantages of different feature layers. A novel Hybrid Attention Mechanism is also introduced, which can dynamically capture multi-scale contextual information and effectively enhance the focus on important features, thus improving the segmentation accuracy of the network under complex road conditions. Compared with traditional methods, the proposed model achieves a better balance between accuracy and real-time performance and demonstrates competitive results in road condition segmentation tasks. Experimental results show that this method not only significantly improves segmentation accuracy while maintaining real-time performance, but also remains stable across diverse and complex road conditions, making it highly applicable in practical scenarios.
By incorporating the Guided Image Reconstruction Module, the dual-branch structure, and the Hybrid Attention Mechanism, this paper presents a novel approach to real-time semantic segmentation tasks, which is expected to further advance the development of this field.
Keywords: hybrid attention mechanism, image reconstruction, real-time, road status recognition
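The resampling idea behind the Guided Image Reconstruction Module, splitting one high-resolution image into a set of low-resolution images, resembles a space-to-depth ("pixel unshuffle") rearrangement, which is lossless; whether the module uses exactly this operator is an assumption on our part.

```python
# Space-to-depth sketch: one H x W image becomes r*r sub-images of size
# (H/r) x (W/r), each a strided sub-sampling of the original. No pixels
# are discarded, so the full image is recoverable from the set.
def space_to_depth(img, r):
    """img: H x W list-of-lists; returns a list of r*r low-resolution sub-images."""
    h, w = len(img), len(img[0])
    assert h % r == 0 and w % r == 0
    return [
        [[img[y * r + dy][x * r + dx] for x in range(w // r)]
         for y in range(h // r)]
        for dy in range(r) for dx in range(r)
    ]

img = [[y * 4 + x for x in range(4)] for y in range(4)]  # 4x4 test image
subs = space_to_depth(img, 2)
print(len(subs), "sub-images of size", len(subs[0]), "x", len(subs[0][0]))
print(subs[0])
```

Each sub-image can be processed at a quarter of the original spatial cost, which is the complexity reduction the abstract attributes to the module; the network can later fuse or reconstruct the full-resolution prediction from the set.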
Procedia PDF Downloads 25
640 Camptothecin Promotes ROS-Mediated G2/M Phase Cell Cycle Arrest, Resulting from Autophagy-Mediated Cytoprotection
Authors: Rajapaksha Gedara Prasad Tharanga Jayasooriya, Matharage Gayani Dilshara, Yung Hyun Choi, Gi-Young Kim
Abstract:
Camptothecin (CPT) is a quinoline alkaloid that inhibits DNA topoisomerase I and induces cytotoxicity in a variety of cancer cell lines. We previously showed that CPT effectively inhibited invasion of prostate cancer cells and that combined treatment with subtoxic doses of CPT and TNF-related apoptosis-inducing ligand (TRAIL) potently enhanced apoptosis in a caspase-dependent manner in hepatoma cancer cells. Here, we found that treatment with CPT caused an irreversible cell cycle arrest in the G2/M phase. CPT-induced cell cycle arrest was associated with a decrease in the protein level of cell division cycle 25C (Cdc25C) and increased levels of cyclin B and p21. The CPT-induced decrease in Cdc25C was blocked in the presence of the proteasome inhibitor MG132, which reversed the cell cycle arrest. In addition, the CPT-induced increase in Cdc25C phosphorylation resulted from activation of checkpoint kinase 2 (Chk2), which was associated with phosphorylation of ataxia telangiectasia-mutated. Interestingly, the CPT-induced G2/M phase cell cycle arrest is reactive oxygen species (ROS) dependent: the ROS inhibitors NAC and GSH reversed the CPT-induced cell cycle arrest. These results were further confirmed using transient knockdown of nuclear factor-erythroid 2-related factor 2 (Nrf2), since it regulates the production of ROS. Our data reveal that treatment with siNrf2 increased the ROS level and further increased the CPT-induced G2/M phase cell cycle arrest. Our data also indicate that CPT enhanced cell cycle arrest through the extracellular signal-regulated kinase (ERK) and c-Jun N-terminal kinase (JNK) pathways. Inhibitors of ERK and JNK further decreased Cdc25C expression and the protein expression of p21 and cyclin B. These findings indicate that Chk2-mediated phosphorylation of Cdc25C plays a major role in G2/M arrest by CPT.
Keywords: camptothecin, cell cycle, checkpoint kinase 2, nuclear factor-erythroid 2-related factor 2, reactive oxygen species
Procedia PDF Downloads 442
639 Brain-Computer Interfaces That Use Electroencephalography
Authors: Arda Ozkurt, Ozlem Bozkurt
Abstract:
Brain-computer interfaces (BCIs) are devices that output commands by interpreting data collected from the brain. Electroencephalography (EEG) is a non-invasive method of measuring the brain's electrical activity. Since it was invented by Hans Berger in 1929, it has led to many neurological discoveries and has become one of the essential non-invasive measurement methods. Although it has a low spatial resolution (it can only detect when a group of neurons fires at the same time), it is non-invasive, making it easy to use without posing any risks. In EEG, electrodes are placed on the scalp, and the voltage difference between a minimum of two electrodes is recorded, which is then used to accomplish the intended task. EEG recordings include, but are not limited to, the currents along dendrites from synapses to the soma, the action potentials along the axons connecting neurons, and the currents through the synaptic clefts connecting axons with dendrites. However, because EEG is non-invasive, there are several sources of noise that may affect the reliability of the signals. For instance, noise from the EEG equipment, the leads, and signals originating from the subject (such as heart activity or muscle movements) affects the signals detected by the electrodes. New techniques have been developed to differentiate between such noise and the intended signals. Furthermore, an EEG device alone is not enough to analyse brain data for use in a BCI application. Because the EEG signal is very complex, artificial intelligence algorithms are required to analyse it. These algorithms convert complex data into meaningful and useful information that neuroscientists can use to design BCI devices.
Even though invasive BCIs are needed for neurological applications that require highly precise data, non-invasive BCIs such as EEG-based systems are used in many cases to help disabled people or to ease everyday life by assisting with basic tasks. For example, EEG is used to detect an impending seizure in epilepsy patients, which can then be prevented with the help of a BCI device. Overall, EEG is a commonly used non-invasive BCI technique that has helped develop BCIs and will continue to be used to collect data that eases people's lives as more BCI techniques are developed in the future.
Keywords: BCI, EEG, non-invasive, spatial resolution
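A toy sketch of the kind of amplitude-based trigger a seizure-warning BCI might build on: sliding-window RMS amplitude against a threshold. The window length and threshold are illustrative, not clinical values, and real systems use far richer features and learned classifiers.

```python
import math

def rms_windows(signal, window):
    """Non-overlapping window RMS amplitudes of the signal."""
    return [math.sqrt(sum(x * x for x in signal[i:i + window]) / window)
            for i in range(0, len(signal) - window + 1, window)]

def detect_event(signal, window=250, threshold=3.0):
    """Index of the first window whose RMS exceeds threshold, else None."""
    for i, rms in enumerate(rms_windows(signal, window)):
        if rms > threshold:
            return i
    return None

# Synthetic trace at 250 Hz: 3 s of unit-amplitude background rhythm,
# then 1 s of a high-amplitude slow burst standing in for abnormal activity.
rate = 250
background = [math.sin(2 * math.pi * 10 * t / rate) for t in range(3 * rate)]
burst = [6 * math.sin(2 * math.pi * 3 * t / rate) for t in range(rate)]
print(detect_event(background + burst))
```

On the synthetic trace the detector fires in the fourth one-second window, where the burst begins; the AI algorithms the abstract mentions replace this fixed threshold with classifiers trained on many labelled recordings.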
Procedia PDF Downloads 75
638 Familiarity with Engineering Project Management and Their Duties in Projects
Authors: Mokhtar Nikgoo
Abstract:
Today's industrial world has undergone tremendous changes in certain periods. These changes are called environmental changes, and they have a direct impact on organizations and institutions; the importance of understanding them is therefore clear. This importance has caused manufacturing organizations to move towards multiple products and to constantly change and expand their systems. This research tries to show how an organization moves in this setting by defining the basic steps of implementing a project. One of the most important features of a make-to-order production organization is the definition of different production projects for different customers. A lack of sufficient understanding of the type of work affects how a project is defined for the organization in question, and the managers of the organization (at every organizational level) are constantly involved with different projects. In implementing the production projects of such organizations, directing the facilities and people of the organization towards the implementation of the project is of particular importance. Therefore, it is necessary to define the project manager and his basic duties. Considering the importance of this topic, this chapter deals with project management and its importance and examines the relevant issues from the perspective of implementation. A project comprises certain activities of the organization that require the use of different resources, with all the activities of the organization directed towards implementing the project with the defined facilities and at the designated times. Project management is the planning, organizing, and controlling of the organization's resources for short-term and medium-term goals and objectives. Project management has the important task of centering and integrating (coordinating) task and line managers.
In other words, project management requires a strong and appropriate relationship with the internal people of the system to carry out the assigned activities, and the project manager must have general and technical knowledge of the various activities in the project environment. It seems that everything in project management comes down to communication. One of the characteristics of make-to-order production organizations is the relationship between the customer (or customers) and the organization until the completion of the defined project. Due to the nature of the work, one person must establish this relationship between the client and the organization's people, and must do so in a way that does not cause a lack of coordination in the organization's activities. Therefore, project management has a very important role at this stage, because any problems or points of view the client raises must be communicated to management so that, after analysis, they can be transferred through the appropriate processes to other departments and line managers.
Keywords: project management, crisis management, project delays bill, project duration
Procedia PDF Downloads 606
637 Pattern of Anisometropia, Management and Outcome of Anisometropic Amblyopia
Authors: Husain Rajib, T. H. Sheikh, D. G. Jewel
Abstract:
Background: Amblyopia is a frequent cause of monocular blindness in children. It can be a unilateral or bilateral reduction of best-corrected visual acuity associated with a decrement in visual processing, accommodation, motility, spatial perception, or spatial projection. Anisometropia is an important risk factor for amblyopia: it develops when unequal refractive error blurs the image during the critical developmental period, with central inhibition of the visual signal originating from the affected eye, and is associated with significant visual problems including aniseikonia, strabismus, and reduced stereopsis. Methods: This is a prospective hospital-based study of newly diagnosed amblyopia seen at the pediatric clinic of Chittagong Eye Infirmary & Training Complex. Fifty anisometropic amblyopia subjects were examined, and a questionnaire was piloted. Included were all patients diagnosed with refractive amblyopia between 3 and 13 years of age, without previous amblyopia treatment, and whose parents were willing to participate in the study. Patients diagnosed with strabismic amblyopia were excluded. Patients were first given the best correction for a month. When the visual acuity in the amblyopic eye did not improve over that month, occlusion treatment was started. Occlusion was done daily for 6-8 hours (full time) together with vision therapy and was carried out for 3 months. Results: In this study, about 8% of subjects had anisometropia from myopia, 18% from hyperopia, and 74% from astigmatism. The initial mean visual acuity was 0.74 ± 0.39 LogMAR; after amblyopia therapy with active vision therapy, the mean visual acuity was 0.34 ± 0.26 LogMAR. About 94% of subjects improved by at least two lines. The depth of amblyopia was associated with the type of anisometropic refractive error and the magnitude of anisometropia (p < 0.005). The study found 10% mild, 64% moderate, and 26% severe amblyopia.
Binocular function also decreases with the magnitude of anisometropia. Conclusion: Anisometropic amblyopia is a most important factor in the pediatric age group because it can lead to visual impairment. Occlusion therapy with at least one instructed hour of active visual activity practiced outside school hours was effective in anisometropic amblyopes who were diagnosed at the age of 8 years and older, and the patients complied well with the treatment.
Keywords: refractive error, anisometropia, amblyopia, strabismic amblyopia
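The acuity change reported above can be restated in chart lines; a minimal sketch, assuming only the standard convention that one chart line equals 0.1 LogMAR:

```python
def logmar_improvement_in_lines(before: float, after: float) -> float:
    """Number of chart lines gained, assuming 1 line = 0.1 LogMAR."""
    return round((before - after) / 0.1, 1)

# Mean values reported in the abstract: 0.74 LogMAR before therapy, 0.34 after.
print(logmar_improvement_in_lines(0.74, 0.34))  # → 4.0 lines on average
```

This makes the headline result concrete: the mean gain of four lines is consistent with the report that about 94% of subjects improved by at least two lines.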
Procedia PDF Downloads 276
636 Application of Artificial Intelligence in Market and Sales Network Management: Opportunities, Benefits, and Challenges
Authors: Mohamad Mahdi Namdari
Abstract:
In today's rapidly changing and evolving business competition, companies and organizations require advanced and efficient tools to manage their markets and sales networks. Big data analysis, quick response in competitive markets, process and operations optimization, and forecasting customer behavior are among the concerns of executive managers. Artificial intelligence, as one of the emerging technologies, has provided extensive capabilities in this regard. The use of artificial intelligence in market and sales network management can lead to improved efficiency, increased decision-making accuracy, and enhanced customer satisfaction. Specifically, AI algorithms can analyze vast amounts of data, identify complex patterns, and offer strategic suggestions to improve sales performance. However, many companies are still distant from effectively leveraging this technology, and those that do face challenges in fully exploiting AI's potential in market and sales network management. It appears that the general public's and even the managerial and academic communities' lack of knowledge of this technology has caused the managerial structure to lag behind the progress and development of artificial intelligence. Additionally, high costs, fear of change and employee resistance, lack of quality data production processes, the need for updating structures and processes, implementation issues, the need for specialized skills and technical equipment, and ethical and privacy concerns are among the factors preventing widespread use of this technology in organizations. Clarifying and explaining this technology, especially to the academic, managerial, and elite communities, can pave the way for a transformative beginning. The aim of this research is to elucidate the capacities of artificial intelligence in market and sales network management, identify its opportunities and benefits, and examine the existing challenges and obstacles. 
This research aims to leverage AI capabilities to provide a framework for enhancing market and sales network performance for managers. The results of this research can help managers and decision-makers adopt more effective strategies for business growth and development by better understanding the capabilities and limitations of artificial intelligence.
Keywords: artificial intelligence, market management, sales network, big data analysis, decision-making, digital marketing
Procedia PDF Downloads 47
635 Estimation of the Exergy-Aggregated Value Generated by a Manufacturing Process Using the Theory of the Exergetic Cost
Authors: German Osma, Gabriel Ordonez
Abstract:
The production of metal-rubber spares for vehicles is a sequential process that consists of transforming raw material through cutting activities and chemical and thermal treatments, which demand electricity and fossil fuels. Energy efficiency analysis in these cases mostly focuses on studying each machine or production step; it is less common to study the quality that the production process achieves from an aggregated-value viewpoint, which can serve as a quality measurement for determining environmental impact. In this paper, the theory of exergetic cost is used to determine the aggregated exergy of three metal-rubber spares, based on an exergy analysis and a thermoeconomic analysis. The manufacturing of these spares follows a batch production technique, so the theory is applied to discontinuous flows using single models of workstations; subsequently, the complete exergy model of each product is built using flowcharts. These models represent the exergy flows between components in the machines according to electrical, mechanical, and/or thermal expressions; they determine the exergy demanded to produce the effective transformation of raw materials (the aggregated exergy value) and the exergy losses caused by equipment and irreversibilities. The energy resources of the manufacturing process are electricity and natural gas. The workstations considered are lathes, punching presses, cutters, a zinc machine, chemical treatment tanks, hydraulic vulcanizing presses, and a rubber mixer. The thermoeconomic analysis was done by workstation and by spare; the first describes the operation of the components of each machine and where the exergy losses occur, while the second estimates the exergy-aggregated value of the finished product and the wasted feedstock.
Results indicate that the exergy efficiency of a mechanical workstation is between 10% and 60%, while for the thermal workstations it is less than 5%; also, each effective exergy-aggregated value is about one-thirtieth of the total exergy required to operate the manufacturing process, which amounts to approximately 2 MJ. These shortfalls are caused mainly by technical limitations of the machines, oversizing of metal feedstock that demands more mechanical transformation work, and low thermal insulation of the chemical treatment tanks and hydraulic vulcanizing presses. From the information established in this case, it is possible to appreciate the usefulness of the theory of exergetic cost for analyzing aggregated value in manufacturing processes.
Keywords: exergy-aggregated value, exergy efficiency, thermoeconomics, exergy modeling
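The one-thirtieth figure above corresponds to a very low overall exergy efficiency; a minimal sketch of that arithmetic, assuming the usual definition of exergy efficiency as product (aggregated) exergy over the exergy supplied:

```python
def exergy_efficiency(exergy_product_mj: float, exergy_fuel_mj: float) -> float:
    """Exergy efficiency: product exergy divided by supplied (fuel) exergy."""
    return exergy_product_mj / exergy_fuel_mj

# The abstract states the effective exergy-aggregated value is roughly
# one-thirtieth of the ~2 MJ total exergy the process demands.
total_mj = 2.0
aggregated_mj = total_mj / 30
print(f"{exergy_efficiency(aggregated_mj, total_mj):.1%}")  # → 3.3%
```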
Procedia PDF Downloads 172
634 Identification of Accumulated Hydrocarbon Based on Heat Propagation Analysis in Order to Develop Mature Field: Case Study in South Sumatra Basin, Indonesia
Authors: Kukuh Suprayogi, Muhamad Natsir, Olif Kurniawan, Hot Parulian, Bayu Fitriana, Fery Mustofa
Abstract:
A new approach utilizing heat propagation analysis was carried out by studying and evaluating the effect of the presence of hydrocarbons on the flow of heat from the subsurface to the surface. Heat propagation is governed by the thermal conductivity of rocks. The thermal conductivity of a rock is a quantity that describes its ability to conduct heat; it depends on the constituent lithology, the porosity, and the pore-filling fluid. The higher the thermal conductivity of a rock, the more easily heat flows through it. By the same token, heat flows more easily through rock filled with water than through rock filled with hydrocarbons, since hydrocarbons act more as thermal insulators. The main objective of this research is to model the heat propagation calculation, in degrees Celsius, from the subsurface to the surface, and then compare it with the surface temperature measured directly at the location. To calculate heat propagation, we first determine the thermal conductivity of the rocks; the rocks at the calculation point are not homogeneous but consist of strata, so we determine the mineral constituents and the porosity of each stratum. For the pore-fluid parameter, we assume that all pores are filled with water. Once we obtain a thermal conductivity value for each rock unit, we model the heat propagation profile from the bottom of the well to the surface. The initial temperature comes from bottom-hole temperature (BHT) data obtained during drilling. The calculated temperature at each depth is displayed as a temperature-versus-depth profile describing heat propagation from the bottom of the well to the surface, under the assumption that the pore fluid is water.
In the technical implementation, we can identify the magnitude of the hydrocarbon effect in reducing the heat that reaches the surface by comparing the calculated heat propagation at a given point with the surface temperature measured at that point, assuming that the measured surface temperature originates from the asthenosphere. This publication shows that hydrocarbon accumulation can be identified by analyzing the heat propagation profile, which could serve as a method for identifying the presence of hydrocarbons.
Keywords: thermal conductivity, rock, pore fluid, heat propagation
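The layer-by-layer temperature calculation described above can be sketched as steady-state one-dimensional conduction under Fourier's law. The strata thicknesses, conductivities, and heat flux below are invented for illustration and are not the paper's field data:

```python
def temperature_profile(t_bottom_c, heat_flux_w_m2, layers):
    """Steady-state 1-D conduction through horizontal strata.

    layers: list of (thickness_m, conductivity_w_mk), ordered bottom to top.
    Returns temperatures (°C) at each layer interface, starting at the bottom.
    """
    temps = [t_bottom_c]
    for thickness, k in layers:
        # Fourier's law across one layer: dT = q * L / k
        temps.append(temps[-1] - heat_flux_w_m2 * thickness / k)
    return temps

# Hypothetical strata (thickness in m, conductivity in W/m·K), BHT 80 °C,
# and a heat flux of 65 mW/m² — placeholder values only.
strata = [(500.0, 2.5), (300.0, 1.8), (200.0, 1.2)]
profile = temperature_profile(80.0, 0.065, strata)
print([round(t, 1) for t in profile])  # → [80.0, 67.0, 56.2, 45.3]
```

A hydrocarbon-filled stratum would carry a lower conductivity `k`, producing a larger temperature drop across that layer and hence a cooler calculated surface temperature, which is the anomaly the method looks for.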
Procedia PDF Downloads 109
633 Neural Correlates of Attention Bias to Threat during the Emotional Stroop Task in Schizophrenia
Authors: Camellia Al-Ibrahim, Jenny Yiend, Sukhwinder S. Shergill
Abstract:
Background: Attention bias to threat plays a role in the development, maintenance, and exacerbation of delusional beliefs in schizophrenia, in which patients emphasize the threatening characteristics of stimuli and prioritise them for processing. Cognitive control deficits arise when task-irrelevant emotional information elicits attentional bias and obstructs optimal performance. This study investigates the neural correlates of the interference effect of linguistic threat and whether these effects are independent of delusional severity. Methods: Using event-related functional magnetic resonance imaging (fMRI), the neural correlates of the interference effect of linguistic threat during the emotional Stroop task were investigated, comparing patients with schizophrenia with high (N=17) and low (N=16) paranoid symptoms and healthy controls (N=20). Participants were instructed to identify the font colour of each word presented on the screen as quickly and accurately as possible. Stimulus types varied between threat-relevant, positive, and neutral words. Results: Group differences in whole-brain effects indicate decreased amygdala activity in patients with high paranoid symptoms compared with low-paranoid patients and healthy controls. Region-of-interest (ROI) analysis validated our results within the amygdala and investigated changes within the striatum, showing a pattern of reduced activation in the clinical group compared to healthy controls. Delusional severity was associated with significantly decreased neural activity in the striatum within the clinical group. Conclusion: Our findings suggest that emotional interference mediated by the amygdala and striatum may reduce responsiveness to threat-related stimuli in schizophrenia, and that attenuation of the fMRI blood-oxygen-level-dependent (BOLD) signal within these areas might be influenced by the severity of delusional symptoms.
Keywords: attention bias, fMRI, schizophrenia, Stroop
Procedia PDF Downloads 203
632 Applying the Underwriting Technique to Analyze and Mitigate the Credit Risks in Construction Project Management
Authors: Hai Chien Pham, Thi Phuong Anh Vo, Chansik Park
Abstract:
Risk management in construction projects is important to ensure the feasibility of the projects, in which financial risks are the main concern since construction projects always run on a credit basis. Credit risks, therefore, require unique technical tools to be well managed. The underwriting technique for credit risks, in its most basic sense, refers to the process of evaluating the risks and the potential exposure to losses. Risk analysis and underwriting are applied as a must in banks and financial institutions, which support construction projects when required. Recently, construction organizations, especially contractors, have recognized a significant increase in credit risks, which has negatively impacted project performance and the profit of construction firms. Despite the successful application of underwriting in banks and financial institutions for many years, few contractors apply this technique to analyze and mitigate the credit risks of their potential owners before signing contracts to deliver their services. Thus, contractors have taken on credit risks during project implementation that might materialize as losses through bankruptcy and/or protracted default by their owners. In this regard, this study proposes a model using the underwriting technique for contractors to analyze and assess the credit risks of owners before making final decisions on potential construction contracts. A contractor's underwriters can analyze and evaluate subjects such as the owner, country, sector, payment terms, financial figures, and the related concerns of credit limit requests in detail, based on reliable information sources, and then input them into the proposed model to obtain an Overall Assessment Score (OAS). The OAS serves as a benchmark for decision makers to grant the proper limits for the project.
The proposed underwriting model was validated on 30 subjects in the Asia-Pacific region over 5 years to obtain their OAS, and the output OAS was then compared with their actual performance in order to evaluate the potential of the underwriting model for analyzing and assessing credit risks. The results revealed that underwriting can be a powerful method to assist contractors in making precise decisions. The contribution of this research is to allow contractors, first, to develop their own credit risk management model for proactively preventing the credit risks of construction projects, and then to continuously improve and enhance the performance of this function during project implementation.
Keywords: underwriting technique, credit risk, risk management, construction project
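The OAS described above is, in essence, a weighted scoring of the underwriting subjects. A hypothetical sketch follows, in which the factor names, weights, and 0-100 scale are assumptions for illustration; the abstract does not publish the model's actual weighting:

```python
# Assumed factor weights (must sum to 1.0) — illustrative only.
WEIGHTS = {
    "owner": 0.30, "country": 0.15, "sector": 0.15,
    "payment_terms": 0.20, "financials": 0.20,
}

def overall_assessment_score(scores: dict) -> float:
    """Weighted sum of per-factor scores (each on a 0-100 scale)."""
    return sum(WEIGHTS[factor] * scores[factor] for factor in WEIGHTS)

sample_owner = {"owner": 70, "country": 85, "sector": 60,
                "payment_terms": 55, "financials": 65}
print(round(overall_assessment_score(sample_owner), 2))  # → 66.75
```

A decision maker would then compare the score against predefined thresholds (e.g., grant full, reduced, or no credit limit), which is the benchmarking role the abstract assigns to the OAS.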
Procedia PDF Downloads 210
631 High Performance Liquid Cooling Garment (LCG) Using ThermoCore
Authors: Venkat Kamavaram, Ravi Pare
Abstract:
Modern warfighters experience extreme environmental conditions in many of their operational and training activities. In temperatures exceeding 95°F, the body’s temperature regulation can no longer cool through convection and radiation. In this case, the only cooling mechanism is evaporation. However, evaporative cooling is often compromised by excessive humidity. Natural cooling mechanisms can be further compromised by clothing and protective gear, which trap hot air and moisture close to the body. Creating an efficient heat extraction apparel system that is also lightweight without hindering dexterity or mobility of personnel working in extreme temperatures is a difficult technical challenge and one that needs to be addressed to increase the probability for the future success of the US military. To address this challenge, Oceanit Laboratories, Inc. has developed and patented a Liquid Cooled Garment (LCG) more effective than any on the market today. Oceanit’s LCG is a form-fitting garment with a network of thermally conductive tubes that extracts body heat and can be worn under all authorized and chemical/biological protective clothing. Oceanit specifically designed and developed ThermoCore®, a thermally conductive polymer, for use in this apparel, optimizing the product for thermal conductivity, mechanical properties, manufacturability, and performance temperatures. Thermal Manikin tests were conducted in accordance with the ASTM test method, ASTM F2371, Standard Test Method for Measuring the Heat Removal Rate of Personal Cooling Systems Using a Sweating Heated Manikin, in an environmental chamber using a 20-zone sweating thermal manikin. Manikin test results have shown that Oceanit’s LCG provides significantly higher heat extraction under the same environmental conditions than the currently fielded Environmental Control Vest (ECV) while at the same time reducing the weight. 
Oceanit’s LCG vests performed nearly 30% better in extracting body heat while weighing 15% less than the ECV. There are NO cooling garments in the market that provide the same thermal extraction performance, form-factor, and reduced weight as Oceanit’s LCG. The two cooling garments that are commercially available and most commonly used are the Environmental Control Vest (ECV) and the Microclimate Cooling Garment (MCG).Keywords: thermally conductive composite, tubing, garment design, form fitting vest, thermocore
Procedia PDF Downloads 116
630 Thermo-Hydro-Mechanical-Chemical Coupling in Enhanced Geothermal Systems: Challenges and Opportunities
Authors: Esmael Makarian, Ayub Elyasi, Fatemeh Saberi, Olusegun Stanley Tomomewo
Abstract:
Geothermal reservoirs (GTRs) have garnered global recognition as a sustainable energy source. Thermo-Hydro-Mechanical-Chemical (THMC) coupling proves to be a practical and effective method for optimizing production in GTRs. The study outcomes demonstrate that THMC coupling serves as a versatile and valuable tool, offering in-depth insights into GTRs and enhancing their operational efficiency. This is achieved through the analysis of temperature and pressure changes and their impacts on mechanical properties, structural integrity, fracture aperture, permeability, and heat extraction efficiency. Moreover, THMC coupling facilitates the assessment of the potential benefits and risks associated with different geothermal technologies, considering the complex thermal, hydraulic, mechanical, and chemical interactions within the reservoirs. However, utilizing THMC coupling in GTRs presents a multitude of challenges. These include accurately modeling and predicting behavior given the interconnected nature of the processes, limited data availability leading to uncertainties, induced seismic risks to nearby communities, scaling and mineral deposition reducing operational efficiency, and the long-term sustainability of the reservoirs. In addition, material degradation, environmental impacts, technical challenges in monitoring and control, accurate assessment of resource potential, and regulatory and social acceptance further complicate geothermal projects. Addressing these multifaceted challenges is crucial for the successful and sustainable utilization of geothermal energy resources. This paper aims to illuminate the challenges and opportunities associated with THMC coupling in enhanced geothermal systems. Practical solutions and strategies for mitigating these challenges are discussed, emphasizing the need for interdisciplinary approaches, improved data collection and modeling techniques, and advanced monitoring and control systems.
Overcoming these challenges is imperative for unlocking the full potential of geothermal energy, making a substantial contribution to the global energy transition and sustainable development.
Keywords: geothermal reservoirs, THMC coupling, interdisciplinary approaches, challenges and opportunities, sustainable utilization
Procedia PDF Downloads 70
629 Modernization of Translation Studies Curriculum at Higher Education Level in Armenia
Authors: A. Vahanyan
Abstract:
The paper touches upon the problem of revising and modernizing the current curriculum on translation studies at Armenian Higher Education Institutions (HEIs). In the contemporary world, where the quality and speed of services are most valued, certain higher education centers in Armenia do not demonstrate enough flexibility in revising and amending the courses taught. This issue affects various curricula at the university level, and the Translation Studies curriculum in particular. Technological innovations of great help to translators have long been smoothly implemented in the global translation industry. According to the European Master's in Translation (EMT) framework, translation service provision comprises linguistic, intercultural, information mining, thematic, and technological competencies. Therefore, to form the competencies mentioned above, the curriculum should be seriously restructured to meet modern education and job market requirements, and relevant courses should be proposed. New courses, in particular, should focus on the formation of technological competences. These suggestions are based on the author's research of the problem across various HEIs in Armenia. The updated curricula should include courses aimed at familiarization with various computer-assisted translation (CAT) tools (MemoQ, Trados, OmegaT, Wordfast, etc.) in the translation process and with the creation of glossaries and termbases compatible with different platforms, which will ensure consistency in the translation of similar texts and speed up the translation process itself. Another aspect that may be strengthened via curriculum modification is the introduction of interdisciplinary and Project-Based Learning courses, which will enable information mining and thematic competences, which are of great importance as well.
Of course, amending the existing curriculum with the mentioned courses will require corresponding faculty development via training, workshops, and seminars. Finally, the provision of extensive internships with translation agencies is strongly recommended, as it will ensure the synthesis of theoretical background and the practical skills highly required for this specific area. Summing up, restructuring and modernizing the existing curricula on Translation Studies should focus on three major aspects: the introduction of new courses that meet global quality standards of education, professional development for faculty, and the integration of extensive internships supervised by experts in the field.
Keywords: competencies, curriculum, modernization, technical literacy, translation studies
Procedia PDF Downloads 131
628 Nano-Plasmonic Diagnostic Sensor Using Ultraflat Single-Crystalline Au Nanoplate and Cysteine-Tagged Protein G
Authors: Hwang Ahreum, Kang Taejoon, Kim Bongsoo
Abstract:
Nanosensors for highly sensitive detection of diseases have been widely studied to improve the quality of life. Here, we suggest a robust nano-plasmonic diagnostic sensor using cysteine-tagged protein G (Cys3-protein G) and ultraflat, ultraclean, single-crystalline Au nanoplates. Protein G formed on an ultraflat Au surface provides an ideal background for dense and uniform immobilization of antibodies. Gold is highly stable in diverse biochemical environments and can immobilize antibodies easily through Au-S bonding, having been widely used for various biosensing applications. In particular, atomically smooth single-crystalline Au nanomaterials synthesized using the chemical vapor transport (CVT) method are very suitable for fabricating reproducible, sensitive sensors. As C-reactive protein (CRP) is a nonspecific biomarker of inflammation and infection, it can be used as a predictive or prognostic marker for various cardiovascular diseases. Cys3-protein G immobilized uniformly on the Au nanoplate allows the CRP antibody (anti-CRP) to be ordered in the correct orientation, maximizing its binding capacity for CRP detection. The immobilization conditions for Cys3-protein G and anti-CRP on the Au nanoplate were optimized visually by AFM analysis. A Au nanoparticle - Au nanoplate (NP-on-Au nanoplate) assembly fabricated from a sandwich immunoassay for CRP can greatly reduce the zero-signal caused by nonspecific binding, providing a distinct surface-enhanced Raman scattering (SERS) enhancement even at a CRP concentration of 10⁻¹⁸ M. Moreover, the NP-on-Au nanoplate sensor shows excellent selectivity against non-target proteins at high concentration. In addition, by comparison with control experiments employing an Au film fabricated by e-beam-assisted deposition and a linker molecule, we clearly validate the contribution of the Au nanoplate to the attomolar detection of CRP.
We expect that the devised platform employing the complex of single-crystalline Au nanoplates and Cys3-protein G can be applied to the detection of many other cancer biomarkers.
Keywords: Au nanoplate, biomarker, diagnostic sensor, protein G, SERS
Procedia PDF Downloads 259
627 Advanced Compound Coating for Delaying Corrosion of Fast-Dissolving Alloy in High Temperature and Corrosive Environment
Authors: Lei Zhao, Yi Song, Tim Dunne, Jiaxiang (Jason) Ren, Wenhan Yue, Lei Yang, Li Wen, Yu Liu
Abstract:
Fast-dissolving magnesium (DM) alloy technology has contributed significantly to the "Shale Revolution" in the oil and gas industry. This application requires DM downhole tools to dissolve initially at a slow rate and then accelerate rapidly to a high rate after a certain period of operation (typically 8 h to 2 days), a contradictory requirement that can hardly be addressed by traditional Mg alloying or processing alone. Premature disintegration of downhole DM tools has been broadly reported from field trials. To address this issue, "temporary" thin polymer coatings of various formulations are currently applied to the DM surface to delay its initial dissolution. Because of conveying parts, the harsh downhole conditions, and the high dissolution rate of the base material, the current delay coatings relying on pure polymers are found to perform well only at low temperature (typically < 100 ℃) and on parts without sharp edges or corners, as severe geometries prevent high-quality thin-film coatings from forming effectively. In this study, a coating technology combining plasma electrolytic oxide (PEO) coatings with advanced thin-film deposition has been developed, which can delay the dissolution of complex DM parts (with sharp corners) in corrosive fluid at 150 ℃ for over 2 days. Synergistic effects between the porous hard PEO coating and the chemically inert elastic-polymer sealing lead to the improved delay of dissolution, and strong chemical/physical bonding between these two layers is found to play an essential role. The microstructure of this advanced coating and the compatibility between PEO and various polymer selections have been thoroughly investigated, and a model is also proposed to explain the delaying performance.
This study could not only benefit the oil and gas industry in unplugging High Temperature High Pressure (HTHP) unconventional resources inaccessible before, but also potentially provides a technical route for other industries (e.g., biomedical, automobile, aerospace) where primer anti-corrosive protection on light Mg alloys is in high demand.
Keywords: dissolvable magnesium, coating, plasma electrolytic oxide, sealer
Procedia PDF Downloads 112
626 Trial Version of a Systematic Material Selection Tool in Building Element Design
Authors: Mine Koyaz, M. Cem Altun
Abstract:
Selecting materials that satisfy the expected performances is significantly important for any design. Today, with constantly evolving and developing technologies, the material options are so wide that support tools are becoming necessary in the selection process. Therefore, as a sub-process of building element design, a systematic material selection tool was developed that defines four main steps of material selection: definition, research, comparison, and decision. The main purpose of the tool is to be an educational instrument that shows a methodical way of selecting materials in architectural detailing, for the use of architecture students. The tool predefines the possible uses of various material databases and other sources of information on material properties. Hence, it is to be used as guidance for designers, especially those with limited material knowledge and experience. The material selection tool embraces not only the technical properties of materials related to building elements' functional requirements, but also their sensual properties related to the identity of the design and their environmental impacts with respect to the sustainability of the design. The method followed in developing the tool has two main sections: first, the examination and application of existing methods, and second, the development of trial versions and their applications. Within the scope of the existing methods, design support tools, methodical approaches to building element design and the material selection process, material properties, material databases, and methodical approaches to decision making were examined. The existing methods were applied by architecture students and newly graduated architects to different design problems. Based on the results of these applications, the strong and weak sides of the existing material selection tools are presented.
A main flowchart of the material selection tool was developed with the objective of applying the strong aspects of the existing methods while improving on their weak sides. Through different stages, different aspects of the material selection process were investigated, and the tool took its final form. The systematic material selection tool, within the building element design process, guides users with minimal background information to practically and accurately determine the ideal material for their design, satisfying its needs. The tool has a flexible structure that answers the different needs of different designs and designers. The trial version presented in this paper shows one of the paths that could be followed and illustrates its application to a design problem.
Keywords: architectural education, building element design, material selection tool, systematic approach
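The comparison and decision steps defined above resemble a weighted decision matrix over candidate materials. A hypothetical sketch follows, in which the criteria, weights, candidate materials, and scores are all invented for illustration; the tool itself is a methodical guide, not code:

```python
# Assumed criteria groups mirroring the tool's scope (technical, sensual,
# environmental properties) with illustrative weights summing to 1.0.
CRITERIA = {"technical": 0.5, "sensual": 0.25, "environmental": 0.25}

def rank_materials(candidates):
    """candidates: {material: {criterion: score 0-10}} -> list of
    (material, weighted total), best first."""
    totals = {
        name: sum(CRITERIA[c] * s for c, s in scores.items())
        for name, scores in candidates.items()
    }
    return sorted(totals.items(), key=lambda item: item[1], reverse=True)

options = {
    "timber cladding": {"technical": 6, "sensual": 9, "environmental": 8},
    "fiber cement":    {"technical": 8, "sensual": 5, "environmental": 6},
}
print(rank_materials(options))  # → [('timber cladding', 7.25), ('fiber cement', 6.75)]
```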
Procedia PDF Downloads 353
625 An Exploratory Study to Appraise the Current Challenges and Limitations Faced in Applying and Integrating the Historic Building Information Modelling Concept for the Management of Historic Buildings
Authors: Oluwatosin Adewale
Abstract:
The sustainability of built heritage has become a relevant issue in recent years due to the social and economic values associated with these buildings. Heritage buildings provide a means for the human perception of culture and represent a legacy of long-existing history; they define the local character of the social world and provide a vital connection to the past, with associated aesthetic and communal benefits. The identified values of heritage buildings have increased the importance of the conservation and lifecycle management of these buildings. Recent developments in digital design technology in engineering and the built environment have led to the adoption of Building Information Modelling (BIM) by the Architecture, Engineering, Construction, and Operations (AECO) industry. BIM provides a platform for the lifecycle management of a construction project through effective collaboration among stakeholders and the analysis of a digital information model. This growth in digital design technology has also made its way into the field of architectural heritage management in the form of Historic Building Information Modelling (HBIM): a reverse engineering process for the digital documentation of heritage assets that draws upon information management processes similar to those of BIM. However, despite the several scientific and technical contributions made to the development of the HBIM process, it remains difficult to integrate at the most practical level of heritage asset management. The main objective identified under the scope of the study is to review the limitations and challenges faced by heritage management professionals in adopting an HBIM-based asset management procedure for historic building projects. This paper uses an exploratory study in the form of semi-structured interviews to investigate the research problem.
A purposive sample of heritage industry experts and professionals was selected to take part in semi-structured interviews to appraise some of the limitations and challenges they have faced with the integration of HBIM into their project workflows. The findings from this study will present the challenges and limitations faced in applying and integrating the HBIM concept for the management of historic buildings.

Keywords: building information modelling, built heritage, heritage asset management, historic building information modelling, lifecycle management
Procedia PDF Downloads 105
624 Dynamic Web-Based 2D Medical Image Visualization and Processing Software
Authors: Abdelhalim N. Mohammed, Mohammed Y. Esmail
Abstract:
For decades, medical imaging was dominated by the use of costly film media for the review and archiving of medical investigations; however, owing to developments in network technologies and the widespread acceptance of the Digital Imaging and Communications in Medicine (DICOM) standard, another approach based on the World Wide Web has been produced. Web technologies have been successfully used in telemedicine applications, and the combination of web technologies with DICOM is used here to design a web-based, open-source DICOM viewer. The web server allows querying and retrieval of images, and the images are viewed and manipulated inside a web browser without the need to pre-install any software. The dynamic web page for medical image visualization and processing was created using JavaScript and HTML5. The XAMPP Apache server is used to create a local web server for testing and deployment of the dynamic site. The web-based viewer is connected to multiple devices through a local area network (LAN) to distribute the images within healthcare facilities. The system offers several advantages over conventional picture archiving and communication systems (PACS): it is easy to install and maintain, platform-independent, and allows images to be displayed and manipulated efficiently; it is also user-friendly and easy to integrate with existing systems that already make use of web technologies. A wavelet-based image compression technique is applied, in which the 2-D discrete wavelet transform is used to decompose the image; the wavelet coefficients are thresholded and then transmitted with entropy encoding to decrease transmission time, storage cost, and capacity. Compression performance was evaluated using image quality metrics such as mean square error (MSE), peak signal-to-noise ratio (PSNR), and compression ratio (CR), which reached 83.86% when the 'coif3' wavelet filter is used.

Keywords: DICOM, discrete wavelet transform, PACS, HIS, LAN
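The compression pipeline described above (2-D DWT decomposition, coefficient thresholding, then quality metrics) can be sketched in miniature. The study uses the 'coif3' filter; for a self-contained illustration, the sketch below substitutes a single-level 2-D Haar transform on a hypothetical 4×4 pixel block, so it is a simplification, not the authors' implementation:

```python
import math

def haar_step(v):
    # One level of the orthonormal 1-D Haar transform: averages then details.
    h = len(v) // 2
    avg = [(v[2 * i] + v[2 * i + 1]) / math.sqrt(2) for i in range(h)]
    dif = [(v[2 * i] - v[2 * i + 1]) / math.sqrt(2) for i in range(h)]
    return avg + dif

def haar_step_inv(v):
    # Inverse of haar_step: interleave reconstructed pairs.
    h = len(v) // 2
    out = []
    for a, d in zip(v[:h], v[h:]):
        out += [(a + d) / math.sqrt(2), (a - d) / math.sqrt(2)]
    return out

def dwt2(img):
    # 2-D transform: filter rows, then columns.
    rows = [haar_step(r) for r in img]
    cols = [haar_step(list(c)) for c in zip(*rows)]
    return [list(r) for r in zip(*cols)]

def idwt2(coef):
    # Invert in the opposite order: columns first, then rows.
    cols = [haar_step_inv(list(c)) for c in zip(*coef)]
    rows = [list(r) for r in zip(*cols)]
    return [haar_step_inv(r) for r in rows]

def compress(img, thresh):
    # Zero out detail coefficients below the threshold, then reconstruct.
    coef = dwt2(img)
    kept = [[c if abs(c) >= thresh else 0.0 for c in row] for row in coef]
    nonzero = sum(1 for row in kept for c in row if c != 0.0)
    total = sum(len(row) for row in kept)
    cr = 100.0 * (1 - nonzero / total)  # percentage of coefficients discarded
    return idwt2(kept), cr

def psnr(orig, recon):
    # Peak signal-to-noise ratio for 8-bit images, from the MSE.
    n = sum(len(r) for r in orig)
    mse = sum((a - b) ** 2
              for ro, rr in zip(orig, recon)
              for a, b in zip(ro, rr)) / n
    return float('inf') if mse == 0 else 10 * math.log10(255.0 ** 2 / mse)

image = [[52, 55, 61, 66],
         [70, 61, 64, 73],
         [63, 59, 55, 90],
         [67, 61, 68, 104]]
recon, cr = compress(image, thresh=10.0)
print(f"CR={cr:.1f}%  PSNR={psnr(image, recon):.1f} dB")
```

Thresholding zeroes small detail coefficients, which is what a later entropy coder exploits; the CR here simply counts discarded coefficients, whereas the paper's 83.86% figure comes from its full coif3-plus-entropy-coding pipeline.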
Procedia PDF Downloads 163
623 Application of the Building Information Modeling Planning Approach to the Factory Planning
Authors: Peggy Näser
Abstract:
Factory planning is a systematic, objective-oriented process for planning a factory, structured into a sequence of phases, each of which is dependent on the preceding phase and makes use of particular methods and tools, and extending from the setting of objectives to the start of production. The digital factory, on the other hand, is the generic term for a comprehensive network of digital models, methods, and tools – including simulation and 3D visualisation – integrated by a continuous data management system. Its aim is the holistic planning, evaluation and ongoing improvement of all the main structures, processes and resources of the real factory in conjunction with the product. Digital factory planning has already become established in factory planning. The application of Building Information Modeling, by contrast, has not yet been established in factory planning but has been used predominantly in the planning of public buildings. Furthermore, this concept is limited to the planning of the buildings and does not include the planning of the factory's equipment (machines, technical equipment) and its interfaces to the building. BIM is a cooperative method of working in which the information and data relevant to a building's lifecycle are consistently recorded, managed, and exchanged in transparent communication between the involved parties on the basis of digital models of the building. Both approaches, the planning approach of Building Information Modeling and the methodical approach of the Digital Factory, are based on the use of a comprehensive data model. Therefore, it is necessary to examine how the approach of Building Information Modeling can be extended in the context of factory planning in such a way that the equipment planning as well as the building planning can be integrated in a common digital model.
For this, a number of different perspectives have to be investigated: the equipment perspective, including the tools used to implement a comprehensive digital planning process; the communication perspective between the planners of different fields; the legal perspective, concerning legal certainty in each country; and the quality perspective, which defines the quality criteria against which the planning will be evaluated. The individual perspectives are examined and illustrated in the article. An approach model for the integration of factory planning into the BIM approach is developed, in particular for the integrated planning of equipment and buildings and for continuous digital planning. For this purpose, the individual factory planning phases are detailed in the sense of the integration of the BIM approach. A comprehensive software concept is presented on the tool side. In addition, the prerequisites required for this integrated planning are presented. With the help of the newly developed approach, better coordination between equipment and buildings is to be achieved, the continuity of digital factory planning and the data quality are improved, and expensive implementation errors are avoided.

Keywords: building information modeling, digital factory, digital planning, factory planning
Procedia PDF Downloads 270
622 Analysis of Constraints and Opportunities in Dairy Production in Botswana
Authors: Som Pal Baliyan
Abstract:
Dairy enterprise has been a major source of employment and income generation in most economies worldwide. The Botswana government has also identified dairy as one of the agricultural sectors for diversification of the country's mineral-dependent economy. The huge gap between local demand and supply of milk and milk products indicated that not only constraints but also opportunities exist in this sub-sector of agriculture. Therefore, this study was an attempt to identify constraints and opportunities in the dairy production industry in Botswana. Possible ways to mitigate the constraints were also identified. The findings should assist stakeholders, especially policy makers, in the formulation of effective policies for the growth of the dairy sector in the country. This quantitative study adopted a survey research design. A pilot survey followed by a final survey was conducted for data collection. The purpose of the pilot survey was to collect basic information on the nature and extent of the constraints, the opportunities, and the ways to mitigate the constraints in dairy production. Based on the information from the pilot survey, a four-point Likert-scale questionnaire was constructed, validated, and tested for its reliability. The data for the final survey were collected from twenty-five purposively selected dairy farms. Descriptive statistical tools were employed to analyze the data. Among the twelve constraints identified, high feed costs, feed shortage and availability, lack of technical support, lack of skilled manpower, high prevalence of pests and diseases, and lack of dairy-related technologies were the six major constraints in dairy production. Grain feed production, roughage feed production, manufacturing of dairy feed, establishment of a milk processing industry, and development of transportation systems were the five major opportunities among the eight identified.
Increasing local production of animal feed, increasing local roughage feed production, provision of subsidies on animal feed, easy access to sufficient financial support, training of farmers, and effective control of pests and diseases were identified as the six major ways to mitigate the constraints. It was recommended that the identified constraints and opportunities, as well as the ways to mitigate them, be carefully considered by stakeholders, especially policy makers, during the formulation and implementation of policies for the development of the dairy sector in Botswana.

Keywords: dairy enterprise, milk production, opportunities, production constraints
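The descriptive analysis behind this kind of ranking, mean severity scores on four-point Likert items, can be sketched as follows. The constraint names echo the abstract, but the response scores below are invented for illustration and are not the study's data:

```python
from statistics import mean, stdev

# Hypothetical four-point Likert responses (1 = not a constraint, 4 = severe)
# from a handful of respondents; the actual study surveyed 25 dairy farms.
responses = {
    "High feed costs":           [4, 4, 3, 4, 4],
    "Feed shortage":             [4, 3, 4, 3, 4],
    "Lack of technical support": [3, 4, 3, 3, 4],
    "Lack of skilled manpower":  [3, 3, 3, 4, 3],
    "Pests and diseases":        [3, 3, 4, 3, 3],
    "Transport problems":        [2, 2, 3, 2, 2],
}

# Rank constraints by mean severity, as in a descriptive-statistics analysis.
ranked = sorted(responses.items(), key=lambda kv: mean(kv[1]), reverse=True)
for name, scores in ranked:
    print(f"{name}: mean={mean(scores):.2f} sd={stdev(scores):.2f}")
```

A cut-off on the mean (e.g. above the scale midpoint of 2.5) is one common way such studies separate "major" constraints from minor ones.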
Procedia PDF Downloads 409
621 Chinese Students’ Use of Corpus Tools in an English for Academic Purposes Writing Course: Influence on Learning Behaviour, Performance Outcomes and Perceptions
Authors: Jingwen Ou
Abstract:
Writing for academic purposes in a second or foreign language poses a significant challenge for non-native speakers, particularly at the tertiary level, where English academic writing for L2 students is often hindered by difficulties in academic discourse, including vocabulary, academic register, and organization. The past two decades have witnessed a rising popularity of the data-driven learning (DDL) approach in EAP writing instruction. In light of such a trend, this study aims to enhance the integration of DDL into English for academic purposes (EAP) writing classrooms by investigating the perceptions of Chinese college students regarding the use of corpus tools for improving EAP writing. Additionally, the research explores their corpus consultation behaviours during training to provide insights into corpus-assisted EAP instruction for DDL practitioners. Given the rising popularity of DDL, this research investigates Chinese university students’ use of corpus tools with three main foci: 1) the influence of corpus tools on learning behaviours, 2) the influence of corpus tools on students’ academic writing performance outcomes, and 3) students’ perceptions of, and potential changes in perception towards, the use of such tools. Three corpus tools, CQPWeb, Sketch Engine, and LancsBox X, are selected for investigation due to the scarcity of empirical research on patterns of learners’ engagement with a combination of multiple corpora. The research adopts a pre-test/post-test design for the evaluation of students’ academic writing performance before and after the intervention. Twenty participants will be divided into two groups: an intervention and a non-intervention group. Three corpus training workshops will be delivered at the beginning, middle, and end of a semester.
An online survey and three separate focus group interviews are designed to investigate students’ perceptions of the use of corpus tools for improving academic writing skills, particularly the rhetorical functions in different essay sections. Insights from students’ consultation sessions indicated difficulties with DDL practice, including insufficient time to complete all tasks, struggles with technical set-up, unfamiliarity with the DDL approach, and difficulty with some advanced corpus functions. Findings from the main study aim to provide pedagogical insights and training resources for EAP practitioners and learners.

Keywords: corpus linguistics, data-driven learning, English for academic purposes, tertiary education in China
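The pre-test/post-test, intervention-versus-control comparison described above could be analyzed along these lines. All scores below are hypothetical placeholders, since the abstract reports no data, and Welch's t on gain scores is just one plausible choice of statistic:

```python
import math
from statistics import mean, stdev

# Hypothetical pre/post essay scores (out of 100) for two groups of ten;
# the study's actual data are not reported in the abstract.
intervention_pre  = [62, 58, 65, 60, 55, 63, 59, 61, 57, 64]
intervention_post = [70, 66, 71, 68, 63, 72, 65, 69, 64, 73]
control_pre       = [61, 59, 63, 58, 56, 62, 60, 64, 57, 60]
control_post      = [63, 60, 64, 60, 58, 63, 61, 65, 59, 62]

def gains(pre, post):
    # Per-participant improvement from pre-test to post-test.
    return [b - a for a, b in zip(pre, post)]

def welch_t(x, y):
    # Welch's t statistic for two independent samples (unequal variances).
    vx, vy = stdev(x) ** 2 / len(x), stdev(y) ** 2 / len(y)
    return (mean(x) - mean(y)) / math.sqrt(vx + vy)

gi = gains(intervention_pre, intervention_post)
gc = gains(control_pre, control_post)
print(f"mean gain (corpus group):  {mean(gi):.1f}")
print(f"mean gain (control group): {mean(gc):.1f}")
print(f"Welch t on gain scores:    {welch_t(gi, gc):.2f}")
```

With only ten participants per group, such a comparison would normally be reported alongside the qualitative survey and focus-group findings rather than relied on alone.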
Procedia PDF Downloads 64