What advancements is Linux developing?


RyanLM

Member
May 15, 2003
43
0
0
So Windows.Forms on Windows implements all of the widgets from scratch instead of using what MS already had done? So you'd rather the Mono people do the same and ditch GTK and come up with a new widget set and use straight xlib calls to avoid being a kludge?

The kludge part comes in when there is a constantly changing direction and no clear path to travel in future releases. Also, because MS controls all of these items - the .Net Framework, GDI+, and Avalon - they will be developed and extended together as a single connected project, not as totally separate items that were never meant to work together.

So Windows.Forms on Windows implements all of the widgets from scratch instead of using what MS already had done? So you'd rather the Mono people do the same and ditch GTK and come up with a new widget set and use straight xlib calls to avoid being a kludge?

I really don't have a recommendation for what the Mono team should do; I don't know what they can do that would mitigate the issues described above. Which is part of my original problem with the poster's comment about Linux being "the" .Net platform. It might be a decent implementation of the framework, if a rather slow one, but as far as UI goes it is anyone's guess.

Not magical, it wouldn't be hard to have the class definitions in a file that VS.NET loads or even just parse the header files to get the attributes that pop up in the code editor. The changes to existing classes should be small since they want to maintain compatibility and new ones should be easy to pick up via things like control registration. But they probably don't want to make VS any bigger and slower than it is already. IIRC the only difference between VS 2002 and 2003 was support for .NET 1.2 and the cheap upgrade period only lasted like a month.

Visual Studio does far more than just that. You sound like someone who has never used it (especially since you consider it "slow"; it opens as fast as WordPad on my box, for crying out loud). It is an integrated development environment, not just a compiler. In fact, as a compiler it can do what you said, but that isn't where its value comes in. It ties together your database, web server, and other COM+ applications so they all work and develop as one. It has designers that give you the best WYSIWYG editing of just about any type of object, from web pages to buttons to databases.

The main jump between the last two versions was .Net 1.0 to 1.1. The IDE can now target a specific runtime if needed, added support for new 1.1 features and automation, and was actually made even faster, believe it or not. It also contained other major advancements: the ability to target smart devices such as Pocket PCs and phones, and to emulate and debug them in real time without even needing a device, better integration with Server 2003, and fun things like EIF.

And again, it isn't exactly "expensive": http://www.microsoft.com/products/info/product.aspx?view=22&pcid=a51c55a7-bae8-4d91-a837-b2fc472ff65c&type=ovr

What 2005 brings is also amazing - have a look for yourself.
http://lab.msdn.microsoft.com/vs2005/

VS.NET isn't bad but it's big, slow and the MSDN search tool sucks ass

It is big, but it is NOT slow in the least - and the MSDN documentation is incredible. Do you know that as you are coding, you can double-click any item in your code (be it a class, object, enum, or some bizarre hex) and it will give you live, ACCURATE help (methods, props, how-tos) on that item? You can also extend that documentation AS YOU TYPE with specific comments. I have seen nothing better.
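
For anyone who hasn't seen them, the "specific comments" are XML doc comments; here's a rough sketch (the Invoice class and GetTotal method are just made up for the example) of how you write them above a member and the IDE picks them up for IntelliSense and dynamic help:

using System;

// Hypothetical example class; the /// doc-comment syntax itself is the point here.
public class Invoice
{
    /// <summary>
    /// Returns the invoice total including tax.
    /// </summary>
    /// <param name="subtotal">The pre-tax amount.</param>
    /// <param name="taxRate">The tax rate as a fraction, e.g. 0.06 for 6%.</param>
    /// <returns>The total with tax applied.</returns>
    public decimal GetTotal(decimal subtotal, decimal taxRate)
    {
        return subtotal * (1 + taxRate);
    }
}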
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
The kludge part comes in when there is a constantly changing direction and no clear path to travel in future releases. Also, because MS controls all of these items - the .Net Framework, GDI+, and Avalon - they will be developed and extended together as a single connected project, not as totally separate items that were never meant to work together.

So it's a kludge because it's not controlled by MS after all, go figure. I thought one of the big things about .NET was the fact that it was a published standard; lucky for them they left out a lot of the important things like Windows.Forms, so other people have to kludge them together if they want any hope of running anything besides HelloWorld.exe. And I don't know if I would call one change constant, but I guess if you need to spin it a certain way...

You sound like someone who has never used it (especially since you consider it "slow"; it opens as fast as WordPad on my box, for crying out loud). It is an integrated development environment, not just a compiler.

I know what it is; I have used it, although not much since I'm not a professional developer. And while I hate to admit it, VS is one of the good products to come out of Redmond. Sadly, this version is noticeably slower than the older ones. And while I did run it inside VMware, I run a lot of things in that VMware session and VS.NET 2001 and 2002 were both the slowest without a doubt.

And again, it isn't exactly "expensive"

Ummm, Visual C#.NET isn't exactly Visual Studio. And the full Visual Studio is quite a bit more expensive: http://www.microsoft.com/PRODUCTS/info/product.aspx?view=22&pcid=9fdcc2af-6b86-4ee8-9b71-90cebe8626e6&type=ovr

It is big, but it is NOT slow in the least - and the MSDN documentation is incredible

It is big, it is slow, and while the docs are complete they're incredibly hard to navigate. I have less trouble finding documentation in man pages; you can't even put a function name into the search box and get accurate results because of the stupid search engine. Sure, you can type it into the code editor and hit F1, but that's a stupid workaround.
 

drag

Elite Member
Jul 4, 2002
8,708
0
0
Well, this is my take on it.

It's my understanding that Windows.Forms/.Net is something completely different from Avalon. Avalon is a new API, like Win32 vs .NET.

With .NET and Mono, eventually there will be no such thing as programming a native Windows program with .NET, just like there is no such thing as programming a Linux-specific Python program, or supposedly Java - as long as you're smart about it. (For example, in Python I could use hard-coded path names like /usr/local/bin for different things, OR I can use the os module (and others) and use system variables that would make my scripts work equally well on Linux as on Windows or OS X or any platform that Python runs natively on.)
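
The same idea carries over to the .NET side. Here's a rough C# sketch (nothing Mono-specific, just the standard Path and Environment classes; the "myapp.conf" name is made up) of asking the runtime for locations instead of hard-coding them:

using System;
using System.IO;

class PortablePaths
{
    static void Main()
    {
        // Resolved per platform by the runtime (and by Mono on Linux/OS X).
        string home = Environment.GetFolderPath(Environment.SpecialFolder.Personal);

        // Path.Combine inserts the right directory separator for the host OS.
        string settings = Path.Combine(home, "myapp.conf");

        Console.WriteLine("Settings file: " + settings);
        Console.WriteLine("Running on:    " + Environment.OSVersion);
    }
}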

So if apps run on Linux just as well (or at least nearly as well) as they do on Windows, then that will remove a major hurdle to the widespread adoption of Linux.

If Linux became 10-20% of the market, then if you want your apps to reach that PLUS the 80% or so that run Windows, the only real choice in development for many people will default to .NET rather than the Windows-only setups you'd get by simply sticking to the Avalon API. Hopefully this sort of thing will negate the desire of MS to isolate programmers and developers on only one platform. Apps programmed using the Avalon API will never have a chance of working properly on Linux. Wine is great - it's not perfect, but most apps work just fine - but there are not only technical reasons why Avalon will probably never have a Wine-type API on Linux, there are legal ones such as patents and the DMCA.

Combined with the other positive attributes of the Linux platform and free software in general (Mono is one part of many things people are working on), this may create a situation where you get a chance to finally break the Windows monopoly on the desktop and give people a real choice to run whatever OS they'd like.
 

RyanLM

Member
May 15, 2003
43
0
0
So it's a kludge because it's not controlled by MS after all, go figure.

In a way, yes. All of the assemblies and classes will continue to have a consistent structure and a very clear roadmap for the future - things you can build a long-term application on. The last thing we need is 30 flavors of a standard API.

I thought one of the big things about .NET was the fact that it was a published standard; lucky for them they left out a lot of the important things like Windows.Forms, so other people have to kludge them together if they want any hope of running anything besides HelloWorld.exe.

It is one of the selling points of .Net that the runtime can be ported; however, I don't think MS had much choice in leaving out the WinForms part of the framework. Have you seen Java's cross-platform UI? Terrible, ugly, and horribly slow - and considering the amount of extra time Sun has had to work on it, it still sucks. To me, the slowness is why they just leveraged GDI+ for .Net's UI on Windows.

And I don't know if I would call one change constant, but I guess if you need to spin it a certain way...

It is not spin, it is a fact - and it is also something you can't build any long-term enterprise application on. You need to know that things will be supported and will not change on a whim.

Sadly, this version is noticeably slower than the older ones. And while I did run it inside VMware, I run a lot of things in that VMware session and VS.NET 2001 and 2002 were both the slowest without a doubt.

I know that one change as far as speed (opening the app) came in 2003. They used to call IE to draw the HTML interface VS opened with (for managing projects and such), which caused some extra load time (about as much as it takes to load IE in Windows, so a second or two). Now they do the rendering without IE, so that second or two is saved and you're looking at about 3 seconds to load VS 2003 - not slow at all.

Ummm, Visual C#.NET isn't exactly Visual Studio. And the full Visual Studio is quite a bit more expensive:

It IS Visual Studio, with the limitation that it compiles C# only. The standard edition of VS also doesn't include some enterprise tools (like the enterprise stress tester) and Visio. However, I wouldn't call it a "stripped-down product"; you can code full applications with the vast majority of the functionality of VS.NET.

I can't think of a project I have done in the last year (other than a few J# projects) that would have required something other than a single-language standard edition.

It is big, it is slow

It is big, but the vast majority of the install is the docs, SDKs, and samples. The IDE itself is only 24 MB, hardly a large program for what it does. And can you give me some examples of why you think it is slow? It loads in about 3 seconds and there is NO lag at all when using the program, except in compiling, which on, say, a 50,000-line project takes about what, 5-8 seconds? You find this slow?

while the docs are complete they're incredibly hard to navigate.

Why? You can search, you can use the tree view, or you can let VS.NET find it for you - how in the WORLD is this "hard to navigate"? You do realize that the local PC version has a much easier search interface than the one on msdn.microsoft.com? I guess I don't see how else they were supposed to do it. The tree view of online articles, references, and books is very logically laid out, and it's fully searchable.

I have less trouble finding documentation in man pages; you can't even put a function name into the search box and get accurate results because of the stupid search engine.

Umm, I do it every day. If you don't know how to search it, you will get back hundreds of results (just like in Google); it is about 2 GB of data. If you search for a function name you are going to get back everywhere it was used, just as in Google. The docs cover help for what, 10 different languages, and many of them overlap function names. If you simply add a filter term such as C#, you get back exactly what you're looking for. They also weight reference material higher than docs, so you should get your main items right on top.

Sure, you can type it into the code editor and hit F1, but that's a stupid workaround.

That is not how it works at all. Basically, as you type, it updates a Help window (generally docked with the Properties window) based on what you have typed. For instance, if I type "if(" it shows me help on "Using if statements" along with help on what operators I can use (given my current language). Alternatively, you can double-click any class or part of the framework, say "System.Drawing.Drawing2D.LinearGradientBrush" (just click LinearGradientBrush), and you will get links on the right directly to the class reference along with any number of examples of how to use it. I guess my point is, you don't need to do anything to get help; it is displayed for you. This feature is also wickedly fast - I am really curious how they do it.
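
And to give an idea of what you land on when you click that class, here's a minimal sketch of using LinearGradientBrush yourself (the form, colors, and sizes are just an example):

using System;
using System.Drawing;
using System.Drawing.Drawing2D;
using System.Windows.Forms;

class GradientForm : Form
{
    protected override void OnPaint(PaintEventArgs e)
    {
        // Fill the client area with a vertical blue-to-white gradient.
        using (LinearGradientBrush brush = new LinearGradientBrush(
            this.ClientRectangle, Color.SteelBlue, Color.White, LinearGradientMode.Vertical))
        {
            e.Graphics.FillRectangle(brush, this.ClientRectangle);
        }
        base.OnPaint(e);
    }

    [STAThread]
    static void Main()
    {
        Application.Run(new GradientForm());
    }
}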
 

RyanLM

Member
May 15, 2003
43
0
0
It's my understanding that Windows.Forms/.Net is something completely different from Avalon. Avalon is a new API, like Win32 vs .NET.

Avalon is exposed via .Net - I am not sure if it is written in .Net, but the only way you can access it is via .Net (System.blaa.class).

It is separate from WinForms; however, you can use both, because they do different things.

With .NET and Mono, eventually there will be no such thing as programming a native Windows program with .NET, just like there is no such thing as programming a Linux-specific Python program, or supposedly Java - as long as you're smart about it. (For example, in Python I could use hard-coded path names like /usr/local/bin for different things, OR I can use the os module (and others) and use system variables that would make my scripts work equally well on Linux as on Windows or OS X or any platform that Python runs natively on.)

That is a good goal to shoot for, and it's already possible as long as you don't go near the GUI. However, there are a lot of times when you don't just want a program to "think"; in fact, if you're not making a website, most of the time your program needs to interface with something else. Most recently, I coded an app that made use of a scanning system, and another that used a video capture device. Both were very native to Windows, but they only took days to complete because of the infrastructure already there for dealing with these devices.

I agree that eventually there may be no such thing as directly programming for Windows. On the server side we are pretty much there, but for end-user applications? We are well over 10 years away from that point if you want to get beyond the basic apps.

So if apps run on Linux just as well (or at least nearly as well) as they do on Windows, then that will remove a major hurdle to the widespread adoption of Linux.

I agree, but this isn't going to happen any time soon.

If Linux became 10-20% of the market, then if you want your apps to reach that PLUS the 80% or so that run Windows, the only real choice in development for many people will default to .NET rather than the Windows-only setups you'd get by simply sticking to the Avalon API.

It is a chicken-and-egg scenario. Yes, if Linux commanded a 20% market share, you would see some people releasing apps for the platform; however, I doubt they would use an intermediate language like .Net or Java to do so, because .Net or Java still doesn't fix the major issue when making a cross-platform app - interacting with the system and external components.

Combined with the other positive attributes of the Linux platform and free software in general (Mono is one part of many things people are working on), this may create a situation where you get a chance to finally break the Windows monopoly on the desktop and give people a real choice to run whatever OS they'd like.

The question I would raise is this: is it a monopoly that is here because of MS, or is it here because people made a conscious choice to use Windows? I think that as the years go by, opinion is shifting from "because of MS" to "people actually like using Windows". Windows today works, as in stability, and as in I can walk into any computer store blindfolded, buy something, plug it in, and it works. It doesn't require any complexity from the user, and IMHO in the corporate world it blows any other solution out of the water for end-user desktops and manageability. Linux can be made to do a lot of the things Windows does in a corporate network; however, it is generally "Jim's solution" - not the "standard" solution that any one of 1000 techs can come in, administrate, understand, and fix. It just hasn't reached that level of maturity yet; I think it will get there in 5-10 years, maybe. But where do you think MS will be by then?
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
In a way, yes. All of the assemblies and classes will continue to have a consistent structure and a very clear roadmap for the future - things you can build a long-term application on. The last thing we need is 30 flavors of a standard API.

Huh? Windows.Forms is the API, Mono is just implementing it. If the API gets any new flavors they'll most likely be from MS.

It is one of the selling points of .Net that the runtime can be ported; however, I don't think MS had much choice in leaving out the WinForms part of the framework. Have you seen Java's cross-platform UI? Terrible, ugly, and horribly slow - and considering the amount of extra time Sun has had to work on it, it still sucks. To me, the slowness is why they just leveraged GDI+ for .Net's UI on Windows.

But they could have made the WinForms API part of the standard and just done their implementation in GDI+, like Mono is doing with WinForms and GTK.

It is not spin, it is a fact - and it is also something you can't build any long-term enterprise application on. You need to know that things will be supported and will not change on a whim.

Right, because MS never changes their mind about anything, delays products, etc.

I can't think of a project I have done in the last year (other than a few J# projects) that would have required something other than a single-language standard edition.

I doubt you're among the majority; most in-house business apps are done in VB, and while it would be nice to migrate them away from VB, it probably won't happen as long as MS sells a version of VB.

It loads in about 3 seconds,

It's closer to 10s here with no project or project-manager interface thing. Creating a new, blank Windows project takes about another 10s. Switching filters in the MSDN doc viewer takes about 3s. Obviously, once it's loaded once and everything is in the filesystem cache it loads much faster.

except in compiling, which on, say, a 50,000-line project takes about what, 5-8 seconds? You find this slow?

I don't have a 50,000-line project to time, but since compiling one of the WinForms samples takes 2-3s, I would say your 8s is optimistic.

Why? You can search, you can use the tree view, or you can let VS.NET find it for you - how in the WORLD is this "hard to navigate"? You do realize that the local PC version has a much easier search interface than the one on msdn.microsoft.com? I guess I don't see how else they were supposed to do it. The tree view of online articles, references, and books is very logically laid out, and it's fully searchable.

The search is ass; like I said, even when I typed a function name in directly I got like 500 results and most of them were irrelevant. And I generally end up using Google to search msdn.microsoft.com since it's much more accurate.
 

RyanLM

Member
May 15, 2003
43
0
0
But they could have made the WinForms API part of the standard and just done their implementation in GDI+, like Mono is doing with WinForms and GTK.

I would argue that WinForms currently is the standard, as it is fully documented. GDI+ is not used to draw windows, buttons, widgets, etc. - it is used to paint your controls and draw primitives. They work together but do very separate things in the framework. WinForms is really a wrapper, if you will, around the Windows windowing system, as far as managing windows goes. I don't see how you are going to directly port that. I could see porting a GDI+ framework, as it is basics like DrawLine().
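
To make the split concrete, here's a rough sketch (the file name and sizes are arbitrary): the first half is pure GDI+, drawing a primitive into an off-screen bitmap with no window involved; the second half is WinForms managing an actual window, touching GDI+ only through the Graphics object handed to the Paint event:

using System;
using System.Drawing;          // GDI+ wrappers: Graphics, Pen, Bitmap
using System.Drawing.Imaging;
using System.Windows.Forms;    // window management: Form, events, message loop

class GdiVsWinForms
{
    static Bitmap bmp;

    static void PaintHandler(object sender, PaintEventArgs e)
    {
        // WinForms owns the window; GDI+ only enters via e.Graphics.
        e.Graphics.DrawImage(bmp, 10, 10);
    }

    [STAThread]
    static void Main()
    {
        // Pure GDI+: draw a line into an off-screen bitmap, no window involved.
        bmp = new Bitmap(200, 100);
        using (Graphics g = Graphics.FromImage(bmp))
        using (Pen pen = new Pen(Color.Black, 2))
        {
            g.DrawLine(pen, 0, 0, 199, 99);
        }
        bmp.Save("line.png", ImageFormat.Png);

        // WinForms: create and manage a window and its message loop.
        Form form = new Form();
        form.Text = "WinForms window, GDI+ painting";
        form.Paint += new PaintEventHandler(PaintHandler);
        Application.Run(form);
    }
}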

Right, because MS never changes their mind about anything, delays products, etc.

Developers, developers, developers!!! You haven't heard the mantra? Hell, Longhorn still runs VisiCalc. Do they change their mind? Sure, everyone does. But they support their past.

I doubt you're among the majority; most in-house business apps are done in VB, and while it would be nice to migrate them away from VB, it probably won't happen as long as MS sells a version of VB.

What I was trying to say is that the full-blown VS.NET isn't important to most people; most people do VB OR C#. You don't need a compiler for every language if you don't know them. If your business uses C#, buy that version; if it uses VB, buy that one; etc.

It's closer to 10s here with no project or project-manager interface thing. Creating a new, blank Windows project takes about another 10s. Switching filters in the MSDN doc viewer takes about 3s. Obviously, once it's loaded once and everything is in the filesystem cache it loads much faster.

On my system it is damn near instant. But even taking your system, that's SLOW? For as advanced a product as it is, taking 10 seconds to load up all of the plugins and features? Taking 10 seconds to create a project (usually about 10-15 files), link assemblies, and do a basic compile is "slow"? How about this: do it all by hand and tell me which is faster. Better yet, take another product and see how good VS really is.

I don't have a 50,000-line project to time, but since compiling one of the WinForms samples takes 2-3s, I would say your 8s is optimistic.

I would say the first compile may take a bit longer, mainly because it has to read all of the files into memory first - no compiler is going to get around that - but still, on my box (P4 3.0 GHz) it is wicked fast.

The search is ass; like I said, even when I typed a function name in directly I got like 500 results and most of them were irrelevant. And I generally end up using Google to search msdn.microsoft.com since it's much more accurate.

All I can say is read the help on how to search
 

drag

Elite Member
Jul 4, 2002
8,708
0
0
Originally posted by: RyanLM
It's my understanding that Windows.Forms/.Net is something completely different from Avalon. Avalon is a new API, like Win32 vs .NET.

Avalon is exposed via .Net - I am not sure if it is written in .Net, but the only way you can access it is via .Net (System.blaa.class).

It is separate from WinForms; however, you can use both, because they do different things.

With .NET and Mono, eventually there will be no such thing as programming a native Windows program with .NET, just like there is no such thing as programming a Linux-specific Python program, or supposedly Java - as long as you're smart about it. (For example, in Python I could use hard-coded path names like /usr/local/bin for different things, OR I can use the os module (and others) and use system variables that would make my scripts work equally well on Linux as on Windows or OS X or any platform that Python runs natively on.)

That is a good goal to shoot for, and it's already possible as long as you don't go near the GUI. However, there are a lot of times when you don't just want a program to "think"; in fact, if you're not making a website, most of the time your program needs to interface with something else. Most recently, I coded an app that made use of a scanning system, and another that used a video capture device. Both were very native to Windows, but they only took days to complete because of the infrastructure already there for dealing with these devices.

It still wouldn't be hard to port to Linux. SANE is a perfectly good backend and superior to TWAIN in lots of ways. Of course, if you're just using some sort of proprietary Windows stuff that I am not aware of, then whatever. (You can do stuff with SANE like set up a scanner to be used over a network and have a TWAIN-compatible front end on Windows run it.)

I agree that eventually there may be no such thing as directly programming for Windows. On the server side we are pretty much there, but for end-user applications? We are well over 10 years away from that point if you want to get beyond the basic apps.

What, like something like the Apache web server is too complex to run on Linux and on Windows?

I routinely run Windows apps on Linux using Wine - even 3D stuff, as long as they use OpenGL and not DirectX (with very little slowdown, even). Most of them were never really intended to be portable, but they are programmed from the same docs as those available to the Wine development team.

Programming for portability is sometimes a pain, and sometimes requires more intelligent programmers, but it's not impossible. It is very common in the Unix world to go the extra step to make portable applications. Most of the time it only needs a makefile and you're set. People have been doing it for ages and ages. Just stay away from Windows' really proprietary stuff and you'll be fine. Many programs that run fine on Linux will run fine on OS X or BSD or Solaris or whatever.

So if apps run on Linux just as well (or at least nearly as well) as they do on Windows, then that will remove a major hurdle to the widespread adoption of Linux.

I agree, but this isn't going to happen any time soon.

Maybe, maybe not. Is it possible to predict accurately the state of computers 3-5 years from now?

If Linux became 10-20% of the market, then if you want your apps to reach that PLUS the 80% or so that run Windows, the only real choice in development for many people will default to .NET rather than the Windows-only setups you'd get by simply sticking to the Avalon API.

It is a chicken-and-egg scenario. Yes, if Linux commanded a 20% market share, you would see some people releasing apps for the platform; however, I doubt they would use an intermediate language like .Net or Java to do so, because .Net or Java still doesn't fix the major issue when making a cross-platform app - interacting with the system and external components.

The issue is more like MS intentionally making its products incompatible with everything else out there.

But really, how much .NET programming is going to purposely deal with the video card on a low level? How many apps need to interact with a scanner? Is a .NET program going to have to handle keyboard input on Windows differently from keyboard input on an X Window-based OS?

I think you're exaggerating this a little bit.

Combined with the other positive attributes of the Linux platform and free software in general (Mono is one part of many things people are working on), this may create a situation where you get a chance to finally break the Windows monopoly on the desktop and give people a real choice to run whatever OS they'd like.

The question I would raise is this: is it a monopoly that is here because of MS, or is it here because people made a conscious choice to use Windows?

It's a monopoly because of the situation MS found themselves in, and the stupid mistakes that their competitors made (like IBM choosing to license an OS from MS instead of just buying it).

It couldn't happen again. Bill Gates was the right person to be there at the right time, and he was smart enough to seize the opportunity.

So far MS has failed to make profits outside of its Windows and Office franchises. They are hoping that by locking people and developers into Windows they can exclude the rest of the world.

90% of the world doesn't choose Windows; they don't give a damn about what operating system they are using.

I think that as the years go by, opinion is shifting from "because of MS" to "people actually like using Windows".

That's what MS thought, too - until they did a study and found out that nobody cares about XP. They use it because it came with their computer.

Windows today works, as in stability, and as in I can walk into any computer store blindfolded, buy something, plug it in, and it works. It doesn't require any complexity from the user, and IMHO in the corporate world it blows any other solution out of the water for end-user desktops and manageability.

Yeah, and you can walk blindfolded into your home with a new computer, plug it into the wall, and instantly be overrun by viruses, spyware, worms, and various other kinds of software suckitude. No other company in existence has yet reached the level of MS's security ineptitude. Nobody.

Linux can be made to do a lot of the things Windows does in a corporate network; however, it is generally "Jim's solution" - not the "standard" solution that any one of 1000 techs can come in, administrate, understand, and fix.

I personally maintained close to 200 OS X boxes. All of these were used by students with little to no technical training and very little prior experience with Windows or OS X or anything (art students). They had full access to the internet and were using administrator accounts by default.

I kept them stable, up to date, and patched, and corrected things like students deleting and moving applications and system files around.

I did this successfully as a part-time job. And you know what I spent most of my time doing? Surfing the internet. Not because I was lazy, but because it was just that freaking easy. What was irritating, though, was that I wasn't really allowed to do much scripting or automation. I could have saved myself more work, but then again OS X isn't Linux.

Any cleanup or downtime was handled in the 2-4 days I could come in and work between quarters.

You know what the system administrators spent most of their time doing? Struggling to keep the W2K servers going. A full-time job, that was. In a different building they were experiencing network outages and corrupt files on the W2K server. The network acted like it was completely congested, but when I hooked up an ancient Pentium Pro server, installed Debian stable on it, and set up Samba (through Webmin, a nice web-based interface), it outperformed the W2K boxes running on 1.8GHz servers with RAID drives.

It turned out to be a misconfigured ARP/RARP setup on the W2K server, but it took weeks for the experienced admins to figure it out. It still didn't solve all the problems.

With Linux you don't need 1000 techs to come in and understand and administrate. You need a couple dozen gurus to come in; you pay them twice as much as the Windows admins, but they can do 20x the work. Then you have operators who are smart enough to figure stuff out and follow directions. It isn't hard.

(Just remember that a majority of the world's servers run something other than Windows. Any administrator who can run Linux or Unix can run any Unix-like OS with a little bit of time for adjustment.)

It just hasn't reached that level of maturity yet; I think it will get there in 5-10 years, maybe. But where do you think MS will be by then?

Hopefully on only 50-70% of desktop machines, instead of 93%

There is nothing that Windows can do that Linux can't. It may be a little bit harder, but 9 times out of 10 it will be more stable. But there is a lot that Linux can do that Windows can't.
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
I would argue that WinForms currently is the standard, as it is fully documented. GDI+ is not used to draw windows, buttons, widgets, etc. - it is used to paint your controls and draw primitives. They work together but do very separate things in the framework. WinForms is really a wrapper, if you will, around the Windows windowing system, as far as managing windows goes. I don't see how you are going to directly port that. I could see porting a GDI+ framework, as it is basics like DrawLine().

If documentation is all it takes to make a standard, someone should alert the IEEE and the like, because we don't really need them anymore. Porting GDI+ would be pointless because Unix already has xlib for the really low-level stuff and GTK, Qt, etc. for the higher-level stuff. I'm sure WinForms over GTK will need some internal hacks to work properly, since things like window handles aren't really portable, but saying it can't work is stupid.

Developers, developers, developers!!! You haven't heard the mantra? Hell, Longhorn still runs VisiCalc. Do they change their mind? Sure, everyone does. But they support their past.

And supporting software of the past is one of Windows' biggest sources of problems. But it's a bad place for MS to be in: ditching all those hacks to get a better overall product would cause too many people to cry and not buy the software, so MS does all it can and keeps the hacks up to date so it can sell more products. Linux has the same issues, but luckily to a much lesser extent, because the users are much better at dealing with those changes, and in some cases the old software can be brought up to date even if the original developer doesn't want to do it, since so much of it is open source.

On my system it is damn near instant. But even taking your system, that's SLOW? For as advanced a product as it is, taking 10 seconds to load up all of the plugins and features? Taking 10 seconds to create a project (usually about 10-15 files), link assemblies, and do a basic compile is "slow"? How about this: do it all by hand and tell me which is faster. Better yet, take another product and see how good VS really is.

Sadly, I know how good VS is compared to others. The company I work for does a lot of Java development too, so Together and JBuilder are fairly popular here, but they're nowhere near fast either. And I was comparing its speed to VS 6, which was about twice as fast, IIRC. Luckily I'm not a professional developer and I can afford to use vim, gdb/ddd, 'perl -d', etc. when I need to debug something.

I would say the first compile may take a bit longer, mainly because it has to read all of the files into memory first - no compiler is going to get around that - but still, on my box (P4 3.0 GHz) it is wicked fast.

The notebook I was running it on is a P4 3GHz as well.

All I can say is read the help on how to search

Frankly, you shouldn't need to do any reading on how to search; I never had to read docs on how to use Google. But I gave up on VS a while ago. We have very few copies of VS.NET installed where I work (that, and I don't have to support them anymore), so I'm not concerned with it anymore.

It just hasn't reached that level of maturity yet; I think it will get there in 5-10 years, maybe. But where do you think MS will be by then?

IMO Windows hasn't reached maturity yet either, but that doesn't stop people from using it. If Windows were as mature as people would like to think, there wouldn't be nearly as many people with problems and zombie machines out there with CodeRed. Sure, some of the problem is users doing the things that get them infected with whatever, but with all the added precautions and lessons needed to use a Windows machine securely these days, I would say Linux is simpler to get running in a safe fashion.
 

RyanLM

Member
May 15, 2003
43
0
0
It still wouldn't be hard to port to Linux. SANE is a perfectly good backend and superior to TWAIN in lots of ways. Of course, if you're just using some sort of proprietary Windows stuff that I am not aware of, then whatever. (You can do stuff with SANE like set up a scanner to be used over a network and have a TWAIN-compatible front end on Windows run it.)

Scanning was a single function that would need to be ported. Avalon would not be simple, and neither would the new windowing functions coming in Longhorn with timelining and 3D functionality. To make a "port" work for the enterprise, when I call Object.Method it damn well better have the exact same result on any platform for it to call itself cross-platform.

I think this is possible to do; however, it is not "easy" or "quick" to do. By the time Linux implements all of the plumbing currently in Windows to do these things, where the hell do you think Windows will be?

What, like something like the Apache web server is too complex to run on Linux and on Windows?

No, that is a server app, not a GUI app. Something that can run in a console window can be made multi-platform fairly well as long as it's basically "thinking" - such as a web server, file server, distributed computing, etc. It is not like cross-platforming Photoshop.

Programming for portability is sometimes a pain, and sometimes requires more intelligent programmers, but it's not impossible. It is very common in the Unix world to go the extra step to make portable applications. Most of the time it only needs a makefile and you're set. People have been doing it for ages and ages. Just stay away from Windows' really proprietary stuff and you'll be fine. Many programs that run fine on Linux will run fine on OS X or BSD or Solaris or whatever.

It may not be impossible, but it isn't always economical, or the best choice. The reason many of those apps tend to work across *nix is that they are console-based, or that they all use a common GUI infrastructure. You may argue that that should be the standard; however, the other 95% of machines out there run another "standard".

Maybe, maybe not. Is it possible to predict accurately the state of computers 3-5 years from now?

Accurately, never. However, you can make educated guesses based on past progress. I remember 199X being the year of the Linux desktop; now it's 200X. Linux's beauty is that it really has no one to answer to (for the most part). There really isn't one governing entity that controls everything from the kernel all the way to the desktop. You can bend it in a million different ways or change it however you want. However, that is also the biggest mark against it, as it is NOT a predictable platform, at least not compared to other OSes on the market.

The issue is more like MS intentionally making its products incompatible with everything else out there.

I view it as "MS intentionally making things easier for developers to keep its installed base". Sure, you may view it as creating a proprietary standard; they view it as "devs want this now, let's do it". For many things they release specifications, and for many things they don't. I also don't fault them for this; everything they do costs time and money, and I would never expect any company to just give that away.

But really, how much .NET programming is going to purposely deal with the video card on a low level? How many apps need to interact with a scanner? Is a .NET program going to have to handle keyboard input on Windows differently from keyboard input on an X Window-based OS?

Depends on how low-level you are talking about. DX9.0 now has managed assemblies, meaning you can write a game in .Net. Vertigo also ported Quake II to .Net (http://www.vertigosoftware.com/Quake2.htm); it runs about 85% as fast (which is very good considering they took all the assembly out of it and it is now memory-managed). However, what if you need to interoperate with a Pocket PC, Palm, printer, scanner, video capture device, etc.? The framework today does not directly cover these needs; it uses Windows to do most of this work. Which is why I said porting a complete functionality set to another OS is a VERY daunting task.
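
For what it's worth, here's a rough sketch of what that dependence looks like in code: when the class library doesn't wrap something, you P/Invoke straight into a Win32 DLL (MessageBox from user32.dll is just the classic example), and everything built that way has to be re-plumbed on any other OS:

using System;
using System.Runtime.InteropServices;

class NativeCallDemo
{
    // Declare the unmanaged entry point; the runtime marshals the call into user32.dll.
    [DllImport("user32.dll", CharSet = CharSet.Auto)]
    static extern int MessageBox(IntPtr hWnd, string text, string caption, uint type);

    static void Main()
    {
        // Works on Windows because user32.dll is there; a port has to supply
        // an equivalent for every call like this.
        MessageBox(IntPtr.Zero, "Hello from unmanaged user32.dll", "P/Invoke example", 0);
    }
}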

90% of the world doesn't choose Windows; they don't give a damn about what operating system they are using.

I don't believe that. The Mac has been out there for years, and it is the closest thing to Windows as far as support for off-the-shelf programs and hardware goes, yet it is still below 2%. If people didn't care, the law of averages would have bumped it past 10%. I truly believe people like Windows. I think in a home environment (besides gaming) it doesn't matter as much, but I have people in business networks hugging me: 100-person offices, NO IT staff, multiple locations, all secured and automated at a level I have only seen on Windows. People like that bottom line; sure, they had to pay for their up-front licenses, but tech time adds up much faster than that ever will.

That's what MS thought, too - until they did a study and found out that nobody cares about XP. They use it because it came with their computer.

Was that a /. article?

Yeah, and you can walk blindfolded into your home with a new computer, plug it into the wall, and instantly be overrun by viruses, spyware, worms, and various other kinds of software suckitude. No other company in existence has yet reached the level of MS's security ineptitude. Nobody.

Is that MS's fault? These things just don't "install themselves" - people have to click "Yes" on a big warning, or they are downloading crap off KaZaa. Has there been a security hole in MS software attacked by a virus in the past few years where a patch wasn't readily available to the masses for MONTHS beforehand? No, there hasn't been. MS was also the first to auto-update itself, but most people simply ignored the feature; they didn't want "their computer reporting back to Redmond!!" You do realize that the number of flaws found in open source software far surpasses the flaws in MS software, both in quantity and severity? Simply put, they are 95% of the market; when something comes out that affects it, it's going to hurt more than something that affects someone with 1% who, 9 times out of 10, has a damn firewall in front of his cable modem anyway.

Just for some info:

By Forrester's counting, the Windows platform (which includes popular programs like Internet Explorer, the SQL Server database, and such) had 126 security flaws in its stack in that year's time, with 67 percent of them being high-severity vulnerabilities (that's 86). Microsoft fixed all 128 flaws in an average of 25 days. Red Hat had 229 flaws, of which 56 percent (128) were high severity flaws. Red Hat fixed 99.6 percent of all flaws during that time, and the average days of risk for the Red Hat platform was 57, with 47 days of distribution risk. (In other words, there was a 10-day lag between a patch being announced for a Linux component by its maintainer and the patch being released by Red Hat as part of its security updates.)

http://www.midrangeserver.com/tlb/tlb041304-story01.html

I personally maintained close to 200 OS X boxes. All of these were used by students with little to no technical training and very little prior experience with Windows or OS X or anything (art students). They had full access to the internet and were using administrator accounts by default.

OK, and? Now, say all of your students decided they wanted to visit www.xxx.live.com and you were told to block it within 2 minutes or you're fired. They also asked 4 other admins to do the same thing: A) could you do it? And B) would you all have done it the same way? Keeping things running and managing them are two different things.

In Windows, this would have been a simple group policy update. Again, I am not denying that these things can be done; I am saying that there doesn't seem to be a single specific way that a new admin could walk in and have a firm grasp on what was going on with only an administrator password.

You know what the system administrators spent most of their time doing? Struggling to keep the W2K servers going. A full-time job, that was.

Then you had unqualified admins or damaged equipment. It is as simple as that.

The network acted like it was completely congested, but when I hooked up an ancient Pentium Pro server, installed Debian stable on it, and set up Samba (through Webmin, a nice web-based interface), it outperformed the W2K boxes running on 1.8GHz servers with RAID drives.

Again, did you ever think it was just a configuration issue? It's 2003 rather than 2000, but this is interesting - http://www.veritest.com/clients/reports/microsoft/ms_netbench.pdf

File serving should most often be limited by your NIC, but one thing that "usually" tends to hold some of the Windows benchmarks back in the SMB area is that Windows signs each message going over the wire, and in most of the benches they didn't turn that off when they compared it to Samba - which, of course, is not fair.

It turned out to be a misconfigured ARP/RARP setup on the W2K server, but it took weeks for the experienced admins to figure it out. It still didn't solve all the problems.

Ahh, so I was right. It isn't broken out of the box, so those "experienced admins" must have broken it. I am not surprised there are still issues.

With Linux you don't need 1000 techs to come in and understand and administrate. You need a couple dozen gurus to come in; you pay them twice as much as the Windows admins, but they can do 20x the work. Then you have operators who are smart enough to figure stuff out and follow directions. It isn't hard.

LOL, tell that to Munich. It is costing them many millions more, and not going very well to boot, to implement the open source solution instead of the MS solution. I would ask you to try to back up your statement with facts or research, as in just about EVERY study I have read the TCO is still far less in a Windows environment, including the cost of software. FYI, the majority of tech time isn't spent in the server room, it is spent at the end user - tell me again how you roll out policy management to a few thousand desktops in the Linux world with a click of a mouse?

There is nothing that Windows can do that Linux can't.

Sure there is: run a crapload of software, most games, far more hardware, and .Net correctly. It seems it's getting faster, too. And do you honestly think the stability card can be played anymore? Seriously?

But there is a lot that Linux can do that Windows can't.

I think there are things on both sides that each can't do; the question is which are more important to the largest number of people.
 

n0cmonkey

Elite Member
Jun 10, 2001
42,936
1
0
Originally posted by: RyanLM
Just for some info:

By Forrester's counting, the Windows platform (which includes popular programs like Internet Explorer, the SQL Server database, and such) had 126 security flaws in its stack in that year's time, with 67 percent of them being high-severity vulnerabilities (that's 86). Microsoft fixed all 128 flaws in an average of 25 days. Red Hat had 229 flaws, of which 56 percent (128) were high severity flaws. Red Hat fixed 99.6 percent of all flaws during that time, and the average days of risk for the Red Hat platform was 57, with 47 days of distribution risk. (In other words, there was a 10-day lag between a patch being announced for a Linux component by its maintainer and the patch being released by Red Hat as part of its security updates.)

http://www.midrangeserver.com/tlb/tlb041304-story01.html

I thought you were doing pretty well until you included that. It's a bunch of bullshit, pure and simple. Either come up with a source that uses a little bit of logic in their studies or don't post crap. Thanks.
 

RyanLM

Member
May 15, 2003
43
0
0
If documentation is all it takes to make a standard, someone should alert the IEEE and the like, because we don't really need them anymore. Porting GDI+ would be pointless because Unix already has xlib for the really low-level stuff and GTK, Qt, etc. for the higher-level stuff. I'm sure WinForms over GTK will need some internal hacks to work properly, since things like window handles aren't really portable, but saying it can't work is stupid.

I just don't see how it would work 100% the same 100% of the time, which is required in this case. I don't really want some low-level "hacks" in my code either. But I will say I would RATHER be wrong in this case - I would love to have a true cross-platform GUI framework that didn't suck. But it hasn't been done well yet for a reason.

And supporting software of the past is one of Windows' biggest sources of problems. But it's a bad place for MS to be in: ditching all those hacks to get a better overall product would cause too many people to cry and not buy the software, so MS does all it can and keeps the hacks up to date so it can sell more products. Linux has the same issues, but luckily to a much lesser extent, because the users are much better at dealing with those changes, and in some cases the old software can be brought up to date even if the original developer doesn't want to do it, since so much of it is open source.

MS has done a good job of componentizing past versions of Windows. And it works, and it doesn't seem to cause the stability issues of past releases. You are aware that in XP you can tell any program to run as if it were on 95? It is part of WoW. But in a place where everything is open source you are right, this is less of an issue, and rarely a bad thing.

The notebook I was running it on is a P4 3GHz as well.

It was most likely an I/O issue then. VS does a lot of version checks on assemblies and such.

Frankly, you shouldn't need to do any reading on how to search; I never had to read docs on how to use Google. But I gave up on VS a while ago. We have very few copies of VS.NET installed where I work (that, and I don't have to support them anymore), so I'm not concerned with it anymore.

Fair enough.

IMO Windows hasn't reached maturity yet either, but that doesn't stop people from using it. If Windows were as mature as people would like to think, there wouldn't be nearly as many people with problems and zombie machines out there with CodeRed. Sure, some of the problem is users doing the things that get them infected with whatever, but with all the added precautions and lessons needed to use a Windows machine securely these days, I would say Linux is simpler to get running in a safe fashion.

I disagree. There are plenty of flaws on both platforms, more so on Linux (as shown in the previous post), but MS has the best track record on fixing them, and on the ease of finding/downloading/installing the fixes. Now with SP2, the firewall is on by default, which will stop the vast majority of attacks on MS systems.
 

RyanLM

Member
May 15, 2003
43
0
0
I thought you were doing pretty well until you included that. It's a bunch of bullshit, pure and simple. Either come up with a source that uses a little bit of logic in their studies or don't post crap. Thanks

What is crap? I have seen a few reports that basically say the same things; is their data invalid? What is wrong with their conclusions? I would agree that this metric isn't the best for determining the overall security of the products, but when talking in the context of "holes" and "viruses" I think the findings fit rather well.
 

n0cmonkey

Elite Member
Jun 10, 2001
42,936
1
0
Originally posted by: RyanLM
I thought you were doing pretty well until you included that. It's a bunch of bullshit, pure and simple. Either come up with a source that uses a little bit of logic in their studies or don't post crap. Thanks

What is crap?

The report.

I have seen a few reports that basically say the same things,

And I rant about them every time I see them.

is their data invalid?

What is wrong with their conclusions?

They're illogical and founded on some really one-sided assumptions.

I would agree that this metric isn't the best for determining the overall security of the products, but when talking in the context of "holes" and "viruses" I think the findings fit rather well.

It does no such thing.

Basically (because I'm going to try not to force myself to go through every single advisory and explain it that way again): for Microsoft (MS) they typically only count advisories covering Microsoft applications, while for Red Hat (RH) they use the advisories released by RH.

What's wrong with this, you might ask? Well, RH includes a lot of third-party software, much of it redundant. How much does an exim advisory affect you if you are using sendmail? It doesn't; that's one extra advisory that does not need to be included in your assessment.

Speaking of third-party software, why don't they include all Windows third-party software? They include the RH third-party software; it only seems fitting that you include all advisories for software that runs on MS Windows. But they don't. This puts a lot more responsibility on RH and gives MS a lot of breathing room.

Now, of course, they cannot exclude third-party software, because then RH would have maybe 1 advisory per year. They don't write their entire OS distribution like MS does, so that would also be unfair.

What's the solution? A detailed analysis of every advisory. Unfortunately, that doesn't make good crapper material for PHBs, so we end up with one-sided puff pieces like this.

So what does this study really tell us? That a company or group can count, and can lean everything in favor of one side before they have even started.
 

drag

Elite Member
Jul 4, 2002
8,708
0
0
OK, and? Now, say all of your students decided they wanted to visit www.xxx.live.com and you were told to block it within 2 minutes or you're fired. They also asked 4 other admins to do the same thing: A) could you do it? And B) would you all have done it the same way? Keeping things running and managing them are two different things.

Say that you have to keep all spyware and adware off of all your Windows computers, and if a single computer gets infected you will get fired. What would you do if you're running Windows?

Ah, I think that something like that would be unreasonable.

And yes, whether it was Windows or OS X or Linux, I would have done it the same way - if it were possible, which it wouldn't be, because I am not authorized.

You go into the hosts file of the Linux DNS server the school used and make it so that when they typed in www.xxx.live.com it would go to nothing. Give me a break. If I had access to the server (which a part-time temp assistant doesn't have), it would probably take less than 2 minutes. Hell, I probably wouldn't even have to walk into the server room; I'd just ssh into the server and be done with it.

In Windows, this would have been a simple group policy update. Again, I am not denying that these things can be done; I am saying that there doesn't seem to be a single specific way that a new admin could walk in and have a firm grasp on what was going on with only an administrator password.

Oh, that's hardcore. And crap.

If you're worried about site content, you set up a proxy server and block port 80 at your routers for client machines. Or what? Are you going to go and input every porn site in existence into your group policy? (Maybe you just picked a bad example.)

Just because you don't have a clue how to run a bunch of Linux machines doesn't mean that other people don't, or that you can't find effective people to do it.

Then you had unqualified admins or damaged equipment. It is as simple as that.

I thought that since Windows is so easy to use, any moron can do it? Sorry, Windows just isn't easy. It really isn't; it's a huge pain in the rear to keep running, regardless of what you may or may not think about the difficulty of Unix-type OSes. At least when you set one of those up you don't have to worry about something spontaneously changing.

The damaged equipment was W2K, and it came out of the box like that. Nobody screwed around with the ARP/RARP settings, yet they got f-ed up anyway.

Ahh, so I was right. It isn't broken out of the box, so those "experienced admins" must have broken it. I am not surprised there are still issues.

No, you weren't. Who the f*ck goes around messing with those settings? Nobody, but they still got screwed up. Don't ask me how; I am sure that some MS tech somewhere could explain what happened if they knew the details.

Again, did you ever think it was just a configuration issue? It's 2003 rather than 2000, but this is interesting - http://www.veritest.com/clients/reports/microsoft/ms_netbench.pdf

File serving should most often be limited by your NIC, but one thing that "usually" tends to hold some of the Windows benchmarks back in the SMB area is that Windows signs each message going over the wire, and in most of the benches they didn't turn that off when they compared it to Samba - which, of course, is not fair.

Hey, I TOLD you it was a configuration issue. Setting up the Debian server was part of the troubleshooting.

I never claimed that a Debian server running on a 200MHz CPU with a 6GB hard drive was ever going to outperform a new Dell multi-GHz computer running a RAID array. The stupid thing's NIC probably wasn't even 100Mbit capable. (Don't worry, that didn't cause anything; each computer had its own port on a fairly intelligent switch.)

Look, things just don't work sometimes in Windows. At my current job we have a W2K server that runs automated backups every week. A few times every year it just fails to run the backup correctly. Why? Nobody knows. You start it and it runs just fine, and the next dozen backups run, and then some months later it just doesn't run the backups. Whatever.

Sure there is: run a crapload of software, most games, far more hardware, and .Net correctly. It seems it's getting faster, too. And do you honestly think the stability card can be played anymore? Seriously?

Sure. Windows apologists still say that Windows has 10,000x the viruses and worms that Linux does because Windows is 33x more popular. So why can't I say that Linux is more stable?

Is that MSs fault? These things just dont "Install Themselves" - People have to click "Yes" to a big warning or they are downloading crap of KaZaa. Has there been a security hole that a virus has attacked MS software in the past few years that a patch wasnt readily available to the masses for MONTHS? No, there hasnt been. MS made was also the first to auto update itself, however most simply ignored the feature, they didnt want "Their computer reporting back to Redmond!!". You do realize that the number of flaws found in open source software far surpasses flaws MS software both in quantity and sevearity? Simply put, they are 95% of the market, when something comes out that effects it - its going to hurt more than someone with 1% who 9 times out of 10 has a damn firewall infront of his cable modem anyway.

Apparently you don't understand the definition of a worm.

If you're running an unpatched Windows machine on the internet and you're surfing around, you WILL get worms installed on your machine. You WILL get spyware installed on your machine without any user intervention. Nobody ever got "Mydoom would like to install a spam relay on your computer, is that ok? (Y/n)"

Look, I think you're a bit wrong about how easy it is to run Windows. Maybe it's because you've probably been using it for 10 years or so, or maybe it's because you're a developer, but it's not easy.

It took me a few weeks of showing my parents and family how to do things before they became good enough to use anti-spyware software, keep everything up to date, and identify and prevent bad things from being installed on their computer.

Why do you suppose there is a huge sticky thread on how to avoid spyware and crap-whatever-ware at the beginning of software forums? Because Windows is easy?

I truly believe people like Windows,

That's because the people that you talk to or are around truly like Windows. It's just your group. I've met people that love Windows, but they are in the huge minority. The reality is that most people don't care; they'd be running WinME if their computer came with it installed.

Out of the people that do care, the majority of them generally dislike Windows but believe that's the normal state of things - that this is "normal" with software, because that's just what they are familiar with. A minority "like" Windows, just like a minority "like" Linux.

Do you think that an OS is such an important fixture of most people's lives that they bother to assign an emotional response to it?
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
I just don't see how it would work 100% the same 100% of the time, which is required in this case

If that's what happens, chances are the blame is firmly in MS's lap, because it's their API they're publishing and any changes or deviations will probably come from their future releases.

MS has done a good job of componentizing past versions of Windows. And it works, and doesn't seem to cause the stability issues of past releases. You are aware that in XP you can tell any program to run as if it were on 95? It is part of WoW. But you are right that in a place where everything is open source this is less of an issue - though it's rarely a bad thing.

Too bad most of the components have cyclical dependencies so that you can't remove any of them without breaking a ton of things.

And yes, I know about WoW, and I also know that in a lot of cases it doesn't work well, because chances are if you need to emulate something like Win95 the app will do something stupid like try to access hardware directly; if the app were actually a well-behaved Win32 app it would work without WoW.

It was most likely an I/O issue then. VS does a lot of version checks on assemblies and such.

Yes, I know; the main bottleneck in VMware has always been I/O. It's a lot better with 4.x but it's still not up to real-time speeds.

There are plenty of flaws out there on both platforms, more so on Linux (as shown in a previous post)

Huh? Maybe I missed something, but I didn't see any flaws mentioned in this thread.

but MS has the best track record for fixing them, and for ease of finding/downloading/installing the fixes.

Like this: http://secunia.com/advisories/11966/

There used to be a web page of unfixed bugs in IE, but IIRC MS made them take the site down.
 

RyanLM

Member
May 15, 2003
43
0
0
n0cmonkey

I see what you're saying, and it is a logical argument. I would say my response is somewhere in the middle. The study counted not only "Windows" but also MS-made servers/apps for Windows (as far as security holes go), and it took things like RH or Debian and looked at everything that makes up Linux. You are right that a lot of those things are redundant and most people wouldn't use half of them. However, the same argument (not on that scale) could be made if "I don't use SQL Server".

The second part of the report was the lag time between the bug and the fix, which found that on average MS was far faster at releasing a fix. What is your thought on this finding? Do you feel that it is inaccurate?

However, people tend to act as if MS software, and only MS software, has flaws, and as if said flaws go unfixed for months - which is simply not the case. I would say that the second finding, on lag time, is more damaging than simply the amount of flaws.

Drag

Say that you have to keep all spyware and adware off of all your Windows computers, and if a single computer gets infected you will get fired. What would you do if you're running Windows?

Disable ActiveX for starters (on non-trusted sites), and prevent users from installing any unapproved programs.
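
(Under the hood, the "disable ActiveX" part is just an IE zone setting that the policy pushes down to each machine. A rough Python sketch of the equivalent registry change for the Internet zone - I'm going from memory on the action number, so treat the values as illustrative:)

import winreg

# Zone 3 = Internet zone; action 1200 = "Run ActiveX controls and plug-ins"
# 0 = enable, 1 = prompt, 3 = disable (as I remember the values)
zone = winreg.CreateKeyEx(
    winreg.HKEY_CURRENT_USER,
    r"Software\Microsoft\Windows\CurrentVersion\Internet Settings\Zones\3",
    0, winreg.KEY_SET_VALUE)
winreg.SetValueEx(zone, "1200", 0, winreg.REG_DWORD, 3)
winreg.CloseKey(zone)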


Ah, I think that something like that would be unreasonable.

Not at all.

You go into the hosts file of the Linux DNS server the school used and make it so that when they typed in www.xxx.live.com it would go to nothingness. Give me a break. If I had access to the server (which a part-time temp assistant doesn't have) it would probably take less than 2 minutes. Hell, I probably wouldn't even have to walk into the server room; I'd just ssh into the server and be done with it.

Interesting solution, but what if you wanted to give half the room access and the other half not? Not important for an XXX site, but maybe for an intranet. Secondly, accomplishing that task was only the first part of the issue. You are replaced by someone next year, and people report that they can no longer get to several sites (policies changed) - would your change be the first place the new admin would look? Generally, messing with DNS is not the first place I would look when troubleshooting a problem.

Which is the core of my point: this was "Drag's solution". Did it work? Sure, and quickly too. Would anyone else have a clue where you did it unless you told them? Maybe.

Just because you don't have a clue how to run a bunch of Linux machines doesn't mean that other people don't, or that you can't find effective people to do it.

Again, I wasn't implying it COULDN'T be done. My point was that there is no standard way to do such things. You gave me several solutions to the problem, all of which would most likely work, some better than others. I would bet there are others as well, just as in a Windows environment. However, in Windows most agree that everything should be done via Group Policy. Any new admin can come in and be told "I can't get to this site," and the first place a competent admin should go is into the group policy.

And this was just the first example I could come up with. Most people who defend Linux to the day they die have never seen the policy manager or how it works. So, before I go on, do you have any experience creating/using group policies, and with the level of detail you can control?

I thought that Windows was so easy to use that any moron could do it? Sorry, Windows just isn't easy. It really isn't; it's a huge pain in the rear to keep running, regardless of what you may or may not think about the difficulty of Unix-type OSes. At least when you set one of those up you don't have to worry about something spontaneously changing.

Windows is as complex as you want to make it; however, with the 2000 and 2003 releases, things simply don't change. There isn't anything I can say or show to make this point anything other than anecdotal, just as your claim to me is. I don't know what in your experience has just stopped working.

I personally have been amazed by the robustness of things such as DFS, Previous Versions, and NTFRS working together. I have a client with 3 locations; they are an AutoCAD shop that does city design. They have 70GB of data that needs to be available in all 3 locations, always up to date. These locations are only linked by DSL lines (they are cheap too). However, these files are rather large, and simple VPN file sharing is way too slow. The solution was implemented over 1 year ago and to this day has not needed any tweaking or support. We set up a server in each location and built a DFS (Distributed File System) with a hub-and-spoke topology and remote links in each location, which made a replica set of all the data in a share at each location. As soon as they hit Save on a file, NTFRS (NT File Replication Service) starts streaming the changed files to the remote locations. What is even more interesting is that if one of the DFS sites is down, Windows will transparently fall back to contacting one of the other replica sets without user interaction (all configurable, of course). One plus on top of this is Previous Versions, a new 2003 feature that makes an hourly (configurable) backup of all changed files on a shared volume. So, if a user deleted a file they shouldn't have, they can right-click the folder, open it up as of "yesterday" and retrieve it.

But that is anecdotal evidence. I can't prove its robustness; I can only share my experience. I found it rather impressive that a few hours' work could do something the company had only dreamed of. And it has never failed.

No you weren't. Who the f*ck goes around messing with those settings? Nobody, but they still got screwed up. Don't ask me how; I am sure that some MS tech somewhere could explain what happened if they knew the details.

When people start to troubleshoot things they don't know, the registry becomes a fun place to screw things up.

Look, things just don't work sometimes in ********.

Fixed that for you

At my current job we have a W2K server with automated backups that it does every week. A few times a year it would just fail to run the backup correctly. Why? Nobody knows. You start it and it runs just fine, and then the next dozen backups run, then some months later it just doesn't run the backups. Whatever.

Something tells me you are using Veritas, maybe even 9.0. I have had issues with Veritas (recently) doing things consistently. However, I also know it's not a Windows problem; they are separate things, as Computer Associates' backup runs just fine every time.

Sure. Windows apologists still say that Windows has 10,000 times the viruses and worms that Linux does because Windows is 33x more popular. So why can't I say that Linux is more stable?

Because they aren't related. If you had said "spyware and viruses aren't much of an issue on Linux," that would be a true statement; however, it is not a question of stability. I believe it is more a matter of stupidity, and I think we can agree here: Windows has more idiots using it than any other OS.

I get a "my computer has all these weird bars / is slow / pops things up all the time" call weekly. It is never a Linux box or a Mac. Does this mean Windows is less secure? Perhaps, but only in that it asks the user to say "Yes or No", and since most users are Administrators, things get installed. And if you wanted to pick out one of the only two main security flaws I find in Windows, this is one of them: people should NOT be using an administrator account by default, especially one with no forced password. (The second would be allowing access to raw sockets.)

That said, it is not an unreasonable claim that since Windows is the most used OS it is also the most targeted for crap.

If you're running an unpatched Windows machine on the internet and you're surfing around, you WILL get worms installed on your machine. You WILL get spyware installed on your machine without any user intervention. Nobody ever got "Mydoom would like to install a spam relay on your computer, is that ok? (Y/n)"

You will get hit by some worm, yes, if you're unpatched/unfirewalled; no disagreement there. Spyware, most people get from some freaking "Want smilies in your email? Click yes!" crap on a website. However, there are flaws in ANY OS; people get rooted too, you know. Windows is quite simply at a disadvantage because it is the biggest target.

Look, I think you're a bit wrong about how easy it is to run Windows. Maybe it's because you've probably been using it for 10 years or so, or maybe it's because you're a developer, but it's not easy.

I was only trying to imply that it is a lot easier to administer a large network of desktops via Windows. Most people tend to agree with that; they feel things like collaboration software and true group management on the Linux side are still not ready, or in some cases don't exist yet.


Why do you suppose there is a huge sticky thread on how to avoid spyware and crap-whatever-ware at the beginning of software forums? Because Windows is easy?

It's not a matter of ease but of stupidity. When Windows is telling you there is a critical update and you click "No," you deserve all the spyware you get. If you want Comet Cursor, you deserve what you get. If you ignore all the prompts to update your computer by going to one stinking website, you deserve what you get. Also, there are programs that are "free" but contain GAIN or the like. Those are middle-ground things, which could happen on any OS if anyone cared to target them. If it's embedded in the app you want to use, you are screwed no matter what OS you have.

That's because the people that you talk to or are around truly like Windows. It's just your group. I've met people that love Windows, but they are in the huge minority. The reality is that most people don't care; they'd be running WinME if their computer came with it installed.

There are a lot of people on both sides; some, as you say, don't care, others demand XP. I guess I am seeing more "Does this have XP?" than "Does this have Windows?" type questions lately. I have actually gotten two "What is Longhorn?" questions from end users in the last week. One thing I have seen is a huge change in the amount of MS hate - it has been dropping among my clients. While spyware is the main issue people have, most realize it is self-inflicted.

Out of the people that do care, the majority of them generally dislike Windows but believe that's the normal state of things - that this is "normal" with software, because that's just what they are familiar with. A minority "like" Windows, just like a minority "like" Linux.

Could be. I haven't found many people outside of a LAN party or a forum that seem to "hate" Windows recently (outside of WinME). But I also haven't found anyone that "hates" Linux lately either.

Do you think that an OS is such an important fixture of most people's lives that they bother to assign an emotional response to it?

Only to the bean counters lately

Nothinman

If that's what happens, chances are the blame is firmly in MS's lap, because it's their API they're publishing and any changes or deviations will probably come from their future releases.

That is totally possible, but to me a hack just to get a few specific features done is equally possible.


And yes, I know about WoW, and I also know that in a lot of cases it doesn't work well, because chances are if you need to emulate something like Win95 the app will do something stupid like try to access hardware directly; if the app were actually a well-behaved Win32 app it would work without WoW.

Emulation is never perfect; however, I have had more success using it than most.

Huh? Maybe I missed something, but I didn't see any flaws mentioned in this thread.

It was in a PDF I linked. It wasn't well liked, but it shows neither side is perfect. It also showed that MS is quicker at fixing flaws (on average) by a good margin.

There used to be a web page of unfixed bugs in IE, but IIRC MS made them take the site down.

Bugs and security issues are different things. However, I am not trying to paint MS as perfect. Are you implying that Linux as of today is "perfect"? Based on history, MS takes on average 25 days to fix a published bug, whereas it was higher on Linux. (Going from memory here.)
 

drag

Elite Member
Jul 4, 2002
8,708
0
0
Interesting solution, but what if you wanted to give half the room access and the other half not? Not important for an XXX site, but maybe for an intranet. Secondly, accomplishing that task was only the first part of the issue. You are replaced by someone next year, and people report that they can no longer get to several sites (policies changed) - would your change be the first place the new admin would look? Generally, messing with DNS is not the first place I would look when troubleshooting a problem.

Which is the core of my point: this was "Drag's solution". Did it work? Sure, and quickly too. Would anyone else have a clue where you did it unless you told them? Maybe.

What? And an admin will automatically look to an obscure group policy to find out immediately that you have used it to block a porn site? Your requirement was to get it done in 2 minutes, and it worked.

It makes more sense to me to enforce network policies by using the network rather than an LDAP server like AD.

Now if your requirement were to limit access to certain applications, NOW that would be a decent example. But not intranet or porn sites. To filter out bad content it's much better to use a proxy server than a bunch of group policies any day. And for the intranet stuff it would be better to simply use passwords to protect it.

If that is unacceptable, then f-it. Linux and OS X have LDAP servers, too.

Actually, what I like is the X terminal and X server setup. It's much better than depending on network-based authentication, and much easier to administer and lock down. You have all your business files in one place, and instead of buying 300 computers to upgrade, you just upgrade 10-20. There are many really positive aspects to it over something like AD. I could go on and on about it.

You can say "well, Linux is very non-standard, while Windows has standardized interfaces worldwide". Well, in my eyes this is the equivalent of trying to get "one size fits all", which 95% of the time really means "one size fits none well".

Yes, in my eyes Windows is the sweatpants of the computer industry.

Could be. I haven't found many people outside of a LAN party or a forum that seem to "hate" Windows recently (outside of WinME). But I also haven't found anyone that "hates" Linux lately either.

Then it doesn't really matter what OS you're using, does it? And I've found plenty of professionals to talk about it with (never been to a LAN party or anything like that, even). It's not that they hated Windows, although we have had discussions along the lines of "Who would want to put up with this crap?" I don't know why, but people do.

Although I've only really found one person that absolutely loved Windows (in public - plenty of fanboys on the internet). A teacher; he would go on and on about how wonderful MS was. It was funny how he would go on about it.


I personally have been amazed by the robustness of things such as DFS, Previous Versions, and NTFRS working together. I have a client with 3 locations; they are an AutoCAD shop that does city design. They have 70GB of data that needs to be available in all 3 locations, always up to date. These locations are only linked by DSL lines (they are cheap too). However, these files are rather large, and simple VPN file sharing is way too slow. The solution was implemented over 1 year ago and to this day has not needed any tweaking or support. We set up a server in each location and built a DFS (Distributed File System) with a hub-and-spoke topology and remote links in each location, which made a replica set of all the data in a share at each location. As soon as they hit Save on a file, NTFRS (NT File Replication Service) starts streaming the changed files to the remote locations. What is even more interesting is that if one of the DFS sites is down, Windows will transparently fall back to contacting one of the other replica sets without user interaction (all configurable, of course). One plus on top of this is Previous Versions, a new 2003 feature that makes an hourly (configurable) backup of all changed files on a shared volume. So, if a user deleted a file they shouldn't have, they can right-click the folder, open it up as of "yesterday" and retrieve it.

Well, welcome to the 21st century. I work with a guy that used to do similar stuff. He worked for IBM and the government and used mainframes to manage the payroll. If one site went down (as in a nuclear strike), he had to have it set up so another site could continue operating with no loss of data and a maximum downtime of 10 minutes.

Oh yeah, this was in the early 1970s. (Pretty old guy - he designed the database system we use himself.)

This isn't anything that any other modern OS can't do. Linux has only recently gotten really good free-software distributed filesystems, but commercial ones have been available for some time now. (And there is always rsync - probably easier to set up than what you're describing, and probably more secure, too.)
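
(The rsync route is about this involved - a rough sketch you could drop into cron, with the host and paths obviously made up:)

import subprocess

# one-way push of the shared tree to a branch-office replica over ssh
subprocess.check_call([
    "rsync", "-az", "--delete",   # archive mode, compress, mirror deletions
    "-e", "ssh",
    "/srv/projects/",             # trailing slash: sync the contents, not the dir itself
    "backup@branch2:/srv/projects/",
])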

You will get hit by some worm, yes, if you're unpatched/unfirewalled; no disagreement there. Spyware, most people get from some freaking "Want smilies in your email? Click yes!" crap on a website. However, there are flaws in ANY OS; people get rooted too, you know. Windows is quite simply at a disadvantage because it is the biggest target.

My point is that it isn't easy. You need to know what you're doing regardless of the OS in order to be successful; there is nothing magical that MS does that makes its OS so much more wonderful and user-friendly than anybody else's. It seems easy for you because you're so used to how everything is set up.

You blame the user for being stupid, but that doesn't have anything to do with it. Most people just don't know. For example, how long have you known what the C: drive is? Pretty simple concept for you, right? Not everybody is going to have a clue what a "drive" is, much less what "C:" is.

Also, what you fail to realise is that by the time you start to see all these really public exploits and stuff, you're only seeing the bottom rung of hackers attacking - script kiddies and stuff like that. Those are the guys that find out last. Most real professionals can find their own exploits. Many times exploits are being used months in advance of being discovered by "white hats", and then they still tell MS about it and delay reporting it for weeks. Meanwhile these are being used to crack servers, and it's going to be a while before the first viruses or first worms even start showing up. If you're an administrator and you're having problems containing worms, then a good cracker will be able to walk all over you without you even realising it.

Those studies on time to fix are useless anyway, and enumerating the number of exploits is BS. That's time from announcement to time of fix. Many of the exploits being found in IE have been there since the beginning, when they released Explorer 5.0. Some exploits in WinXP have been around since the NT 4.0 days.

In those studies, an advisory put out by Debian warning of a possible buffer overflow when Mah-jong game scores are saved is going to get the same weight as an IIS exploit found in the wild for a previously unknown vulnerability.

It's also customary to let a vendor know about an exploit before announcing it, many times weeks in advance. I know people bitch and moan about security guys releasing exploit code, but you have to realise that most of the time MS has known about it for weeks and has patches in the works.

For example, how long have people been using VPNs based on PPTP? That was known to be a seriously flawed protocol from almost the get-go. Even with the second revision, lots of the problems were never really addressed by MS. It's only recently that MS has moved on to something bigger and better (you can thank Cisco for that).

I can pretty much guarantee you that several networks running VPNs based on PPTP have been compromised because of these well-known flaws.

Keeping people from viewing porn sites is one thing, but can you tell me how you would be able to find a kernel-mode rootkit in W2K that intercepts system calls to the kernel to prevent its detection by scanners, administrators, and other means? How would you know you got hacked unless the hacker's installed kernel driver accidentally caused a BSOD?

Some parts of Windows, like the UI, are mature. They are easy for most users to deal with, but much of the OS is not very mature and not that robust compared to other OSes, especially for server use.
 

n0cmonkey

Elite Member
Jun 10, 2001
42,936
1
0
Originally posted by: RyanLM
n0cmonkey

I see what you're saying, and it is a logical argument. I would say my response is somewhere in the middle. The study counted not only "Windows" but also MS-made servers/apps for Windows (as far as security holes go), and it took things like RH or Debian and looked at everything that makes up Linux. You are right that a lot of those things are redundant and most people wouldn't use half of them. However, the same argument (not on that scale) could be made if "I don't use SQL Server".

That's why studies like that are just total crap. They're going to be biased, unless you dig down deep.

Another way that might be a bit better is to study it based on server usage. Compile a list of common applications for each use (webserver, database server, email server, etc.), and count up vulnerabilities there. Still not great, but a bit closer to how it should be, IMO.

Common daemons for a *nix webserver:
Apache
*ssh

Other software you might take into account:
zlib
OpenSSL
libc/glibc
php
etc, etc, etc

It should be similar for a mail server, a database, a DNS server, etc. Take a set of features (the example above is HTTP service and remote administration), and base the study on that.

Like I said, not perfect, but a little better, IMO.
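
(Roughly what I mean, as a sketch - the advisory counts below are made-up placeholders, not real numbers, just to show the bookkeeping:)

# tally advisories against the packages each server role actually runs
advisories = {  # package -> number of advisories (placeholder data)
    "apache": 4, "openssh": 1, "openssl": 2, "zlib": 1,
    "php": 6, "glibc": 2, "bind": 3,
}

roles = {
    "webserver": ["apache", "openssh", "openssl", "zlib", "php", "glibc"],
    "dns":       ["bind", "openssh", "glibc"],
}

for role, packages in roles.items():
    total = sum(advisories.get(pkg, 0) for pkg in packages)
    print("%-10s %d advisories" % (role, total))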

Another way would be to take a full operating system and compare it to Microsoft, without including third-party software on either end. OpenBSD includes just about everything you need for a basic server: Apache, bind, sendmail, bgpd, etc. are all in base. They're all maintained to varying degrees by the OpenBSD developers (apache and bgpd are fully maintained; sendmail and bind are more closely based on the official trees).

The second part of the report was the lag time between the bug and the fix, which found that on average MS was far faster at releasing a fix. What is your thought on this finding? Do you feel that it is inaccurate?

I didn't get that far. Those studies enrage me almost as much as the local newspapers do (my county spent ~$10,000 on a friggin sign!). But generally I'd say it isn't true. The unpatched IE list was some proof that MS has been known to ignore security problems.

Any group will have issues with releasing patches as quickly as everyone would like. There were complaints about how long a patch for the big hole in OpenSSH took to come out a couple of years back, but there was a workaround in place.

For vulnerabilities that are handled properly, there shouldn't be any excuse for not releasing a patch in a timely manner. A couple of weeks isn't a big deal if the vulnerability hasn't been announced. But for something that has been released into the wild, an immediate patch is almost necessary. And recalling one of those necessary patches just ain't right.

Anyhow, I think both sides of the coin can improve what they are doing, but the track records aren't as bad as some people think.

However, people tend to act as if MS software, and only MS software, has flaws, and as if said flaws go unfixed for months - which is simply not the case. I would say that the second finding, on lag time, is more damaging than simply the amount of flaws.

Every piece of software has holes. MS has been reluctant to fix some of theirs in the past, but that's typical of a large corporation.

Guess I should go and try to read the other half of that article.

EDIT: I wonder how they tracked all of those days. Did the person that found the bug CC them on the emails to the developers? Debian is known to drag things out a while for testing. The others have no excuse. Typically when I hear about a fix from a developer (say sendmail), I see a patch for it on the OpenBSD site within 48 hours, usually closer to 24.
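
(Tracking the days wouldn't be hard if you had honest dates to start from - something like this, with made-up dates standing in for real report/patch dates:)

from datetime import date

# (vulnerability, date reported to the vendor, date a patch shipped) - placeholder data
cases = [
    ("example-hole-1", date(2004, 5, 3), date(2004, 5, 20)),
    ("example-hole-2", date(2004, 6, 1), date(2004, 7, 15)),
]

lags = [(fixed - reported).days for _, reported, fixed in cases]
for (name, reported, fixed), lag in zip(cases, lags):
    print(name, lag, "days to fix")
print("average:", sum(lags) / float(len(lags)), "days")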
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
That is totally possible, but to me a hack just to get a few specific features done is equally possible.

Maybe in the short term, but with Novell/Ximian pushing this too, I'd bet it will be feature-complete if for no other reason than to be able to say so in their advertisements.

Are you implying that Linux as of today is "perfect"?

Of course not, but I am saying that, for me, it's magnitudes better than Windows probably ever will be.

It was in a PDF I linked. It wasn't well liked, but it shows neither side is perfect. It also showed that MS is quicker at fixing flaws (on average) by a good margin.

Oh, you mean the one n0c said was ass? Well, IME all I can say is that any time I run into a bug in Debian it usually takes about a day for the fix to make it into the mirrors, and probably 90% of the time I can fix it myself from the details in the bug report, or just roll back to the previous version of the package for a day or so until the maintainer fixes the problem.
 

RyanLM

Member
May 15, 2003
43
0
0
What? And an admin will automatically look to an obscure group policy to find out immediately that you have used it to block a porn site? Your requirement was to get it done in 2 minutes, and it worked.

Yes, they would. That is where you look for everything - in the policy. And yes, your solution did work; it also had the unrequested side effect of blocking everyone (not just the people asked about). Either way, my question was more about uniformity.

It makes more sense to me to enforce network policies by using the network rather than an LDAP server like AD.

Group Policies are generally all-inclusive. You can think of them as a user's bill of rights at their end.

Now if your requirement were to limit access to certain applications, NOW that would be a decent example. But not intranet or porn sites. To filter out bad content it's much better to use a proxy server than a bunch of group policies any day. And for the intranet stuff it would be better to simply use passwords to protect it.

They are both good examples. The only reason I would ever use a proxy is if the need for caching came up; in general they hurt more than they help. You don't "add a bunch of policies" - you edit one. It makes perfect sense to block sites through GP. It allows you fine control over just what you want sites to be able to do. ActiveX? Java? JavaScript? You can fine-tune levels and put sites into one of those levels for access, or just block them completely.
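
(For example, dropping one specific site into the Restricted zone boils down to a single registry value that the GP writes out on every machine. A rough Python sketch - the domain is made up and the zone numbers are from memory:)

import winreg

# ZoneMap: zone 4 = Restricted sites (as I remember the numbering)
site = winreg.CreateKeyEx(
    winreg.HKEY_CURRENT_USER,
    r"Software\Microsoft\Windows\CurrentVersion\Internet Settings"
    r"\ZoneMap\Domains\example-blocked-site.com",
    0, winreg.KEY_SET_VALUE)
winreg.SetValueEx(site, "*", 0, winreg.REG_DWORD, 4)  # all protocols -> Restricted zone
winreg.CloseKey(site)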

If that is unacceptable, then f-it. Linux and OS X have LDAP servers, too.

Yes, they do; however, they don't compare to AD.

Actually, what I like is the X terminal and X server setup. It's much better than depending on network-based authentication, and much easier to administer and lock down. You have all your business files in one place, and instead of buying 300 computers to upgrade, you just upgrade 10-20. There are many really positive aspects to it over something like AD. I could go on and on about it.

Terminal-based solutions are very good; I run several myself, either Windows Terminal Services alone or with Citrix. However, AD and GP are still used on a daily basis, because when you have 1,000 people connecting to a few boxes, they all need access and rights to do different things. Because of this control, there is a reason Win Terminal/Citrix is used in most of the top companies.

You can say "well, Linux is very non-standard, while Windows has standardized interfaces worldwide". Well, in my eyes this is the equivalent of trying to get "one size fits all", which 95% of the time really means "one size fits none well".

Again, have you seen AD and GP? You do know you can extend them at will? 99% of all Windows administration tools plug into the MMC (Microsoft Management Console), and every tool you plug in has a familiar interface and a consistent feel. If you want to edit IIS, GPs, new users, or DHCP - it is pluggable. I don't see a problem with uniformity to prevent confusion.

Well, welcome to the 21st century. I work with a guy that used to do similar stuff. He worked for IBM and the government and used mainframes to manage the payroll. If one site went down (as in a nuclear strike), he had to have it set up so another site could continue operating with no loss of data and a maximum downtime of 10 minutes.

The argument was about the robustness of the product, not that all of this was a "new thing". I would also bet your IBM solution cost far more, took far more time, and didn't use freaking DSL. Sure, I could also use EMC's Geo network to do the same thing. Granted, I would need some beefy lines to make that all work. Either way it's not going to be set up in an afternoon.

This isn't anything that any other modern OS can't do. Linux has only recently gotten really good free-software distributed filesystems, but commercial ones have been available for some time now. (And there is always rsync - probably easier to set up than what you're describing, and probably more secure, too.)

Oh? Show me something that handles all three things - a distributed file system that automatically selects the quickest path to the data, instant replication that can use a hub-spoke or mesh topology with line weighting, and anything similar to Shadow Copies (Previous Versions) as icing on the cake. It would be nice for it all to be in one single product with no 3rd-party apps, but I won't limit you there. On the Windows side this would cost 3 licenses of Windows and about an hour to get everything ready.

Please, rsync doesn't even come close; it is comparable to RoboCopy in Windows. Just looking at their examples, "easier" doesn't even enter the equation.

My point is that it isn't easy. You need to know what you're doing regardless of the OS in order to be successful; there is nothing magical that MS does that makes its OS so much more wonderful and user-friendly than anybody else's. It seems easy for you because you're so used to how everything is set up.

You honestly believe Windows isn't any easier for an end user than Linux?

You blame the user for being stupid, but that doesn't have anything to do with it. Most people just don't know. For example, how long have you known what the C: drive is? Pretty simple concept for you, right? Not everybody is going to have a clue what a "drive" is, much less what "C:" is.

No OS is protected from someone who doesn't have a clue; that is my point. Windows has the most of those users.

Also, what you fail to realise is that by the time you start to see all these really public exploits and stuff, you're only seeing the bottom rung of hackers attacking - script kiddies and stuff like that. Those are the guys that find out last. Most real professionals can find their own exploits. Many times exploits are being used months in advance of being discovered by "white hats", and then they still tell MS about it and delay reporting it for weeks. Meanwhile these are being used to crack servers, and it's going to be a while before the first viruses or first worms even start showing up. If you're an administrator and you're having problems containing worms, then a good cracker will be able to walk all over you without you even realising it.

I don't disagree with that; however, this isn't an MS-only issue. It would affect both sides of the fence in the same way. However, MS seems to be faster at fixing its bugs on average.

Those studies on time to fix are useless anyway, and enumerating the number of exploits is BS. That's time from announcement to time of fix. Many of the exploits being found in IE have been there since the beginning, when they released Explorer 5.0. Some exploits in WinXP have been around since the NT 4.0 days.

Those studies look at the last year - what came out and what was done about it. I am not going to argue that Windows is perfect; however, the remaining bugs that are still in there certainly aren't causing much of an issue. Perhaps they were fixed in some other fashion.

In those studies, an advisory put out by Debian warning of a possible buffer overflow when Mah-jong game scores are saved is going to get the same weight as an IIS exploit found in the wild for a previously unknown vulnerability.

I don't think they weighted them; I think the severities were predefined (as most are - MS will say this is "critical", etc.). I could be wrong; the actual report costs a grand, and I am not going to pay that for a forum debate.

But, granted, IIS is a server app and that is a game. You'd just better not play that game.

For example, how long have people been using VPNs based on PPTP? That was known to be a seriously flawed protocol from almost the get-go. Even with the second revision, lots of the problems were never really addressed by MS. It's only recently that MS has moved on to something bigger and better (you can thank Cisco for that).

I wouldn't deny MS likes to do its own thing for a while, and yes, they can be resistant to change. I remember reading the articles about it from MS, and I remember the comment that if best practices were followed it shouldn't be an issue. But that is only half an answer. Are you contending that Linux has never had a bug go unfixed for some time?

Keeping people from viewing porn sites is one thing, but can you tell me how you would be able to find a kernel-mode rootkit in W2K that intercepts system calls to the kernel to prevent its detection by scanners, administrators, and other means? How would you know you got hacked unless the hacker's installed kernel driver accidentally caused a BSOD?

They are not easy to find. There are a few tools; however, in a case like this prevention is key.

Some parts of Windows, like the UI, are mature. They are easy for most users to deal with, but much of the OS is not very mature and not that robust compared to other OSes, especially for server use.

I would say that the vast majority of it is mature and more feature-filled. But there is no point arguing this with you - we won't be able to convince each other.
 