How To Make All Front Ends Work On HTTPS?

As Google now lets stores on full https rank higher than stores on http, it's important to have all storefronts on https.

But how do you do this for a multisite setup where different front-end domains need different SSL certificates?

You get to buy additional certificates from the other big players like Symantec, etc. It's all a racket. The day is not far off when these guys get hit with a huge anti-trust suit for collusion in the market.



Alternatively, if they are sub-domains, you can buy a wildcard certificate (several times the cost of a single-site certificate).
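
If they are separate domains rather than sub-domains, another route (assuming the server runs Apache with SNI support, i.e. Apache 2.2.12+ built against a recent OpenSSL) is one certificate per name-based SSL virtual host. A rough sketch only, with the domain names and file paths purely placeholders:

# SNI lets Apache present a different certificate per name-based
# virtual host on the same IP address and port.
NameVirtualHost *:443          # Apache 2.2 style; not needed on 2.4

<VirtualHost *:443>
    ServerName store-one.example.com           # placeholder storefront domain
    DocumentRoot /var/www/cscart               # placeholder CS-Cart root
    SSLEngine on
    SSLCertificateFile      /etc/ssl/store-one.example.com.crt
    SSLCertificateKeyFile   /etc/ssl/store-one.example.com.key
    SSLCertificateChainFile /etc/ssl/ca-bundle-one.crt
</VirtualHost>

<VirtualHost *:443>
    ServerName store-two.example.net           # placeholder second storefront
    DocumentRoot /var/www/cscart               # same CS-Cart root, second storefront
    SSLEngine on
    SSLCertificateFile      /etc/ssl/store-two.example.net.crt
    SSLCertificateKeyFile   /etc/ssl/store-two.example.net.key
    SSLCertificateChainFile /etc/ssl/ca-bundle-two.crt
</VirtualHost>

The usual caveat: very old clients (e.g. IE on Windows XP) don't send SNI and will only ever see the default certificate.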



I'm not sure I'd overreact to this. Google has been known to change its mind on half-baked decisions like this one once it realizes the ramifications. What happens once you double the bandwidth requirements? Think about the impact on things like video, which barely manages with http and would die under https.



And the impact on already taxed mobile devices…

My customers also prefer a website that is fully under SSL. It's worth the effort, if only for the sense of trust it gives the store.

If I add an SSL certificate per storefront, will this work with images all being pulled from the main domain?

Google isn't saying every page needs to be served over https (surely???), just that the site needs to be "https-able", since that proves accreditation as a secure, traceable body. So a full green-bar EV certificate will probably outweigh a simple self-signed cert… or have I misunderstood?

All will become clearer, I'm sure. Maybe when Google decides to sell SSL certs???



40 billion websites at 10 Google dollars each for SSL! But… if you buy via Google+ today we have a discount… ha ha, I can hear it now.



Seriously though…

There are billions of sites out there with great information that don't care a jot about the number of hits; they just have correct and concise info about their individual expertise to get them ranked no. 1, 2, 3, etc… So why would they want or need to be https if they don't sell a product or store consumer info?

Is Wikipedia gonna need to be https to be listed? Is Amazon gonna serve every page as https? eBay? About.com? Blah blah…



Not gonna happen, and when it does… Bing/Yahoo or, God forbid, someone like “Lycos” (remember Lycos… woof) ha ha comes along and gains ground?



STOP… news just in… Google has just bought Symantec, DigiCert, SSL Store, GeoTrust and ALL the SSL cert companies in the world!



https my arse.



IF YOU BUILD IT…they will come!!

John


Google is saying that they will rank https pages higher than http ones; in other words, all your pages need https if you want them to rank optimally.

This has been coming for a long time now. https is also a requirement of Google's SPDY protocol which relates to site speed, which is also a ranking signal.

Where does Google rank https vs. site speed? Serving all pages over https slows down the site (according to several posts here). So is a slower site with ALL pages on https better than a faster site without all pages on https?

I don't plan on having all my pages on https, just the ones that need to be. I think my customers would rather have a faster site than know that the site meets Google's requirements.

Would be interested in knowing.

Thanks,

Bo

Using the SPDY protocol is one way to speed up your site. SPDY requires https.



I run very large and active websites and have recently moved them to complete https without a significant performance hit. In fact they seem a tad faster now that all content is served over https instead of just some of it. Modern servers have functionality that caters to SSL.

I don't know if the CS-Cart software itself has performance issues with https. I have no experience with that.

A page served via https and compressed will be about double the size of a non-https page (sometimes more, sometimes less). Hence you transmit more data.



How are you running SPDY? It requires a SPDY-capable web server and a SPDY-enabled browser (it acts as a replacement for http and https).



When you encrypt a page, you run a mathematical calculation on the data that generates a binary result. That binary result has a much larger character set than the original and contains far fewer repeating characters. It's this second part that makes it more expensive to compress, and compression ratios end up very low.



It is valuable if you are sending sensitive information, but does nothing but add complexity and resource cost if you use it generally.



Note too that https data (images and content) cannot (by definition) be cached in the browser. Hence each page will serve all of its resources.



There is no way it can be faster.



These discussions are fun…

Sorry, I think I skipped a couple of responses above so will try to respond here.



A client (browser) has no way of knowing whether a site is https capable or not without initiating an https request. There are basically two levels of certificate returned: 1) standard https, where the Organizational Unit has been verified by getting authorization from the registrant before the certificate is issued; 2) a fully qualified SSL certificate (one that generates a full green bar), which performs additional verification of the business before the cert is issued. It usually takes several days to a couple of weeks to get the #2 certification, and it is rather expensive. The actual encryption of the data is the same either way; it is just the validation of the "name" of the business that the additional certification provides.

You are right that encrypted files are more than twice the size of unencrypted files. This can pose an issue. Fortunately there are several ways to mitigate it, improve speed, and still serve a fully encrypted website. Good configuration and a reduction in handshakes are important.
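
A rough sketch of what I mean on the Apache/mod_ssl side (the values are only illustrative and the cache path is a placeholder):

# Reuse TLS sessions so returning visitors skip the full handshake
SSLSessionCache        shmcb:/var/run/apache2/ssl_scache(512000)
SSLSessionCacheTimeout 300

# Keep connections open so many requests share one handshake
KeepAlive            On
MaxKeepAliveRequests 100
KeepAliveTimeout     5

With session reuse and keep-alive in place, most requests don't pay for a fresh handshake at all.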

Using both https and http on the same domain can cause issues. For example, if the browser first looks up the https version of a file and then gets redirected to the http version, it can take more time because of the redirect.



Another issue is that, after the efforts of the EFF and Snowden, many people run 'HTTPS Everywhere' add-ons that look up and redirect to the https version of a page. If the website is set to redirect back to http, this creates an infinite loop, which is not very different from a DDoS attack.



I am not running SPDY yet, but will adopt it soon, when LiteSpeed Web Server 5.0 goes gold. LSWS 5 is currently in RC1. LSWS does wonders for caching and speeding up database-driven PHP websites; it's a difference of night and day. The LiteSpeed devs are also working on implementing support for Facebook's HHVM, which will make PHP up to 9 times faster.

SPDY is supported by a wide spectrum of server software, and most browsers implemented SPDY support some years ago.
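
On Apache, the usual way to try it at the moment is Google's mod_spdy module. A rough sketch only, assuming the module is installed and SSL is already configured (the module path depends on how it was packaged):

# mod_spdy (Google's Apache module) negotiates SPDY over the existing SSL vhosts
LoadModule spdy_module /usr/lib/apache2/modules/mod_spdy.so
SpdyEnabled on

Browsers only negotiate SPDY over TLS, so the https setup has to be in place first.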



I highly recommend this article: https://thethemefoun…a-cdn-spdy-ssl/

[quote]These results blew my mind. We are seeing the great SSL negotiation times. Then, for items 2-15 (our assets), there is no additional SSL negotiation or connection times. SPDY’s HTTP multiplexing is allowing all of these request to occur over the same TCP connection. We do not need to make additional connections or SSL negotiations. All of the items on the page are loaded in just over 900ms, which is outstanding (yes, that’s faster than the old site, but the server is also closer).[/quote]



I agree that the whole concept of using SSL for images etc. can be pretty useless. Why encrypt a button, for example? But this is the way the web is going; there's no way around it. When you are running a store, trust is crucial. So if a full implementation of SSL gives the customer a sense of trust, while not implementing it causes trust and reputation issues, then it becomes a no-brainer to me.

It's no different from implementing a trusted stores or hacker safe script/logo.



I agree that these discussions are fun, as it's good to consider all aspects of this, and you bring good arguments to the table.

Well, the most effective solution would be to replace TCP/IP. When I worked at Tektronix in the '80s and we were building a product to compete with Sun, we got 20X the network performance by moving away from TCP/IP. TCP was built for unreliable networks. There is so much overhead in ensuring that the data is received in the right order and has not been corrupted along the way that the overhead is as costly as the payload.



What Tektronix learned was that even when you have a technically superior product, trying to change the world is nearly impossible. How long has IPv6 been a standard? How many sites use it as their primary addressing method? And 32-bit IP addresses are definitely in short supply.



I agree on the confidence side. A customer doesn't care what it costs you to deliver a purchase to them, but they expect prices to stay the same or decline; they are generally unrealistic. My main point, though, is that you can use secure connections for sensitive information while maintaining speed and performance for the 99% of content that is not sensitive. In the e-commerce world, we are talking about product data. We are not dealing with discussions that evolve into socio-political sensitivities that might make various intelligence agencies interested in our conversation. If we were having those discussions, then yes, I would want mine secured, because a little bit of bad information can go a long way toward a wrong assumption. But do I need the description of my frying pan to be encrypted? :-)

Well, if you order some backpacks along with those pressure cookers, it can lead to a visit from the anti-terrorism task force.

http://www.thewire.com/national/2013/08/government-knocking-doors-because-google-searches/67864/

After all the discussion, the question was still not answered… Can we do this, and how, on CS-Cart 4.2.1?

Sorry, mixing up threads… Deleted original post.

Assume you will need to use a .htaccess rewrite rule to make any http access redirect to https.

Normally I would just use this:


RewriteEngine on
RewriteCond %{HTTPS} !=on
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]


But what I wonder about is the way CS-Cart redirects from the storefront to the backend. I guess I will just test it out as soon as my new certificate is ready.
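
One hedged addition on top of that rewrite, assuming mod_headers is available: a Strict-Transport-Security header makes returning browsers go straight to https, so they stop hitting the http-to-https redirect on every visit:

# Returning browsers will use https directly for the duration of max-age
# (one year here). Browsers only honor HSTS received over https, so the
# header is harmless on the http responses that get 301-redirected anyway.
<IfModule mod_headers.c>
    Header always set Strict-Transport-Security "max-age=31536000"
</IfModule>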

Has anyone figured out how to make this all work in V4.2.x? We still cannot. Sure, we can get the pages to load over https, yet the SEO stuff doesn't work… I submitted a request for them to update the knowledge base, but who knows when that will happen…



http://kb.cs-cart.co…tps-whole-store

As you can see, there are SEO changes that need attention in V3… I assume there is something in V4.2.x that needs to change as well.



Come on CS-Cart… we need faster responses. Those of us who are trying to stay ahead of the game need the support we expect.

I hope Google abandons this idea, but I'm afraid they are not going to. Right now they are only giving a slight ranking edge to https pages, but the day may come when the edge is not slight any more. There may come a day when sites not running on https are simply dropped from the search results. What concerns me most is that if they really implement this, lots of pages are going to lose their ranking if we don't 301-redirect them. All incoming links will have to either go through 301s or be changed… unless Google creates a way for an https page to keep the ranking of the http page of the same name… Being a small online merchant just keeps getting harder and harder, and at times it makes me wonder if it's worth the effort.

@kingsley - you are right. We've given Google the power to define the net for us. They are the wolf in sheep's clothing. They started by fighting Microsoft over many dictatorial policies and practices; now that they've gotten traction, they're doing the same. Just like in politics, once someone has the power, they forget what got them there in the first place.

Question… Once we change over to a complete https:// site and we are redirecting in such a way…

Will Google see these as entirely new links, so that we lose all our old ranking status for each URL…???